Implementing workplace-based assessment across medical specialties in the United Kingdom

Royal College of Physicians, London, UK.
Medical Education 05/2008; 42(4):364-73. DOI: 10.1111/j.1365-2923.2008.03010.x


Objectives: To evaluate the reliability and feasibility of assessing the performance of medical specialist registrars (SpRs) using three methods, the mini-clinical evaluation exercise (mini-CEX), directly observed procedural skills (DOPS) and multi-source feedback (MSF), in order to help inform annual decisions about the outcome of SpR training.
Methods: We conducted a feasibility study and generalisability analysis based on the application of these assessment methods and the resulting data. A total of 230 SpRs from 17 specialties in 58 UK hospitals took part from 2003 to 2004. Main outcome measures included the time taken for each assessment, variance component analysis of mean scores, and derivation of 95% confidence intervals for individual doctors' scores based on the standard error of measurement. Responses to direct questions on the questionnaires were analysed, as were the themes emerging from open-comment responses.
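As a rough illustration of this kind of analysis (a minimal sketch, not the study's own model or data), the following assumes a simple one-facet generalisability design in which doctors are crossed with assessments; the variance components, example score and sample sizes are hypothetical. It shows how a generalisability coefficient and an SEM-based 95% confidence interval for a doctor's mean score could be computed.

```python
import math

# Minimal sketch of the reliability arithmetic described above, assuming a
# one-facet generalisability design (doctors crossed with assessments).
# Both variance components are hypothetical placeholders, not study figures.
VAR_DOCTOR = 0.20   # true-score variance between doctors (assumed)
VAR_ERROR = 0.80    # assessment-to-assessment error variance (assumed)

def reliability(n_assessments: int) -> float:
    """Generalisability coefficient for the mean of n assessments."""
    return VAR_DOCTOR / (VAR_DOCTOR + VAR_ERROR / n_assessments)

def ci_95(mean_score: float, n_assessments: int) -> tuple:
    """95% CI for a doctor's mean score, based on the standard error of
    measurement for a mean of n assessments."""
    sem = math.sqrt(VAR_ERROR / n_assessments)
    return (mean_score - 1.96 * sem, mean_score + 1.96 * sem)

for n in (4, 8, 12):
    low, high = ci_95(4.5, n)
    print(f"n={n}: reliability={reliability(n):.2f}, 95% CI=({low:.2f}, {high:.2f})")
```

Increasing the number of assessments shrinks the error term (VAR_ERROR / n), which is why the confidence interval narrows and the generalisability coefficient rises with additional sampling.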
Results: The methods can provide reliable scores with appropriate sampling. In our sample, all trainees who completed the number of assessments recommended by the Royal Colleges of Physicians had scores that were 95% certain to be better than unsatisfactory. The mean time taken to complete the mini-CEX, including feedback, was 25 minutes. The DOPS required the duration of the procedure being assessed, plus an additional third of this time for feedback. The mean time required for each rater to complete his or her MSF form was 6 minutes.
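To make the "95% certain to be better than unsatisfactory" criterion concrete, here is a hypothetical sketch along the same lines as above; the per-assessment error variance, the cut score and the example mean scores are assumptions for illustration, not values from the study. It finds the smallest number of assessments for which the lower bound of a trainee's 95% confidence interval clears the unsatisfactory boundary.

```python
import math

# Hypothetical illustration of the "95% certain to be better than
# unsatisfactory" criterion; all numbers below are assumptions.
VAR_ERROR = 0.80   # per-assessment error variance (assumed)
CUT_SCORE = 3.0    # unsatisfactory/satisfactory boundary on the rating scale (assumed)

def assessments_needed(mean_score: float, max_n: int = 50):
    """Smallest n whose 95% CI lower bound exceeds the cut score, else None."""
    for n in range(1, max_n + 1):
        sem = math.sqrt(VAR_ERROR / n)
        if mean_score - 1.96 * sem > CUT_SCORE:
            return n
    return None

print(assessments_needed(3.8))   # comfortably satisfactory: only a handful of assessments needed
print(assessments_needed(3.1))   # borderline: not resolvable within 50 assessments (returns None)
```

The further a trainee's mean score sits above the cut score, the fewer observations are needed before the interval excludes an unsatisfactory outcome, which is the sense in which appropriate sampling underpins the reliability of the judgement.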
Conclusions: This is the first attempt to evaluate the use of comprehensive workplace-based assessment across the medical specialties in the UK. The methods are feasible to conduct and can make reliable distinctions between doctors' performances. With adaptation, they may be appropriate for assessing the workplace performance of other grades and specialties of doctor, and may be helpful in informing foundation assessment.

Cited by:
    • "The DOPS focuses on evaluating the procedural skills of postgraduate trainees by observing them in the workplace setting.18 The DOPS has been found to be reliable and is generally acceptable by medical trainees.23 In other specialties, however, such as anaesthetics, it has been considered to be a tick-box exercise that does not necessarily reflect trainee competence.24 "
    ABSTRACT: This paper presents a systematic evaluation of the Core Medical Training Curriculum in the UK. The authors critically review the curriculum from a medical education perspective, based mainly on the medical education literature as well as their personal experience of the curriculum. They conclude with practical recommendations and suggestions which, if adopted, could improve the design and implementation of this postgraduate curriculum. The systematic evaluation approach described in the paper is transferable to the evaluation of other undergraduate or postgraduate curricula, and could be a helpful guide for medical teachers involved in the delivery and evaluation of any medical curriculum.
    01/2014; 5(1):2042533313514049. DOI:10.1177/2042533313514049
    • "Further development of structured clinical and workplace-based assessments, such as DOPS, address these limitations. DOPS has been evaluated to be a reliable and valid formative assessment tool [32]. "
    ABSTRACT: Background: As a non-invasive and readily available diagnostic tool, ultrasound is one of the most important imaging techniques in medicine. Ultrasound training usually takes place during residency, preferably according to German Society of Ultrasound in Medicine (DEGUM) standards. Our curriculum provides ultrasound training for medical students in their 4th year of undergraduate education. An explorative pilot study evaluated the acceptance of this teaching method and compared it to other practical activities in medical education at Muenster University. Methods: 240 medical students in their 4th year of undergraduate medical education participated in the training and completed a pre- and post-questionnaire for self-assessment of technical knowledge, self-assurance in the procedure, and motivation in performing ultrasound, using a Likert scale. Moreover, students were asked about their interest in pursuing a career in internal medicine. To compare this training to other educational activities, a standardized online evaluation tool was used. A direct observation of procedural skills (DOPS) assessment, applied to ultrasound for the first time, aimed to assess independently the success of our teaching method. Results: There was a significant increase (p < 0.001) in the students' self-assessed technical knowledge and self-assurance. The clinical relevance of the teaching and the students' motivation were evaluated positively. The students' DOPS results demonstrated proficiency in understanding the anatomic structures shown in ultrasonographic images, including terminology, machine settings and transducer frequencies. Conclusions: Training in ultrasound according to certified DEGUM standards was successful and should be offered in undergraduate medical education. The evaluation affirmed the necessity, quality and clinical relevance of the course, which achieved a top ranking among the hands-on training courses in the educational activities of the Medical Faculty of Muenster.
    BMC Medical Education 06/2013; 13(1):84. DOI:10.1186/1472-6920-13-84
    • "Though fairly new to medical training (Kogan et al. 2009), a growing body of evidence on the validity and the reliability of formative assessment instruments is emerging (Durning et al. 2002; Holmboe et al. 2003; Wilkinson et al. 2008; LeBlanc et al. 2009). However, whereas in summative assessment validity and reliability are seen as dominant determinants of utility, in formative assessment, utility, defined as learning that results from the assessment process, is much more dependent on how stakeholders (trainees and clinical supervisors) "
    ABSTRACT: Introduction: Recent changes in postgraduate medical training curricula usually encompass a shift towards more formative assessment, or assessment for learning. However, though theoretically well suited to postgraduate training, evidence is emerging that engaging in formative assessment in daily clinical practice is complex. Aim: We aimed to explore trainees' and supervisors' perceptions of what factors determine active engagement in formative assessment. Methods: Focus group study with postgraduate trainees and supervisors in obstetrics and gynaecology. Results: Three higher order themes emerged: individual perspectives on feedback, supportiveness of the learning environment and the credibility of feedback and/or feedback giver. Conclusion: Engaging in formative assessment with a genuine impact on learning is complex and quite a challenge to both trainees and supervisors. Individual perspectives on feedback, a supportive learning environment and credibility of feedback are all important in this process. Every one of these should be taken into account when the utility of formative assessment in postgraduate medical training is evaluated.
    Medical Teacher 04/2013; 35(8). DOI:10.3109/0142159X.2012.756576