Implementing workplace-based assessment across the medical specialties in the United Kingdom

Royal College of Physicians, London, UK.
Medical Education (Impact Factor: 3.62). 05/2008; 42(4):364-73. DOI: 10.1111/j.1365-2923.2008.03010.x
Source: PubMed

ABSTRACT: To evaluate the reliability and feasibility of assessing the performance of medical specialist registrars (SpRs) using three methods: the mini-clinical evaluation exercise (mini-CEX), directly observed procedural skills (DOPS) and multi-source feedback (MSF), to help inform annual decisions about the outcome of SpR training.
We conducted a feasibility study and generalisability analysis based on the application of these assessment methods and the resulting data. A total of 230 SpRs (from 17 specialties) in 58 UK hospitals took part from 2003 to 2004. Main outcome measures included the time taken for each assessment, variance component analysis of mean scores, and derivation of 95% confidence intervals for individual doctors' scores based on the standard error of measurement. Responses to direct questions on the questionnaires were analysed, as were the themes emerging from open-comment responses.
The methods can provide reliable scores with appropriate sampling. In our sample, all trainees who completed the number of assessments recommended by the Royal Colleges of Physicians had scores that were 95% certain to be better than unsatisfactory. The mean time taken to complete the mini-CEX (including feedback) was 25 minutes. The DOPS required the duration of the procedure being assessed plus an additional third of this time for feedback. The mean time required for each rater to complete his or her MSF form was 6 minutes.
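The "95% certain" claim above rests on a confidence interval built from the standard error of measurement (SEM), which narrows as more encounters are sampled. A minimal sketch, assuming the classical test theory approximation SEM_n = SD/√n; the scale, SD and scores below are purely illustrative, not data from the study:

```python
import math

def sem_for_n(sd_single, n):
    """SEM of a mean over n sampled encounters, given the SD of a
    single encounter (classical test theory approximation)."""
    return sd_single / math.sqrt(n)

def score_ci(mean_score, sem, z=1.96):
    """95% confidence interval for a trainee's mean score."""
    return (mean_score - z * sem, mean_score + z * sem)

# Hypothetical illustration: single-encounter SD of 0.8 on a 1-9
# rating scale, six mini-CEX encounters, observed mean of 5.6.
sem = sem_for_n(0.8, 6)
lo, hi = score_ci(5.6, sem)
```

Under these assumed numbers, six encounters give an interval half-width of about 0.64 scale points, so a trainee whose lower bound clears the unsatisfactory threshold can be judged satisfactory with 95% confidence.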
This is the first attempt to evaluate the use of comprehensive workplace assessment across the medical specialties in the UK. The methods are feasible to conduct and can make reliable distinctions between doctors' performances. With adaptation, they may be appropriate for assessing the workplace performance of other grades and specialties of doctor. This may be helpful in informing foundation assessment.


Available from: Andrew Wragg, Mar 31, 2015
  •
    ABSTRACT: The aims of this study were to evaluate the correlation between the results of the objective structured clinical examination (OSCE) and clinical assessment and to test the reliability of OSCE test stations. All 4th year undergraduate dental students (n = 47, 100%) attended the OSCE in April 2010. The students were divided into two groups (morning group, group 1; afternoon group, group 2). Groups 1 and 2 were also divided into two subgroups that attended the stations in two concurrent sessions (A and B). The OSCE included 12 10-min test stations. Clinical assessment was based on long-term observation during the semesters. The disciplines assessed were cross-infection control, endodontics, paediatric dentistry, periodontology, prosthodontics and restorative dentistry. Statistical analysis using Cronbach's alpha indicated good reliability of the OSCE. The correlation between the results of the OSCE and clinical assessment in the 4th year was statistically significant in cross-infection control (ρ = 0.340, P = 0.022), endodontics (ρ = 0.298, P = 0.047), prosthodontics (ρ = 0.296, P = 0.048) and restorative dentistry (ρ = 0.376, P = 0.011). Clinical assessment in the 5th year correlated statistically significantly with the OSCE results in restorative dentistry (ρ = 0.522, P = 0.001). Both the OSCE and constant longitudinal assessment are needed in clinical assessment, as they both play an important role in the overall assessment.
    European Journal of Dental Education 12/2014; DOI:10.1111/eje.12126 · 1.45 Impact Factor
  •
    ABSTRACT: To explore Foundation trainees' and trainers' understandings and experiences of supervised learning events (SLEs), compared with workplace-based assessments (WPBAs), and their suggestions for developing SLEs.
    BMJ Open 10/2014; 4(10):e005980. DOI:10.1136/bmjopen-2014-005980 · 2.06 Impact Factor
  •
    ABSTRACT: Background: To achieve the desired performance of graduates, a number of traditional evaluation exercises have been used to assess their competence as medical students. Many of these assessments are conducted in a controlled environment and test competence rather than performance. The mini-CEX and directly observed procedural skills (DOPS) are performance-based assessments of clinical skills in the real workplace. Increased opportunity for observation, and just-in-time feedback from role-model seniors, produce a positive educational impact on students' learning and give trainees formative assessment with which to monitor their learning objectives. However, implementing assessment with the mini-CEX or DOPS requires a clear institutional policy for the different teaching and learning culture of workplace-based assessment, together with user-friendly rating forms, checklists, elaboration of clinical competence and its attributes, and procedural guidelines for practice. The precise role of these tools in the assessment of a postgraduate programme must be established before using them to evaluate and monitor trainees' progress. Objective: To determine the acceptability and feasibility of DOPS as a method of formative assessment of clinical skills in a postgraduate programme of Otolaryngology and Head-Neck Surgery.
    06/2011; 3(1). DOI:10.5959/eimj.3.1.2011.or2
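The OSCE study above reports reliability via Cronbach's alpha, which compares the summed per-station variances with the variance of examinees' total scores. A short NumPy sketch with hypothetical station scores (not the study's data):

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (examinees x stations) score matrix."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                       # number of stations
    item_vars = scores.var(axis=0, ddof=1)    # per-station variance
    total_var = scores.sum(axis=1).var(ddof=1)  # variance of total scores
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Hypothetical scores for 5 examinees across 4 stations (1-10 scale)
data = [[7, 8, 6, 7],
        [5, 6, 5, 6],
        [9, 9, 8, 9],
        [6, 7, 6, 6],
        [8, 8, 7, 8]]
alpha = cronbach_alpha(data)
```

Values of alpha above roughly 0.7 are conventionally read as acceptable internal consistency; the made-up matrix here is deliberately consistent across stations, so its alpha comes out high.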