Article

Direct observed procedural skills assessment in the undergraduate setting

Clinical Skills Centre, University of Dundee, Ninewells Hospital, Dundee, UK.
The Clinical Teacher 08/2012; 9(4):228-32. DOI: 10.1111/j.1743-498X.2012.00582.x
Source: PubMed

ABSTRACT

Medical students are required to undertake procedural clinical skills training before they qualify as doctors, and an assessment of these skills is a critical element of their fitness to practise.
Challenges facing educators include the logistics of observation (i.e. who is best placed to assess students' competence?), the inconclusive evidence about whether students in the workplace adhere to standards of practice, and the time required for an effective assessment.
In this article the aim is to discuss who is best placed to assess final-year medical students in the workplace. We explore the use of direct observed procedural skills (DOPS) to assess students undertaking procedural skills in a simulated workplace setting by tutor-, peer- and self-assessment. The DOPS tool has been used to assess foundation doctors, but can it be used to effectively assess undergraduate medical students?
The main purpose of formative assessment in the simulated setting is to support student learning through the provision of feedback and debriefing. Used in this way, the DOPS tool can provide an insightful perspective on the assessment of a student's procedural clinical skills. Tutors can use the DOPS tool to guide their teaching practice by tailoring their lessons towards areas in which students require more guidance. The DOPS assessment tool presents an opportunity to provide immediate and relevant feedback.

  • Article · Jul 2013 · Canadian Journal of Emergency Medicine
    ABSTRACT: Objective: This study sought to establish the current state of procedural skills training in Canadian Royal College emergency medicine (EM) residencies. Methods: A national Web-based survey was administered to residents and program directors of all 13 Canadian-accredited Royal College EM residency programs. Program directors rated the importance and experience required for competence of 45 EM procedural skills. EM residents reported their experience and comfort in performing the same procedural skills. Results: Thirteen program directors and 86 residents responded to the survey (response rate of 100% and 37%, respectively). Thirty-two (70%) procedures were considered important by > 70% of program directors, including all resuscitation and lifesaving airway procedures. Four procedures deemed important by program directors, including cricothyroidotomy, pericardiocentesis, posterior nasal pack for epistaxis, and paraphimosis reduction, had never been performed by the majority of senior residents. Program director opinion was used to categorize each procedure based on performance frequency to achieve competence. Overall, procedural experience correlated positively with comfort levels as indicated by residents. Conclusions: We established an updated needs assessment of procedural skills training for Canadian Royal College EM residency programs. This included program director opinion of important procedures and the performance frequency needed to achieve competence. However, we identified several important procedures that were never performed by most senior residents despite program director opinion regarding the experience needed for competence. Further study is required to better define objective measures for resident competence in procedural skills.
  • Article · Mar 2015 · Advances in Health Sciences Education
    ABSTRACT: The authors report final-year ward simulation exercise (FYWSE) data from the University of Dundee Medical School. Faculty who designed this assessment intend for the final score to represent an individual senior medical student’s level of clinical performance. The results are included in each student’s portfolio as one source of evidence of the student’s capability as a practitioner, professional, and scholar. Our purpose in conducting this study was to illustrate how assessment designers who are creating assessments to evaluate clinical performance might develop propositions and then collect and examine various sources of evidence to construct and evaluate a validity argument. The data were from all 154 medical students who were in their final year of study at the University of Dundee Medical School in the 2010–2011 academic year. To the best of our knowledge, this is the first report on an analysis of senior medical students’ clinical performance while they were taking responsibility for the management of a simulated ward. Using multi-facet Rasch measurement and a generalizability theory approach, we examined various sources of validity evidence that the medical school faculty have gathered for a set of six propositions needed to support their use of scores as measures of students’ clinical ability. Based on our analysis of the evidence, we would conclude that, by and large, the propositions appear to be sound, and the evidence seems to support their proposed score interpretation. Given the body of evidence collected thus far, their intended interpretation seems defensible.