Direct observed procedural skills assessment in the undergraduate setting

Clinical Skills Centre, University of Dundee, Ninewells Hospital, Dundee, UK.
The Clinical Teacher 08/2012; 9(4):228-32. DOI: 10.1111/j.1743-498X.2012.00582.x
Source: PubMed

ABSTRACT Medical students are required to undertake procedural clinical skills training before they qualify as doctors, and assessment of these skills is a critical element of their fitness to practise.
Challenges facing educators include the logistics of observation (who is best placed to assess students' competence?) and the time required for an effective assessment. Evidence on whether students in the workplace adhere to standards of practice appears inconclusive.
In this article we aim to discuss who is best placed to assess final-year medical students in the workplace. We explore the use of direct observed procedural skills (DOPS) assessment for students undertaking procedural skills in a simulated workplace setting, through tutor, peer and self-assessment. The DOPS tool has been used to assess foundation doctors, but can it be used effectively to assess undergraduate medical students?
The main purpose of formative assessment in the simulated setting is to support student learning through the provision of feedback and debriefing. Used in this way, the DOPS tool can provide an insightful perspective on a student's procedural clinical skills. Tutors can use the DOPS tool to guide their teaching practice by tailoring their lessons towards areas in which students require more guidance. The DOPS assessment tool presents an opportunity to provide immediate and relevant feedback.

Available from: Roddy A Mcleod, Feb 19, 2014
    ABSTRACT: There has been concern that trainees are seldom observed, assessed, and given feedback during their workplace-based education. This has led to increasing interest in a variety of formative assessment methods that require observation and offer the opportunity for feedback. The aims of this review were to examine the literature on the efficacy and prevalence of formative feedback, describe the common formative assessment methods, characterize the nature of feedback, examine the effect of faculty development on its quality, and summarize the challenges that remain. The research literature suggests that formative assessment and feedback are a powerful means of changing the behaviour of trainees. Several assessment methods have been developed, and there is preliminary evidence of their reliability and validity. A variety of factors enhance the efficacy of workplace-based assessment, including the provision of feedback that is consistent with the needs of the learner and focused on important aspects of the performance. Faculty play a critical role, and successful implementation requires that they receive training. There is a need for formative assessment that offers trainees the opportunity for feedback. Several good methods exist, and feedback has been shown to have a major influence on learning. The critical role of faculty is highlighted, as is the need for strategies to enhance their participation and training.
    Medical Teacher 12/2007; 29(9):855-71. DOI:10.1080/01421590701775453 · 2.05 Impact Factor
    ABSTRACT: Accreditation of residency programs and certification of physicians requires assessment of competence in communication and interpersonal skills. Residency and continuing medical education program directors seek ways to teach and evaluate these competencies. This report summarizes the methods and tools used by educators, evaluators, and researchers in the field of physician-patient communication as determined by the participants in the "Kalamazoo II" conference held in April 2002. Communication and interpersonal skills form an integrated competence with two distinct parts. Communication skills are the performance of specific tasks and behaviors such as obtaining a medical history, explaining a diagnosis and prognosis, giving therapeutic instructions, and counseling. Interpersonal skills are inherently relational and process oriented; they are the effect communication has on another person such as relieving anxiety or establishing a trusting relationship. This report reviews three methods for assessment of communication and interpersonal skills: (1) checklists of observed behaviors during interactions with real or simulated patients; (2) surveys of patients' experience in clinical interactions; and (3) examinations using oral, essay, or multiple-choice response questions. These methods are incorporated into educational programs to assess learning needs, create learning opportunities, or guide feedback for learning. The same assessment tools, when administered in a standardized way, rated by an evaluator other than the teacher, and using a predetermined passing score, become a summative evaluation. The report summarizes the experience of using these methods in a variety of educational and evaluation programs and presents an extensive bibliography of literature on the topic. Professional conversation between patients and doctors shapes diagnosis, initiates therapy, and establishes a caring relationship. 
    The degree to which these activities are successful depends, in large part, on the communication and interpersonal skills of the physician. This report focuses on how the physician's competence in professional conversation with patients might be measured. Valid, reliable, and practical measures can guide professional formation, determine readiness for independent practice, and deepen understanding of the communication itself.
    Academic Medicine 07/2004; 79(6):495-507. DOI:10.1097/00001888-200406000-00002 · 3.47 Impact Factor
    ABSTRACT: The assessment of clinical procedural skills has traditionally focused on technical elements alone. However, in real practice, clinicians are expected to integrate technical skills with communication and other professional skills. We describe an integrated procedural performance instrument (IPPI), in which clinicians are assessed on 12 clinical procedures in a simulated clinical setting that combines simulated patients (SPs) with inanimate models or items of medical equipment. Candidates are observed remotely by assessors, whose data are fed back to the clinician within 24 hours of the assessment. This paper describes the feasibility of IPPI. A full-scale IPPI and 2 pilot studies with trainee and qualified health care professionals have yielded an extensive data set, including 585 scenario evaluations from candidates, 60 from clinical assessors and 31 from simulated patients. Interview and questionnaire data showed that, for the majority of candidates, IPPI provided a powerful and valuable learning experience. Realism was rated highly. Remote and real-time assessment worked effectively, although for some procedures limited camera resolution affected observation of fine detail. IPPI offers an innovative approach to assessing clinical procedural skills. Although resource-intensive, it has the potential to provide insight into an individual's performance over a spectrum of clinical scenarios at no risk to patient safety. Additional benefits of IPPI include real-time assessment by experts (allowing remote rating by external examiners) as well as the provision of feedback from simulated patients.
    Medical Education 12/2006; 40(11):1105-14. DOI:10.1111/j.1365-2929.2006.02612.x · 3.62 Impact Factor