How to develop a competency-based examination blueprint for longitudinal standardized patient clinical skills assessments
University of Washington School of Medicine, USA. Medical Teacher (Impact Factor: 1.68). 07/2013; 35(11). DOI: 10.3109/0142159X.2013.809408
Background: Objective Structured Clinical Examinations (OSCEs) with standardized patients (SPs) are commonly used in medical education to assess learners' clinical skills. However, such assessments are often discrete events rather than intentionally sequenced along a developmental trajectory. Aims: We developed an examination blueprint to optimize assessment and feedback to learners by purposefully sequencing a series of longitudinally integrated assessments based on performance milestones. Integrated and progressive clinical skills assessments offer several benefits: assessment of skill development over time, systematic identification of learning needs, data for individualized feedback and learning plans, and baseline reference points for reassessment. Methods: Using a competency-based medical education (CBME) framework, we translated pre-determined competency milestones for medical students' patient encounters into a four-year SP-based OSCE examination blueprint. Results: Initial evaluation of cases using the blueprint revealed opportunities to target less frequently assessed competencies and to align assessments with milestones for each year. Conclusions: The examination blueprint can guide ongoing SP-based OSCE case design. Future iterations of examination blueprints can incorporate lessons learnt from evaluation data and student feedback.
ABSTRACT: The Shoklo Malaria Research Unit has provided health care in remote clinics on the Thai-Myanmar border to refugee populations since 1986 and to migrant populations since 1995. Clinics are staffed by local health workers with a variety of training and experience. The need for a tool to improve the competence of local health workers in basic emergency assessment and management was recognised by medical faculty after observing the case mix seen at the clinic and reviewing the teaching programme delivered in the preceding period (Jan-13 to March-14).
To pilot the development and evaluation of a simple teaching tool to improve competence in the assessment and management of acutely unwell patients by local health workers that can be delivered onsite with minimal resources.
A structured approach to common emergencies presenting to rural clinics, utilizing equipment available in the clinics, was developed. A prospective repeated-measures objective structured clinical examination (OSCE) assessment design was used to score participants' competence in assessing and managing a scenario-based 'emergency patient' at baseline, immediately post-course, and 8 weeks after delivery of the teaching course. The assessment was conducted at 3 clinic sites, and staff participation was voluntary. Participants completed questionnaires on their confidence with different scenario-based emergency patients.
All staff who underwent the baseline assessment failed to carry out the essential steps in initial emergency assessment and management of an unconscious patient scenario. Following delivery of the teaching session, all groups showed improved competence in both objective assessment and subjective confidence levels.
Structured and practical teaching and learning with minimal theory in this resource-limited setting had a positive short-term effect on the competence of individual staff to carry out an initial assessment and manage an acutely unwell patient. Health-worker confidence likewise improved. Workplace assessments are needed to determine whether this type of skills training affects mortality or near-miss mortality at the clinic.
ABSTRACT: There is currently great interest in measuring trainee competency at all levels of medical education. In 2007, we implemented a system for assessing cardiology fellows' progress in attaining imaging skills. This paradigm could be adapted for use by other cardiology programs.
Evaluation consisted of a two-part exercise performed after years 1 and 2 of pediatric cardiology training. Part 1: a directly observed evaluation of technical skills as fellows imaged a normal subject (year 1) and a patient with complex heart disease (year 2). Part 2: fellows interpreted and wrote reports for two echocardiograms illustrating congenital heart disease. These were graded for accuracy and facility with communicating pertinent data. After 5 years of testing, fellows were surveyed about their experience. Over those 5 years, 40 fellows were tested at least once. Testing identified four fellows who underperformed on the technical portion and four on the interpretive portion. Surveys were completed by 33 fellows (83 %). Most (67 %) felt that intermittent observation by faculty was inadequate for assessing skills, and 58 % felt that procedural volume was a poor surrogate for competency. Posttest feedback was constructive and valuable for 90 %, and 70 % felt the process helped them set goals for skill improvement. Overall, fellows felt this testing was fair and should continue. Fellow performance and responses identified programmatic issues that were creating barriers to learning. We describe a practical test to assess competency for cardiology fellows learning echocardiography. This paradigm is feasible, has excellent acceptance among trainees, and identifies trainees who need support. The materials developed could be easily adapted to help track upcoming ACGME-mandated metrics.