A core competency-based objective structured clinical examination (OSCE) can predict future resident performance.

Department of Emergency Medicine, Emory University School of Medicine, Atlanta, GA, USA.
Academic Emergency Medicine (Impact Factor: 2.2). 10/2010; 17 Suppl 2:S67-71. DOI: 10.1111/j.1553-2712.2010.00894.x
Source: PubMed

ABSTRACT: This study evaluated the ability of an objective structured clinical examination (OSCE) administered in the first month of residency to predict future resident performance in the Accreditation Council for Graduate Medical Education (ACGME) core competencies.
Eighteen Postgraduate Year 1 (PGY-1) residents completed a five-station OSCE in the first month of postgraduate training. Performance was graded in each of the ACGME core competencies. At the end of 18 months of training, faculty evaluations of resident performance in the emergency department (ED) were used to calculate a cumulative clinical evaluation score for each core competency. The correlations between OSCE scores and clinical evaluation scores at 18 months were assessed on an overall level and in each core competency.
There was a statistically significant correlation between overall OSCE scores and overall clinical evaluation scores (R = 0.48, p < 0.05) and in the individual competencies of patient care (R = 0.49, p < 0.05), medical knowledge (R = 0.59, p < 0.05), and practice-based learning (R = 0.49, p < 0.05). No correlation was noted in the systems-based practice, interpersonal and communication skills, or professionalism competencies.
An early-residency OSCE can predict future postgraduate performance both globally and in specific core competencies. Used appropriately, such information can be a valuable tool for program directors in monitoring residents' progress and providing more tailored guidance.
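The analysis above rests on Pearson's product-moment correlation between early OSCE scores and 18-month clinical evaluation scores. A minimal sketch of that computation, using hypothetical score pairs (the study's raw data are not reproduced here):

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation between two equal-length score lists."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical OSCE scores (month 1) and faculty clinical evaluation scores (month 18)
osce = [62, 70, 75, 68, 80, 74, 66, 72]
evals = [3.1, 3.4, 3.8, 3.2, 4.0, 3.6, 3.3, 3.5]
r = pearson_r(osce, evals)
```

A significance test on r (the study's p < 0.05 thresholds) would additionally require a t-statistic or a library such as SciPy's `scipy.stats.pearsonr`; the plain coefficient is shown here to keep the sketch dependency-free.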

  • Source
    ABSTRACT: Evaluation of emergency medicine (EM) learners based on observed performance in the emergency department (ED) is limited by factors such as reproducibility and patient safety. EM educators depend on standardized and reproducible assessments such as the objective structured clinical examination (OSCE). The validity of the OSCE as an evaluation tool in EM education has not been previously studied. The objective was to assess the validity of a novel management-focused OSCE as an evaluation instrument in EM education through demonstration of performance correlation with established assessment methods and case item analysis. We conducted a prospective cohort study of fourth-year medical students enrolled in a required EM clerkship. Students enrolled in the clerkship completed a five-station EM OSCE. We used Pearson's coefficient to correlate OSCE performance with performance in the ED based on completed faculty evaluations. Indices of difficulty and discrimination were computed for each scoring item. We found a moderate, statistically significant correlation between OSCE score and ED performance score [r(239) = 0.40, p < 0.001]. Of the 34 OSCE testing items, the mean index of difficulty was 63.0 (SD = 23.0) and the mean index of discrimination was 0.52 (SD = 0.21). Student performance on the OSCE correlated with their observed performance in the ED, and indices of difficulty and discrimination demonstrated alignment with published best-practice testing standards. This evidence, along with other attributes of the OSCE, attests to its validity. Our OSCE can be further improved by modifying testing items that performed poorly and by examining and maximizing the inter-rater reliability of our evaluation instrument.
    The western journal of emergency medicine 01/2015; 16(1):121-6. DOI:10.5811/westjem.2014.11.22440
  • Source
    ABSTRACT: This study explored the healthcare student's experience of an OSCE (objective structured clinical examination). The OSCE is a form of assessment in which the student demonstrates clinical skills, and the knowledge underpinning them, usually in simulated conditions. It originated in medical education and is now being adopted by other disciplines of healthcare education. Because the OSCE is a new experience for most students, it is important, as educators, that we explore this assessment from the perspective of the student. A literature review revealed a paucity of research in this area. Hermeneutic phenomenology was used as this study's underpinning methodology. Data were collected through semi-structured interviews with students. Analysis revealed three main themes: (1) anxiety about the OSCE, (2) preparation was seen as a coping strategy, and (3) simulation was a further cause of anxiety. The recommendations for future practice are that students need to be supported appropriately, that preparation of students for an OSCE requires effective planning, and that simulation needs to be grounded in practice. This study concludes that students valued the OSCE as a worthwhile assessment. However, there are major concerns for students, which need careful consideration by academic faculty developing this type of assessment.
    01/2012; 1(1). DOI:10.7190/seej.v1i1.37
  • World Neurosurgery 09/2014; DOI:10.1016/j.wneu.2014.06.027 · 2.42 Impact Factor
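The item analysis reported in the first related study uses classical test-theory indices: difficulty (the percentage of examinees credited on an item) and discrimination (how well an item separates high-scoring from low-scoring examinees). A rough sketch under those standard definitions, with an upper/lower-group discrimination formula; the study's exact computation is not specified, so this is illustrative only:

```python
def index_of_difficulty(item_scores):
    """Difficulty as percent correct: 100 * proportion of examinees credited on the item."""
    return 100.0 * sum(item_scores) / len(item_scores)

def index_of_discrimination(item_scores, total_scores, frac=0.27):
    """Upper-lower discrimination: proportion correct in the top-scoring group
    minus proportion correct in the bottom-scoring group (grouped by total score)."""
    n = len(total_scores)
    k = max(1, int(n * frac))  # conventional 27% group size; an assumption here
    order = sorted(range(n), key=lambda i: total_scores[i])
    low, high = order[:k], order[-k:]
    p_high = sum(item_scores[i] for i in high) / k
    p_low = sum(item_scores[i] for i in low) / k
    return p_high - p_low
```

On this scale, the study's mean difficulty of 63.0 means a typical item was answered correctly by about 63% of students, and a mean discrimination of 0.52 indicates items strongly favored high overall performers.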
