A core competency-based objective structured clinical examination (OSCE) can predict future resident performance.
ABSTRACT: This study evaluated the ability of an objective structured clinical examination (OSCE) administered in the first month of residency to predict future resident performance in the Accreditation Council for Graduate Medical Education (ACGME) core competencies.
Eighteen Postgraduate Year 1 (PGY-1) residents completed a five-station OSCE in the first month of postgraduate training. Performance was graded in each of the ACGME core competencies. At the end of 18 months of training, faculty evaluations of resident performance in the emergency department (ED) were used to calculate a cumulative clinical evaluation score for each core competency. The correlations between OSCE scores and clinical evaluation scores at 18 months were assessed on an overall level and in each core competency.
There was a statistically significant correlation between overall OSCE scores and overall clinical evaluation scores (R = 0.48, p < 0.05) and in the individual competencies of patient care (R = 0.49, p < 0.05), medical knowledge (R = 0.59, p < 0.05), and practice-based learning (R = 0.49, p < 0.05). No correlation was noted in the systems-based practice, interpersonal and communication skills, or professionalism competencies.
An OSCE administered early in residency can predict future postgraduate performance both globally and in specific core competencies. Used appropriately, such information can be a valuable tool for program directors in monitoring residents' progress and providing more tailored guidance.
- Available from: Gloria J Kuhn
ABSTRACT: The objective was to critically appraise and highlight medical education research studies published in 2010 that were methodologically superior and whose outcomes were pertinent to teaching and education in emergency medicine (EM). A search of the English-language literature in 2010 querying PubMed, Scopus, Education Resources Information Center (ERIC), and PsycINFO identified 41 EM studies that used hypothesis-testing or observational investigations of educational interventions. Five reviewers independently ranked all publications based on 10 criteria, including four related to methodology, chosen a priori to standardize evaluation; the same method was used previously to appraise medical education research published in 2008 and 2009. Five medical education research studies met the a priori criteria for inclusion and are reviewed and summarized here. Comparing the literature of 2010 with that of 2008 and 2009, the number of published educational research papers rose from 30 to 36 and then to 41, while the number of funded studies was 13 (2008), 16 (2009), and 9 (2010). As in past years, research involving the use of technology accounted for a significant share of publications (34%), including three of the five highlighted studies. This critical appraisal reviews and highlights five studies that met a priori quality indicators; current trends and common methodologic pitfalls in the 2010 papers are noted. Academic Emergency Medicine 10/2011; 18(10):1081-9.
-
ABSTRACT: Purpose: This retrospective study measured the correlation of student performance between 2 objective structured clinical examinations (OSCEs) and an introductory integrated clinical skills course that preceded the OSCEs. The hypothesis was that there would be a strong, positive correlation between the earlier-level examinations and the upper-level OSCE, high enough that the earlier examinations could be viewed as predictors of upper-level OSCE performance. Methods: Using student scores from 5 academic terms of upper-level OSCEs for 2008-2009 (n = 208) and the respective earlier scores, correlation coefficients were calculated for the upper-level OSCE and Clinical Skills course, and for the upper- and lower-level OSCEs. Multiple linear regression analysis was used to evaluate how well the lower-level OSCE and clinical skills scores, both as lone and combined independent variables, predicted the upper-level OSCE scores. Results: There was at least a moderate correlation between both sets of scores: r = .51 (p < .001) between the upper-level OSCE and the clinical skills course, and r = .54 (p < .001) between the upper- and lower-level OSCEs. A combination of clinical skills and lower-level OSCE scores gave a moderate prediction of upper-level OSCE scores (R² = .38). Conclusions: Correlations were at least moderate, and according to linear regression analysis, a combination of the earlier scores was moderately predictive of the upper-level OSCE. More research could be done to identify additional components of student performance. The Journal of Chiropractic Education 01/2012; 26(2):138-45.
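The statistics reported in these abstracts (Pearson r between pairs of scores, and R² from a two-predictor linear regression) can be sketched as follows. This is a minimal illustration on synthetic data, not the studies' actual scores; the variable names and generated values are assumptions made purely for demonstration.

```python
import numpy as np

# Synthetic cohort: n matches the chiropractic study's sample size,
# but the score distributions themselves are invented for illustration.
rng = np.random.default_rng(0)
n = 208
clinical_skills = rng.normal(75, 8, n)
lower_osce = 0.6 * clinical_skills + rng.normal(30, 6, n)
upper_osce = 0.4 * clinical_skills + 0.3 * lower_osce + rng.normal(20, 5, n)

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient between two score vectors."""
    x, y = x - x.mean(), y - y.mean()
    return float((x @ y) / np.sqrt((x @ x) * (y @ y)))

def r_squared(y, X):
    """R^2 of an ordinary least-squares fit of y on the columns of X plus an intercept."""
    A = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ beta
    return float(1 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean())))

# Pairwise correlation (analogous to the reported r values) and the
# combined-predictor fit (analogous to the reported R^2).
r1 = pearson_r(upper_osce, clinical_skills)
R2 = r_squared(upper_osce, np.column_stack([clinical_skills, lower_osce]))
print(round(r1, 2), round(R2, 2))
```

With an intercept included, R² always falls between 0 and 1 and equals the square of Pearson r in the single-predictor case, which is why the abstracts can compare the two directly.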
-
ABSTRACT: Background and objectives: The purpose of this paper is to describe the use of resident performance on an objective structured clinical examination (OSCE) as a tool to refine a mood disorders curriculum, and to disseminate a mood disorders OSCE for use in other residency settings. Methods: A depression-focused OSCE and a direct-observation evaluation tool were developed and implemented. A total of 24 first-year family medicine residents (PGY1) participated in the OSCE, and their performance was used to direct changes in a mood disorders curriculum. Results: Residents performed well on general interview behaviours, and 67% were able to uncover depression in a patient presenting with headaches. Fewer than 50% of the residents asked about suicidal ideation and recreational drug use. Curriculum content addressing the latter deficiencies was added. Conclusions: Tracking resident performance on specific behaviours during OSCE sessions can be used for curriculum evaluation. The mood disorders curriculum in other family medicine residency programmes can now be evaluated using our depression-focused OSCE and Clinical Performance Checklist. Mental Health in Family Medicine 01/2013; 10(1):45-51.