Article

The relationship between the National Board of Medical Examiners' prototype of the Step 2 clinical skills exam and interns' performance.

Department of Family Medicine, Medical University of South Carolina, Charleston, 29425, USA.
Academic Medicine (Impact Factor: 3.47). 06/2005; 80(5):496-501. DOI: 10.1097/00001888-200505000-00019
Source: PubMed

ABSTRACT: To examine how graduates' performances on a prototype of the National Board of Medical Examiners' Step 2 Clinical Skills (CS) examination, along with other undergraduate measures, relate to residency directors' ratings of their performance as interns.
Data were collected for the 2001 and 2002 graduates from the study institution. Checklist and interpersonal scores from the prototype Step 2 CS, along with United States Medical Licensing Examination (USMLE) Step 1 and 2 scores and undergraduate grade-point average (GPA), were correlated with residency directors' ratings (average score for six competencies, quartile ranking, and isolated interpersonal communication competency score). Stepwise linear regression was used to identify the best outcome predictors.
Quartile ranking was more highly correlated with GPA than with the Step 2 CS prototype interpersonal score, USMLE Step 2 score, USMLE Step 1 score, or Step 2 CS prototype checklist score. The average score on the residency directors' survey was likewise more highly correlated with GPA than with the USMLE Step 2 score, USMLE Step 1 score, Step 2 CS prototype interpersonal score, or Step 2 CS prototype checklist score. The best predictors for both quartile ranking and average competency score were GPA and the Step 2 CS prototype interpersonal score (R² = 0.26 and 0.28, respectively).
Both scores from the Step 2 CS prototype correlated significantly with interns' quartile ranking and average competency score. In the regression model, GPA and the Step 2 CS prototype interpersonal score alone accounted for most of the explained variance in performance.
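The predictor-selection procedure reported above can be illustrated with a minimal forward-selection sketch. This is not the authors' actual analysis: the data below are synthetic, the variable names are placeholders, and a simple R²-gain threshold stands in for whatever entry criterion the study's stepwise regression used.

```python
import numpy as np

def r_squared(X, y):
    """R-squared of an ordinary least-squares fit of y on X (with intercept)."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    total = y - y.mean()
    return 1.0 - (resid @ resid) / (total @ total)

def forward_stepwise(X, y, names, min_gain=0.01):
    """Greedy forward selection: repeatedly add the predictor that raises
    R-squared the most, stopping when no candidate adds at least min_gain."""
    selected, remaining, best_r2 = [], list(range(X.shape[1])), 0.0
    while remaining:
        r2, j = max((r_squared(X[:, selected + [j]], y), j) for j in remaining)
        if r2 - best_r2 < min_gain:
            break
        selected.append(j)
        remaining.remove(j)
        best_r2 = r2
    return [names[j] for j in selected], best_r2

# Synthetic data loosely shaped like the study's predictors (illustrative only).
rng = np.random.default_rng(0)
n = 120
gpa = rng.normal(3.3, 0.4, n)
cs_interp = rng.normal(75, 8, n)       # prototype interpersonal score
cs_checklist = rng.normal(70, 8, n)    # prototype checklist score
step1 = rng.normal(215, 20, n)
step2 = rng.normal(220, 20, n)
# Simulated residency-director rating driven mainly by GPA and interpersonal score.
rating = 0.8 * gpa + 0.03 * cs_interp + rng.normal(0, 0.5, n)

X = np.column_stack([gpa, cs_interp, cs_checklist, step1, step2])
names = ["GPA", "CS interpersonal", "CS checklist", "Step 1", "Step 2"]
picked, r2 = forward_stepwise(X, rating, names)
print("selected:", picked, "R² =", round(r2, 2))
```

Because the simulated rating is built mostly from GPA and the interpersonal score, the selector recovers those predictors first, mirroring the shape of the study's finding.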

  •
    ABSTRACT: In Korea, the clinical performance examination (CPX) has been part of the licensing examination for medical doctors since 2009, with the aim of improving medical students' clinical performance. This study evaluated the contribution of the CPX to medical education. Clinical competency in the differential diagnosis of secondary headache was compared between incoming interns in 2009, who had not been exposed to the CPX, and incoming interns in 2010, who had, using data on patients who visited the emergency department for headache (181 patients seen by 60 CPX-unexposed interns and 150 patients seen by 50 CPX-exposed interns). Data were obtained by reviewing electronic medical records and rosters of doctors. Clinical competency was assessed as the sensitivity and specificity of interns' diagnostic impressions against the final diagnosis. The association between CPX exposure and clinical competency in diagnosing secondary headache was evaluated using multiple logistic regression analysis. When clinical competency was assessed on the basis of all listed diagnostic impressions, sensitivity and specificity were 67.9% and 80.0%, respectively, for headaches seen by CPX-exposed interns, and 51.7% and 71.7%, respectively, for headaches seen by CPX-unexposed interns. Multivariable-adjusted logistic regression analysis showed that exposure to the CPX was not associated with increased competency in identifying secondary headache. Exposure to the CPX as part of the medical licensing examination was not effective in improving interns' clinical competency in identifying secondary headache.
    Korean Journal of Family Medicine 03/2014; 35(2):56-64. DOI:10.4082/kjfm.2014.35.2.56
  •
    ABSTRACT: Basic skills in oral/CMF surgery should be taught effectively to dental students, as surgical skills training is traditionally under-represented in the dental curriculum relative to its importance in daily clinical practice. Rigid curricular time frames and an increasingly condensed professional education call for new, effective teaching and examination formats. Starting in summer 2009, we aimed to transmit and assess clinical competence in oral/CMF surgery skills acquisition objectively (independent of subjective bias), reliably (repeatably, with inter-rater consistency), and validly (with representative, structured task selection). A small-group practical skills training (PST) day opened a one-week practical training course covering previously formulated learning objectives. An objective structured clinical examination (OSCE) was held at the end of each semester, in which theoretical background knowledge and clinical skills had to be demonstrated across a representative number of practical tasks (test stations). A first semester (26 students) received classical practical training alone and served as controls; the following semesters (171 students) had PST and formed the study group. All 197 students were assessed with OSCEs over a 3-year period. An instructor delivered the PST to pairs of students using presentations, videos, and hands-on practice, including mannequins. Content comprised history taking, communication, interpretation of laboratory/imaging diagnostics, structured clinical facial examination, fracture diagnosis, venipuncture, suturing, biopsy, and wire loops on pig jaws for manual and clinical skills, all of which were later incorporated into OSCE stations. Average OSCE results increased from 63.3 ± 9.7% before to 75.5 ± 10% after the inclusion of PST (p < 0.05). Knowledge diffusion between sittings on the same test date and between consecutive semesters was insignificant. Students and faculty rated their learning/teaching experience "very good" to "good". PST was effective in optimizing clinical skills as evaluated by OSCE.
    Journal of cranio-maxillo-facial surgery: official publication of the European Association for Cranio-Maxillo-Facial Surgery 09/2013; DOI:10.1016/j.jcms.2013.07.004 · 2.60 Impact Factor
  •
    ABSTRACT: Objective: We investigated correlations between residents' scores on the Jefferson Scale of Empathy (JSE), residents' perceptions of their empathy during standardized-patient encounters, and the perceptions of standardized patients. Methods: Participants were 214 first-year residents in internal medicine or family medicine from 13 residency programs taking a standardized patient-based clinical skills assessment in 2011. We analyzed correlations between residents' JSE scores; standardized patients' perspectives on residents' empathy during OSCE encounters, using the Jefferson Scale of Patient Perceptions of Physician Empathy; and residents' perspectives on their own empathy, using a modified version of this scale. Results: Residents' JSE scores correlated with their perceptions of their own empathy during encounters but correlated poorly with patients' assessments of resident empathy. Conclusion: The poor correlation between residents' and standardized patients' assessments of residents' empathy raises questions about residents' ability to gauge the effectiveness of their empathic communication. The study also points to a lack of congruence between the assessment of empathy by standardized patients and residents as receivers and conveyors of empathy, respectively. Practice implications: This study adds to the literature on empathy as a teachable skill set and raises questions about the use of OSCEs to assess trainee empathy.
    Patient Education and Counseling 07/2014; 96(1). DOI:10.1016/j.pec.2014.04.007 · 2.60 Impact Factor
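The headache study above scores interns' clinical competency as the sensitivity and specificity of their diagnostic impressions against the final diagnosis. A minimal sketch of that calculation, using hypothetical toy data rather than the study's records:

```python
def sensitivity_specificity(predicted, actual):
    """Compute (sensitivity, specificity) from parallel boolean lists,
    where True means 'secondary headache'."""
    tp = sum(p and a for p, a in zip(predicted, actual))          # true positives
    tn = sum((not p) and (not a) for p, a in zip(predicted, actual))  # true negatives
    fn = sum((not p) and a for p, a in zip(predicted, actual))    # missed positives
    fp = sum(p and (not a) for p, a in zip(predicted, actual))    # false alarms
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical toy data: interns' diagnostic impressions vs. final diagnoses.
impressions = [True, True, False, True, False, False, True, False]
final_dx    = [True, False, False, True, True, False, True, False]
sens, spec = sensitivity_specificity(impressions, final_dx)
print(f"sensitivity={sens:.2f}, specificity={spec:.2f}")  # → sensitivity=0.75, specificity=0.75
```

Sensitivity here is the fraction of true secondary headaches the intern flagged; specificity is the fraction of non-secondary headaches correctly left unflagged.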