ABSTRACT: A Council of Emergency Medicine Residency Directors task force developed the Standardized Direct Observation Assessment Tool (SDOT), a 26-item checklist assessment tool to evaluate Accreditation Council for Graduate Medical Education resident core competencies by direct observation. Each of the checklist items is assigned to one or more of five core competencies. The objective of this study was to test the interrater measurement properties of the SDOT instrument.
Two videos of simulated patient-resident-attending physician encounters were produced. Academic emergency medicine faculty members not involved in the development of the form viewed the two encounters and completed the SDOT for each, and faculty demographic data were collected. In total, 82 faculty members at 16 emergency medicine residency programs participated. The checklist items were used to generate a composite score for each core competency of patient care, medical knowledge, interpersonal and communication skills, professionalism, and systems-based practice.
Univariate analysis demonstrated a high degree of agreement among evaluators rating the residents in both videos. Multivariate analysis found no differences in ratings when faculty were examined by experience, academic title, site, or previous use of the SDOT.
Faculty from 16 emergency medicine residency programs showed high interrater agreement when using the SDOT to evaluate resident core competency performance. This study did not test the validity of the tool; the data analysis is mainly descriptive, and scripted video scenarios may not approximate direct observation in the emergency department.
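The abstract does not name the agreement statistic behind the reported interrater agreement. As an illustration only, the following minimal Python sketch computes mean pairwise percent agreement among raters on SDOT-style checklist items; the function name, item identifiers, and ratings are hypothetical and not drawn from the study.

    from itertools import combinations

    def pairwise_percent_agreement(ratings_by_item):
        # ratings_by_item: dict mapping a checklist item id -> list of
        # categorical ratings, one per faculty rater (hypothetical data).
        per_item = {}
        for item, ratings in ratings_by_item.items():
            pairs = list(combinations(ratings, 2))
            agreeing = sum(a == b for a, b in pairs)
            per_item[item] = agreeing / len(pairs) if pairs else float("nan")
        return per_item

    # Hypothetical ratings from three raters on two of the 26 checklist items.
    example = {
        "item_01": ["meets expectations", "meets expectations", "meets expectations"],
        "item_02": ["meets expectations", "below expectations", "meets expectations"],
    }
    print(pairwise_percent_agreement(example))  # item_01 -> 1.0, item_02 -> ~0.33

Fleiss' kappa or a similar chance-corrected statistic could be substituted for percent agreement; the study itself may have used a different analysis.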
Academic Emergency Medicine 08/2006; 13(7):727-32. Impact Factor 1.76.
ABSTRACT: Previous trials have shown a 10-30% rate of inaccuracies on applications to individual residency programs. No studies have attempted to corroborate this at a national level, and attempts by residency programs to reduce the frequency of inaccuracies on applications have not been reported. We seek to clarify the national incidence of inaccuracies on applications to emergency medicine residency programs.
This is a multi-center, single-blinded, randomized cohort study of all applicants from LCME-accredited schools to participating EM residency programs. Applications were randomly selected to investigate claims of AOA election, advanced degrees, and publications. Errors were reported to applicants' deans and the NRMP.
Nine residencies reviewed 493 applications (28.6% of all applicants who applied to any EM program). Fifty-six applications (11.4%; 95% CI 8.6-14.2%) contained at least one error; excluding "benign" errors, 9.8% (95% CI 7.2-12.4%) contained at least one error. Of all publications listed, 41% (95% CI 35.0-47.0%) contained an error. All AOA membership claims were verified, but 13.7% (95% CI 4.4-23.1%) of claimed advanced degrees were inaccurate. Interrater reliability of evaluations was good. Investigators were reluctant to notify applicants' deans' offices and the NRMP.
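The abstract does not state how the confidence intervals were calculated, but a normal-approximation (Wald) interval reproduces the reported 11.4% (95% CI 8.6-14.2%) for 56 erroneous applications out of 493. A minimal sketch, assuming that method:

    import math

    def wald_ci(successes, n, z=1.96):
        # Normal-approximation (Wald) confidence interval for a proportion.
        p = successes / n
        half_width = z * math.sqrt(p * (1 - p) / n)
        return p, p - half_width, p + half_width

    p, lower, upper = wald_ci(56, 493)
    print(f"{p:.1%} (95% CI {lower:.1%}-{upper:.1%})")  # 11.4% (95% CI 8.6%-14.2%)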
This is the largest study to date of accuracy on applications for residency and the first such multi-center trial. High rates of incorrect data were found on applications. These data will serve as a baseline for future years of the project, with emphasis on reporting inaccuracies and warning applicants of the project's goals.
BMC Medical Education 02/2005; 5:30. Impact Factor 1.41.