Daily encounter cards facilitate competency-based feedback while leniency bias persists

Department of Medicine and the Wilson Centre for Research in Education, University of Toronto, and Department of Emergency Medicine, St. Michael's Hospital, Toronto, Ontario, Canada.
Canadian Journal of Emergency Medicine (Impact Factor: 1.16). 02/2008; 10(1):44-50.
Source: PubMed


We sought to determine if a novel competency-based daily encounter card (DEC) that was designed to minimize leniency bias and maximize independent competency assessments could address the limitations of existing feedback mechanisms when applied to an emergency medicine rotation.
Learners in 2 tertiary academic emergency departments (EDs) presented a DEC to their teachers after each shift. DECs included dichotomous categorical rating scales (i.e., "needs attention" or "area of strength") for each of the 7 CanMEDS roles or competencies and an overall global rating scale. Teachers were instructed to choose which of the 7 competencies they wished to evaluate on each shift. Results were analyzed using both staff and resident as the units of analysis.
Over 28 months, 54 learners submitted a total of 801 DECs, which were completed by 43 different teachers. Teachers' patterns of selecting CanMEDS competencies to assess did not differ between the 2 sites. Teachers selected an average of 3 roles per DEC (range 0-7). Only 1.3% of ratings were "needs further attention." The frequency with which each competency was selected ranged from 25% (Health Advocate) to 85% (Medical Expert).
Teachers directed feedback toward a breadth of competencies, providing feedback on all 7 CanMEDS roles in the ED, yet demonstrated a marked leniency bias.

  • Source
    • "Several studies have examined the reliability of ECs that are organized around ad hoc designs of physician competence (Al-Jarallah et al. 2005; Brennan and Norman 1997; Richards et al. 2007; Turnbull et al. 2000). Bandiera and Lendrum (2008) reported an analysis of an end-of-shift EC based on the CanMEDS framework. However, the ratings were based on a 2-point scale—'Satisfactory' or 'Needs Further Attention.'"
    ABSTRACT: The purpose of this study was to determine the reliability of a computer-based encounter card (EC) to assess medical students during an emergency medicine rotation. From April 2011 to March 2012, multiple physicians assessed an entire medical school class during their emergency medicine rotation using the CanMEDS framework. At the end of an emergency department shift, an EC was scored (1-10) for each student on Medical Expert, 2 additional Roles, and an overall score. Analysis of 1,819 ECs (155 of 186 students) revealed the following: Collaborator, Manager, Health Advocate and Scholar were assessed on less than 25% of ECs. On average, each student was assessed 11 times with an inter-rater reliability of 0.6. The largest source of variance was rater bias. A D-study showed that a minimum of 17 ECs were required for a reliability of 0.7. There were moderate to strong correlations between all Roles and overall score, and the factor analysis revealed all items loading on a single factor, accounting for 87% of the variance. The global assessment of the CanMEDS Roles using ECs has significant variance in estimates of performance, derived from differences between raters. Some Roles are seldom selected for assessment, suggesting that raters have difficulty identifying related performance. Finally, correlation and factor analyses demonstrate that raters are unable to discriminate among Roles and are basing judgments on an overall impression.
    Advances in Health Sciences Education 01/2013; 18(5). DOI:10.1007/s10459-012-9440-6 · 2.12 Impact Factor
  • Source
    • "The papers included in the final selection for this review described a variety of methods for identifying and defining these outcomes (Harden et al. 1999b). The authors also collectively promoted the concept of 'progression of competence,' meaning that learners advance along a series of defined milestones on their way to the explicit outcome goals of training (theme 1a) (Lane and Ross 1994b; Bandiera & Lendrum 2008). This is articulated by Ben-David (1999): 'Outcome-based frameworks require a defined scheme of levels of progression towards the outcome.'"
    ABSTRACT: Competency-based education (CBE) has emerged in the health professions to address criticisms of contemporary approaches to training. However, the literature has no clear, widely accepted definition of CBE that furthers innovation, debate, and scholarship in this area. To systematically review CBE-related literature in order to identify key terms and constructs to inform the development of a useful working definition of CBE for medical education. We searched electronic databases and supplemented searches by using authors' files, checking reference lists, contacting relevant organizations and conducting Internet searches. Screening was carried out by duplicate assessment, and disagreements were resolved by consensus. We included any English- or French-language sources that defined competency-based education. Data were analyzed qualitatively and summarized descriptively. We identified 15,956 records for initial relevancy screening by title and abstract. The full text of 1,826 records was then retrieved and assessed further for relevance. A total of 173 records were analyzed. We identified 4 major themes (organizing framework, rationale, contrast with time, and implementing CBE) and 6 sub-themes (outcomes defined, curriculum of competencies, demonstrable, assessment, learner-centred and societal needs). From these themes, a new definition of CBE was synthesized. This is the first comprehensive systematic review of the medical education literature related to CBE definitions. The themes and definition identified should be considered by educators to advance the field.
    Medical Teacher 08/2010; 32(8):631-7. DOI:10.3109/0142159X.2010.500898 · 1.68 Impact Factor
  • Source
    ABSTRACT: This paper reports the results of a consensus conference of the Council of Emergency Medicine Residency Directors (CORD) to discuss the experiential training component of residency education in the emergency department (ED) and to make recommendations on structuring clinical training. Self-selected emergency medicine (EM) educators discussed experiential training focusing on three topic areas: 1) methods to optimize training in the clinical setting, 2) identification of goals and objectives by training year, and 3) determination of measurable behaviors demonstrating achievement of goals and objectives by residents. Topic areas were organized into the following questions: 1) what is the optimal number and evolution of ED shifts for EM residents during their residency training, 2) what clinical skills are expected of a resident at each level of training, and 3) what objective measures should be used to provide evidence of resident competency? Participants attended a lecture on the goals of the conference, the questions to be answered, and the role and implementation of deliberate practice into experiential training. Attendees were divided into three groups, each discussing one question. Each group had two discussion leaders. All discussions were digitally recorded for accuracy. After discussion all groups reconvened and reported summaries of discussions and recommendations to ensure group agreement. There were 59 participants representing 42 training programs. Educators agree that essential features of designing the ED clinical experience include the need to: 1) structure and tailor the clinical experience to optimize learning, 2) establish expectations for clinical performance based on year of training, and 3) provide feedback that is explicit to year-specific performance expectations. ACADEMIC EMERGENCY MEDICINE 2010; 17:S78–S86 © 2010 by the Society for Academic Emergency Medicine
    Academic Emergency Medicine 10/2010; 17 Suppl 2(s2):S78-86. DOI:10.1111/j.1553-2712.2010.00888.x · 2.01 Impact Factor
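The reliability figures in the Advances in Health Sciences Education abstract above are internally consistent. Projecting from an observed reliability of 0.6 over 11 ECs per student, the classical Spearman-Brown prophecy formula (used here as a simplified stand-in for the paper's actual D-study, whose variance-component estimates are not reported in the abstract) reaches approximately 0.7 at 17 ECs. A minimal sketch:

```python
def spearman_brown(r1: float, n: int) -> float:
    """Project the reliability of the mean of n parallel measurements
    from the reliability r1 of a single measurement."""
    return n * r1 / (1 + (n - 1) * r1)

def single_measure_reliability(rn: float, n: int) -> float:
    """Invert Spearman-Brown: recover the single-EC reliability from
    the reliability rn observed over n ECs."""
    return rn / (n - rn * (n - 1))

# Abstract reports reliability 0.6 over an average of 11 ECs per student.
r1 = single_measure_reliability(0.6, 11)
print(round(r1, 2))                      # single-EC reliability: 0.12

# Projected reliability at 17 ECs, the abstract's D-study minimum.
print(round(spearman_brown(r1, 17), 2))  # approximately 0.7
```

Under this simplification, each additional EC contributes diminishing gains in reliability, which is why a roughly 50% increase in observations (11 to 17) is needed to move from 0.6 to 0.7.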