The Pediatrics Milestones: Initial Evidence for Their Use as Learning Road Maps for Residents
Division of Emergency Medicine, Cincinnati Children's Hospital Medical Center, Cincinnati, Ohio.
Academic Pediatrics 11/2012; 13(1). DOI: 10.1016/j.acap.2012.09.003
As the next step in competency-based medical education, the Pediatrics Milestone Project seeks to provide a learner-centered approach to training and assessment. To help accomplish this goal, this study sought to determine how pediatric residents understand, interpret, and respond to the Pediatrics Milestones.
Cognitive interviews were conducted with 48 pediatric residents from all training levels at 2 training programs. Each participant reviewed one Pediatrics Milestone document (PMD). Eight Pediatrics Milestones, chosen for their range of complexity, length, competency domain, and primary author, were included in this study. Six residents, 2 from each year of residency training, reviewed each PMD. Interviews were transcribed and coded using inductive methods, and codes were grouped into emergent themes.
Four major themes emerged through coding and analysis: 1) the participants' degree of understanding of the PMDs is sufficient, often deep; 2) the etiology of participants' understanding is rooted in their experiences; 3) there are qualities of the PMD that may contribute to or detract from understanding; and 4) participants apply their understanding by noting the PMD describes a developmental progression that can provide a road map for learning. Additionally, we learned that residents are generally comfortable being placed in the middle of a series of developmental milestones. Two minor themes focusing on interest and practicality were also identified.
This study provides initial evidence for the Pediatrics Milestones as learner-centered documents that can be used for orientation, education, formative feedback, and, ultimately, assessment.
ABSTRACT: The Accreditation Council for Graduate Medical Education (ACGME) Outcome Project introduced 6 general competencies relevant to medical practice but fell short of its goal to create a robust assessment system that would allow program accreditation based on outcomes. In response, the ACGME, the specialty boards, and other stakeholders collaborated to develop educational milestones, observable steps in residents' professional development that describe progress from entry to graduation and beyond.
We summarize the development of the milestones, focusing on the 7 specialties moving to the Next Accreditation System in July 2013, and offer evidence of their validity.
Specialty workgroups with broad representation used a 5-level developmental framework and incorporated information from literature reviews, specialty curricula, dialogue with constituents, and pilot testing.
The workgroups produced richly diverse sets of milestones that reflect the community's consideration of attributes of competence relevant to practice in the given specialty. Both their development process and the milestones themselves establish a validity argument, when contemporary views of validity for complex performance assessment are used.
Initial evidence for validity emerges from the development processes and the resulting milestones. Further advancing a validity argument will require research on the use of milestone data in resident assessment and program accreditation.
Journal of Graduate Medical Education 03/2013; 5(1):98-106. DOI: 10.4300/JGME-05-01-33
ABSTRACT: In any evaluation system of medical trainees there is an underlying set of assumptions about what is to be evaluated (i.e., which goals reflect the values of the system or institution), what kinds of observations or assessments are useful to allow judgments, and how these are to be analyzed and compared to a standard of what is to be achieved by the learner. These assumptions can be conventionalized into a framework for evaluation. Frameworks encompass, or "frame," a group of ideas or categories to reflect the educational goals against which a trainee's level of competence or progress is gauged. Different frameworks provide different ways of looking at the practice of medicine and serve different purposes.

In the first place, frameworks should enable educators to determine to what extent trainees are ready for advancement, that is, whether the desired competence has been attained. They should provide both a valid mental model of competence and terms to describe successful performance, either at the end of training or as milestones during the curriculum. Consequently, such frameworks drive learning by giving learners a guide to what is expected. Frameworks should also enhance the consistency and reliability of ratings across staff and settings. Finally, they determine the content of, and resources needed for, rater training to achieve consistency of use. This is especially important in clinical rotations, in which reliable assessments have been most difficult to achieve.

Because the limitations of workplace-based assessment have persisted despite the use of traditional frameworks (such as those based on knowledge, skills, and attitudes), this Guide explores the assumptions and characteristics of traditional and newer frameworks. In this AMEE Guide, we distinguish between analytic, synthetic, and developmental frameworks. Analytic frameworks deconstruct competence into individual pieces to evaluate each separately. Synthetic frameworks attempt to view competence holistically, focusing evaluation on performance in real-world activities. Developmental frameworks focus on stages of, or milestones in, the progression toward competence. Most frameworks have one predominant perspective; some are hybrid in nature.
Medical Teacher 05/2013; 35(6). DOI: 10.3109/0142159X.2013.788789
Academic Pediatrics 03/2014; 14(2 Suppl):S1-3. DOI: 10.1016/j.acap.2013.11.019