Project

International CBME Collaborators (ICBMEC)

Updates: 0
Recommendations: 0
Followers: 32
Reads: 397

Project log

Jason R Frank
added 3 research items
There is an urgent need to capture the outcomes of the ongoing global implementation of competency-based medical education (CBME). However, the measurement of downstream outcomes following educational innovations such as CBME is fraught with challenges stemming from the complexities of medical training, the breadth and variability of inputs, and the difficulties of attributing outcomes to specific educational elements. In this article, we present a logic model for CBME to conceptualize an impact pathway relating to CBME and facilitate outcomes evaluation. We further identify six strategies to mitigate the challenges of outcomes measurement: (1) clearly identify the outcome of interest, (2) distinguish between outputs and outcomes, (3) carefully consider attribution versus contribution, (4) connect outcomes to the fidelity and integrity of implementation, (5) pay attention to unanticipated outcomes, and (6) embrace methodological pluralism. Embracing these challenges, we argue that careful and thoughtful evaluation strategies will move us forward in answering the all-important question: Are the desired outcomes of CBME being achieved?
Entrustable professional activities (EPAs) have emerged as a meaningful framework for achieving competency-based medical education (CBME). However, little is known about how to adapt EPAs to large-scale, multispecialty, system-wide implementations. The authors describe the design and experience of creating such a system based on EPAs and the Van Melle Core Components Framework for all accredited training programs under the auspices of the Royal College of Physicians and Surgeons of Canada. The resulting design is a unique configuration and use of EPAs, called Royal College EPAs. Others looking to implement EPAs for large-scale health professions education systems may want to consider this design approach.
Jason R Frank
added 3 research items
As the global transformation of postgraduate medical training continues, there are persistent calls for program evaluation efforts to understand the impact and outcomes of competency-based medical education (CBME) implementation. The measurement of a complex educational intervention such as CBME is challenging because of the multifaceted nature of activities and outcomes. What is needed, therefore, is an organizational taxonomy to both conceptualize and categorize multiple outcomes. In this manuscript we propose a taxonomy that builds on preceding works to organize CBME outcomes across three domains: focus (educational, clinical), level (micro, meso, macro), and timeline (training, transition to practice, practice). We also provide examples of how to conceptualize outcomes of educational interventions across medical specialties using this taxonomy. By proposing a shared language for outcomes of CBME, we hope that this taxonomy will help organize ongoing evaluation work and catalyze those seeking to engage in the evaluation effort to help understand the impact and outcomes of CBME.
With the adoption of competency-based medical education, assessment has shifted from the traditional classroom domains of knows and knows how to the workplace domain of does. This workplace-based assessment has two purposes: assessment of learning (summative feedback) and assessment for learning (formative feedback). What the trainee does becomes the basis for identifying growth edges and determining readiness for advancement and ultimately independent practice. High-quality workplace-based assessment programs require thoughtful choices about the framework of assessment, the tools themselves, the platforms used, and the contexts in which the assessments take place, with an emphasis on direct observation.
The COVID-19 pandemic has disrupted many societal institutions, including health care and education. Although the pandemic’s impact was initially assumed to be temporary, there is growing conviction that medical education might change more permanently. The International Competency-based Medical Education (ICBME) collaborators, scholars devoted to improving physician training, deliberated how the pandemic raises questions about medical competence. We formulated 12 broad-reaching issues for discussion, grouped into micro-, meso-, and macro-level questions. At the individual micro level, we ask questions about adaptability, coping with uncertainty, and the value and limitations of clinical courage. At the institutional meso level, we question whether curricula could include more than core entrustable professional activities (EPAs) and focus on individualized, dynamic, and adaptable portfolios of EPAs that, at any moment, reflect current competence and preparedness for disasters. At the regulatory and societal macro level, should conditions for licensing be reconsidered? Should rules of liability be adapted to match the need for rapid redeployment? We do not propose a blueprint for the future of medical training but rather aim to provoke discussions needed to build a workforce that is competent to cope with future health care crises.
Jason R Frank
added 2 research items
Background A key component of competency‐based medical education is workplace‐based assessment, which includes observation (direct or indirect) of residents. Direct observation has been emphasized as an ideal form of assessment, yet challenges have been identified that may limit its adoption. At present, it remains unclear how often direct and indirect observation are being used within the clinical setting. The objective of this study was to describe patterns of observation in an emergency medicine competency‐based program two years post-implementation. Methods Emergency medicine residents (n=19) recorded the type of observation they received (direct or indirect) following workplace‐based entrustable professional activity assessments from December 15, 2019 – April 30, 2020. Assessment forms were reviewed and analysed to describe patterns of observation. Results Assessments were collected on all 19 eligible residents (100% participation). A total of 1070 entrustable professional activity assessments were completed during the study period, of which 798 (74.6%) had the type of observation recorded. Of these recorded observations, 546 (68.4%) were directly observed and 252 (31.6%) were indirectly observed. The length of written comments contained within assessments following direct and indirect observation did not differ significantly. There was no significant association between resident gender and observation type or resident stage of training and observation type. Certain entrustable professional activity assessments showed a clear preference towards either direct or indirect observation. Conclusions To our knowledge, this study is the first to report patterns of observation in a competency‐based residency program. The results suggest that direct observation can be quickly adopted as the primary means of workplace‐based assessment.
Indirect observation comprised a sizeable minority of observations and may be an underrecognized contributor to workplace‐based assessment. The preference towards either direct or indirect observation for certain entrustable professional activity assessments suggests that the entrustable professional activity itself may influence the type of observation.
Medical education programs are failing to meet the health needs of patients and communities. Misalignments exist on multiple levels, including content (what trainees learn), pedagogy (how trainees learn), and culture (why trainees learn). To address these challenges effectively, competency-based assessment (CBA) for psychiatric medical education must simultaneously produce life-long learners who can self-regulate their own growth and trustworthy processes that determine and accelerate readiness for independent practice. The key to effectively doing so is situating assessment within a carefully designed system with several, critical, interacting components: workplace-based assessment, ongoing faculty development, learning analytics, longitudinal coaching, and fit-for-purpose clinical competency committees.
Jason R Frank
added a research item
The COVID-19 pandemic has disrupted healthcare systems around the world, impacting how we deliver medical education. The normal day-to-day routines have been altered for a number of reasons, including changes to scheduled training rotations, physical distancing requirements, trainee redeployment, and heightened levels of concern. Medical educators will likely need to adapt their programs to maximize learning, maintain effective care delivery, and ensure competent graduates. Along with a continued focus on learner/faculty wellness, medical educators will have to optimize existing training experiences, adapt those that are no longer viable, employ new technologies, and be flexible when assessing competencies. These practical tips offer guidance on how to adapt medical education programs within the constraints of the pandemic landscape, stressing the need for communication, innovation, collaboration, flexibility, and planning within the era of competency-based medical education.
Jason R Frank
added a research item
Background With the implementation of competency-based assessment systems, education programs are collecting increasing amounts of data about medical learners. However, learning analytics are rarely employed to use this data to improve medical education. Objective We identified outstanding issues that are limiting the effective adoption of learning analytics in medical education. Methods Participants at an international summit on learning analytics in medical education generated key questions that need to be addressed to move the field forward. Small groups formulated questions related to data stewardship, learner perspectives, and program perspectives. Three investigators conducted an inductive qualitative content analysis on the participant questions, coding the data by consensus and organizing it into themes. One investigator used the themes to formulate representative questions that were refined by the other investigators. Results Sixty-seven participants from 6 countries submitted 195 questions. From them, we identified 3 major themes: implementation challenges (related to changing current practices to collect data and utilize learning analytics); data (related to data collection, security, governance, access, and analysis); and outcomes (related to the use of learning analytics for assessing learners and faculty as well as evaluating programs and systems). We present the representative questions and their implications. Conclusions Our analysis highlights themes regarding implementation, data management, and outcomes related to the use of learning analytics in medical education. These results can be used as a framework to guide stakeholder education, research, and policy development that delineates the benefits and challenges of using learning analytics in medical education.
Jason R Frank
added 3 research items
Purpose: To characterize how professionalism concerns influence individual reviewers' decisions about resident progression using simulated competence committee (CC) reviews. Method: In April 2017, the authors conducted a survey of 25 Royal College of Physicians and Surgeons of Canada emergency medicine residency program directors and senior faculty who were likely to function as members of a CC (or equivalent) at their institution. Participants took a survey with 12 resident portfolios, each containing hypothetical formative and summative assessments. Six portfolios represented residents progressing as expected (PAE) and 6 represented residents not progressing as expected (NPAE). A professionalism variable (PV) was developed for each portfolio. Two counterbalanced surveys were developed in which 6 portfolios contained a PV and 6 portfolios did not (for each PV condition, 3 portfolios represented residents PAE and 3 represented residents NPAE). Participants were asked to make progression decisions based on each portfolio. Results: Without PVs, the consistency of participants giving scores of 1 or 2 (i.e., little or no need for educational intervention) to residents PAE and to those NPAE was 92% and 10%, respectively. When a PV was added, the consistency decreased by 34% for residents PAE and increased by 4% for those NPAE (P = .01). Conclusions: When reviewing a simulated resident portfolio, individual reviewer scores for residents PAE were responsive to the addition of professionalism concerns. Considering this, educators using a CC should have a system to report, collect, and document professionalism issues.
Purpose: Despite the broad endorsement of competency-based medical education (CBME), myriad difficulties have arisen in program implementation. The authors sought to evaluate the fidelity of implementation and identify early outcomes of CBME implementation using Rapid Evaluation to facilitate transformative change. Method: Case-study methodology was used to explore the lived experience of implementing CBME in the emergency medicine postgraduate program at Queen’s University, Canada, using iterative cycles of Rapid Evaluation in 2017–2018. After the intended implementation was explicitly described, stakeholder focus groups and interviews were conducted at 3 and 9 months post-implementation to evaluate the fidelity of implementation and early outcomes. Analyses were abductive, using the CBME core components framework and data-driven approaches to understand stakeholders’ experiences. Results: In comparing planned with enacted implementation, important themes emerged with resultant opportunities for adaptation. For example, lack of a shared mental model resulted in frontline difficulty with assessment and feedback, and a concern that the granularity of competency-focused assessment may result in “missing the forest for the trees,” prompting the return of global assessment. Resident engagement in personal learning plans was not uniformly adopted, and learning experiences tailored to residents’ needs were slow to follow. Conclusions: Rapid Evaluation provided critical insights into the successes and challenges of operationalizing CBME. Implementing the practical components of CBME was perceived as a sprint, while realizing the principles of CBME and changing culture in postgraduate training is a marathon requiring sustained effort in the form of frequent evaluation and continuous faculty and resident development.
Jason R Frank
added 2 research items
Changing the culture of residency training through faculty development - Volume 21 Issue 4 - Andrew K. Hall, Rob Woods, Jason R. Frank
Canadian specialist emergency medicine (EM) residency training is undergoing the most significant transformation in its history. This article describes the rationale, process, and redesign of EM competency-based medical education. The rationale for this evolution in residency education includes 1) improved public trust by increasing transparency of the quality and rigour of residency education, 2) improved fiscal accountability to government and institutions regarding specialist EM training, 3) improved assessment systems to replace poorly functioning end-of-rotation assessment reports and an overemphasis on high-stakes, end-of-training examinations, and 4) tailored learning for residents to address individualized needs. A working group with geographic and stakeholder representation convened over a 2-year period. A consensus process for decision-making was used. Four key design features of the new residency education design include 1) specialty EM-specific outcomes to be achieved in residency; 2) designation of four progressive stages of training, linked to required learning experiences and entrustable professional activities to be achieved at each stage; 3) tailored learning that provides residency programs and learners flexibility to adapt to local resources and learner needs; and 4) programmatic assessment that emphasizes systematic, longitudinal assessments from multiple sources, and sampling of sentinel abilities. Required future study includes a program evaluation of this complex education intervention to ensure that intended outcomes are achieved and unintended outcomes are identified.
Jason R Frank
added 2 research items
Purpose: Direct observation is essential to assess and provide feedback to medical trainees. However, calls for its increased use in medical training persist as learners report that direct observation occurs infrequently. This study applied a theory-driven approach to systematically investigate barriers and enablers to direct observation in residency training. Method: From September 2016 to July 2017, semi-structured interviews of faculty and residents at The Ottawa Hospital were conducted and analyzed. An interview guide based on the Theoretical Domains Framework (TDF) was used to capture 14 domains that may influence direct observation. Interview transcripts were independently coded using direct content analysis, and specific beliefs were generated by grouping similar responses. Relevant domains were identified based on the frequencies of beliefs reported, presence of conflicting beliefs, and perceived influence on direct observation practices. Results: Twenty-five interviews (12 resident, 13 faculty) were conducted, representing 10 specialties. Ten TDF domains were identified as influencing direct observation: knowledge, skills, beliefs about consequences, social/professional role and identity, intention, goals, memory/attention/decision processes, environmental context and resources, social influences, and behavioral regulation. Discord between faculty and resident intentions, coupled with social expectations that residents should be responsible for ensuring that observations occur, was identified as a key barrier. Additionally, competing demands identified across multiple TDF domains emerged as a pervasive theme. Conclusions: This study identified key barriers and enablers to direct observation. These influencing factors provide a basis for the development of potential strategies aimed at embedding direct observation as a routine pedagogical practice in residency training.
Purpose: The rapid adoption of competency-based medical education (CBME) provides an unprecedented opportunity to study implementation. Examining "fidelity of implementation"-that is, whether CBME is being implemented as intended-is hampered, however, by the lack of a common framework. This article details the development of such a framework. Method: A two-step method was used. First, a perspective indicating how CBME is intended to bring about change was described. Accordingly, core components were identified. Drawing from the literature, the core components were organized into a draft framework. Using a modified Delphi approach, the second step examined consensus amongst an international group of experts in CBME. Results: Two different viewpoints describing how a CBME program can bring about change were found: production and reform. Because the reform model was most consistent with the characterization of CBME as a transformative innovation, this perspective was used to create a draft framework. Following the Delphi process, five core components of CBME curricula were identified: outcome competencies, sequenced progression, tailored learning experiences, competency-focused instruction, and programmatic assessment. With some modification in wording, consensus emerged amongst the panel of international experts. Conclusions: Typically, implementation evaluation relies on the creation of a specific checklist of practices. Given the ongoing evolution and complexity of CBME, this work, however, focused on identifying core components. Consistent with recent developments in program evaluation, where implementation is described as a developmental trajectory toward fidelity, identifying core components is presented as a fundamental first step toward gaining a more sophisticated understanding of implementation.
Jason R Frank
added a research item
Introduction: The specialist Emergency Medicine (EM) postgraduate training program at Queen’s University implemented a new Competency-Based Medical Education (CBME) model on July 1, 2017. This occurred one year ahead of the national EM cohort, in the model of Competence By Design (CBD) as outlined by the Royal College of Physicians and Surgeons of Canada (RCPSC). This presents an opportunity to identify critical steps, successes, and challenges in the implementation process to inform ongoing national CBME implementation efforts. Methods: A case-study methodology with Rapid Cycle Evaluation was used to explore the lived experience of implementing CBME in EM at Queen’s, and capture evidence of behavioural change. Data were collected at 3 and 6 months post-implementation via multiple sources and methods, including field observations, document analysis, and interviews with key stakeholders: residents, faculty, the program director, the CBME lead, academic advisors, and competence committee members. Qualitative findings were triangulated with available quantitative electronic assessment data. Results: The critical processes of implementation have been outlined in 3 domain categories: administrative transition, resident transition, and faculty transition. Multiple themes emerged from stakeholder interviews, including: the need for holistic assessment beyond Entrustable Professional Activity (EPA) assessments, concerns about the utility of milestones in workplace-based assessment by front-line faculty, trepidation that CBME is adding to, rather than replacing, old processes, and a need for effective data visualisation and filtering for assessment decisions by competence committees. We identified a need for administrative direction and faculty development related to new roles and responsibilities and shared mental models of EPAs and entrustment scoring. Quantitative data indicate that the targeted number of assessments per EPA and stage of training may be too high.
Conclusion: Exploring the lived experience of implementing CBME from the perspectives of all stakeholders has provided early insights regarding the successes and challenges of operationalizing CBME on the ground. Our findings will inform ongoing local implementation and higher-level national planning by the Canadian EM Specialty Committee and other programs who will be implementing CBME in the near future.
Eric S Holmboe
added a research item
Assessment has always been an essential component of postgraduate medical education and for many years focused predominantly on various types of examinations. While examinations of medical knowledge, and more recently of clinical skills with standardized patients, can assess learner capability in controlled settings and provide a level of assurance for the public, persistent and growing concerns regarding quality of care and patient safety worldwide have raised the importance of and need for better work-based assessments. Work-based assessments, when done effectively, can more authentically capture the abilities of learners to actually provide safe, effective, patient-centered care. Furthermore, we have entered the era of interprofessional care, where effective teamwork among multiple health care professionals is now paramount. Work-based assessment methods are now essential in an interprofessional healthcare world. To better prepare learners for these newer competencies and the ever-growing complexity of healthcare, many postgraduate medical education systems across the globe have turned to outcomes-based models of education, codified through competency frameworks. This commentary provides a brief overview of key methods of work-based assessment, such as direct observation, multisource feedback, patient experience surveys, and performance measures, that are needed in a competency-based world that places a premium on educational and clinical outcomes. However, the full potential of work-based assessments will only be realized if postgraduate learners play an active role in their own assessment program. This will require a substantial culture change, and culture change only occurs through actions and changed behaviors.
Co-production offers a practical and philosophical approach to engaging postgraduate learners as active, intrinsically motivated agents in their own professional development, helping to change the learning culture and contributing to improved programmatic assessment in postgraduate training.
Eric S Holmboe
added a research item
The transition, if not transformation, to outcomes-based medical education likely represents a paradigm shift struggling to be realized. Paradigm shifts are messy and difficult but ultimately meaningful if done successfully. This struggle has engendered tension and disagreements, with many of these disagreements cast as either-or polarities. There is little disagreement, however, that the health care system is not effectively achieving the triple aim for all patients. Much of the tension and polarity revolve around how more effectively to prepare students and residents to work in and help change a complex health care system. Competencies were an initial attempt to facilitate this shift by creating frameworks of essential abilities needed by physicians. However, implementation of competencies has proven to be difficult. Entrustable professional activities (EPAs) in undergraduate and graduate medical education and Milestones in graduate medical education are recent concepts being tried and studied as approaches to guide the shift to outcomes. Their primary purpose is to help facilitate implementation of an outcomes-based approach by creating shared mental models of the competencies, which in turn can help to improve curricula and assessment. Understanding whether and how EPAs and Milestones effectively facilitate the shift to outcomes has been and will continue to be an iterative and ongoing reflective process across the entire medical education community using lessons from implementation and complexity science. In this Invited Commentary, the author reflects on what got the community to this point and some sources of tension involved in the struggle to move to outcomes-based education.
Jason R Frank
added a research item
Introduction: The Oral Case Presentation (OCP) has been described as a unique form of inter-physician communication integral to the practice of medicine and represents the foundation of trainee-supervisor interactions. In recent years, entrustment has been identified as an essential element of trainee supervision and learning. Despite the growing body of knowledge concerning entrustment in medical education, the influence of trust on the educational dynamic surrounding the OCP remains unknown. The objectives of this study were to (1) describe the complex nature of the OCP from the perspective of the supervisor and (2) explore the central role the OCP plays in the dyadic relationship between supervisor and trainee during the delivery of patient care. Methods: Using a constructivist grounded theory approach, semi-structured interviews were conducted from 2015 to 2016 with a purposive sample of attending Emergency Medicine (EM) physicians from the University of Ottawa. Transcripts were reviewed independently by two investigators using line-by-line coding and constant comparative analysis. Emerging concepts were coded and key themes identified through consensus. Theoretical sampling occurred until thematic saturation was reached. Results: Twenty-one attending EM physicians participated in this study (71% male). The mean number of years in practice was 14. The mean percentage of shifts with a trainee assigned was 86%. Factors relating to entrustment were identified as the principal influences on both the content of the OCP and decisions relating to trainee supervision during the OCP process. These factors included the trainee level, the trainee-supervisor relationship, the context and the task. The OCP was also found to play several important roles as supervisors balanced the delivery of patient care and trainee education. These roles were related to communication, teaching and trainee assessment. 
Conclusion: The OCP represents a core activity within the supervisor-trainee relationship in which trust plays a central role. Clinical supervisors value the OCP as a form of authentic assessment of skills and perceive it to be a key determinant in making entrustment decisions. Future studies designed to evaluate the utility of the OCP as an educational tool should consider entrustment as an essential element.
Jason R Frank
added 3 research items
Competency-based medical education (CBME), by definition, necessitates a robust and multifaceted assessment system. Assessment and the judgments or evaluations that arise from it are important at the level of the trainee, the program, and the public. When designing an assessment system for CBME, medical education leaders must attend to the context of the multiple settings where clinical training occurs. CBME further requires assessment processes that are more continuous and frequent, criterion-based, developmental, and work-based where possible; use assessment methods and tools that meet minimum requirements for quality; use both quantitative and qualitative measures and methods; and involve the wisdom of group process in making judgments about trainee progress. Like all changes in medical education, CBME is a work in progress. Given the importance of assessment and evaluation for CBME, the medical education community will need more collaborative research to address several major challenges in assessment, including "best practices" in the context of systems and institutional culture and how best to train faculty to be better evaluators. Finally, we must remember that expertise, not competence, is the ultimate goal. CBME does not end with graduation from a training program, but should represent a career that includes ongoing assessment.
With the introduction of Tomorrow's Doctors in 1993, medical education began the transition from a time- and process-based system to a competency-based training framework. Implementing competency-based training in postgraduate medical education poses many challenges but ultimately requires a demonstration that the learner is truly competent to progress in training or to the next phase of a professional career. Making this transition requires change at virtually all levels of postgraduate training. Key components of this change include the development of valid and reliable assessment tools such as work-based assessment using direct observation, frequent formative feedback, and learner self-directed assessment; active involvement of the learner in the educational process; and intensive faculty development that addresses curricular design and the assessment of competency.
Jason R Frank
added 9 research items
Competency-based medical education (CBME) is an approach to the design of educational systems or curricula that focuses on graduate abilities or competencies. It has been adopted in many jurisdictions, and in recent years an explosion of publications has examined its implementation and provided a critique of the approach. Assessment in a CBME context is often based on observations or judgments about an individual’s level of expertise; it emphasizes frequent, direct observation of performance along with constructive and timely feedback to ensure that learners, including clinicians, have the expertise they need to perform entrusted tasks. This paper explores recent developments since the publication in 2010 of Holmboe and colleagues’ description of CBME assessment. Seven themes regarding assessment that arose at the second invitational summit on CBME, held in 2013, are described: competency frameworks, the reconceptualization of validity, qualitative methods, milestones, feedback, assessment processes, and assessment across the medical education continuum. Medical educators interested in CBME, or assessment more generally, should consider the implications for their practice of the review of these emerging concepts.
For more than 60 years, competency-based education has been proposed as an approach to education in many disciplines. In medical education, interest in competency-based medical education (CBME) has grown dramatically in the last decade. This editorial introduces a series of papers that resulted from summits held in 2013 and 2016 by the International CBME Collaborators, a scholarly network whose members are interested in developing competency-based approaches to preparing the next generation of health professionals. An overview of the papers is given, as well as a summary of landmarks in the conceptual evolution and implementation of CBME. This series follows on a first collection of papers published by the International CBME Collaborators in Medical Teacher in 2010.
Medical education is under increasing pressure to more effectively prepare physicians to meet the needs of patients and populations. With its emphasis on individual, programmatic, and institutional outcomes, competency-based medical education (CBME) has the potential to realign medical education with this societal expectation. Implementing CBME, however, comes with significant challenges. This manuscript describes four overarching challenges that must be confronted by medical educators worldwide in the implementation of CBME: (1) the need to align all regulatory stakeholders in order to facilitate the optimization of training programs and learning environments so that they support competency-based progression; (2) the purposeful integration of efforts to redesign both medical education and the delivery of clinical care; (3) the need to establish expected outcomes for individuals, programs, training institutions, and health care systems so that performance can be measured; and (4) the need to establish a culture of mutual accountability for the achievement of these defined outcomes. In overcoming these challenges, medical educators, leaders, and policy-makers will need to seek collaborative approaches to common problems and to learn from innovators who have already successfully made the transition to CBME.
Jason R Frank
added a research item
The International Competency-Based Medical Education (ICBME) Collaborators have been working since 2009 to promote understanding of competency-based medical education (CBME) and accelerate its uptake worldwide. This article presents a charter, supported by a literature-based rationale, which is meant to provide a shared mental model of CBME that will serve as a path forward in its widespread implementation. At a 2013 summit, the ICBME Collaborators laid the groundwork for this charter. Here, the fundamental principles of CBME and the professional responsibilities of medical educators in its implementation process are described. The authors outline three fundamental principles: (1) Medical education must be based on the health needs of the populations served; (2) the primary focus of education and training should be the desired outcomes for learners rather than the structure and process of the educational system; and (3) the formation of a physician should be seamless across the continuum of education, training, and practice. Building on these principles, medical educators must demonstrate commitment to teaching, assessing, and role modeling the range of identified competencies. In the clinical setting, they must provide supervision that balances patient safety with the professional development of learners, being transparent with stakeholders about the level of supervision needed. They must use effective and efficient assessment strategies and tools to base transition decisions on competence rather than time in training, empowering learners to be active participants in their learning and assessment. Finally, advancing CBME requires program evaluation and research, faculty development, and a collaborative approach to realize its full potential.
Sarah Taber
added a research item
At their 2009 consensus conference, the International CBME Collaborators proposed a number of central tenets of CBME in order to advance the field of medical education. Although the proposed conceptualization of CBME offers several advantages and opportunities, including a greater emphasis on outcomes, a mechanism for the promotion of learner-centred curricula, and the potential to move away from time-based training and credentialing in medicine, it is also associated with several significant barriers to adoption. This paper examines the concepts of CBME through a broad educational policy lens, identifying considerations for medical education leaders, health care institutions, and policy-makers at both the meso (program, institutional) and macro (health care system, inter-jurisdictional, and international) levels. Through this analysis, it is clear that CBME is associated with a number of complex challenges and questions, and cannot be considered in isolation from the complex systems in which it functions. Much more work is needed to engage stakeholders in dialogue, to debate the issues, and to identify possible solutions.
Jason R Frank
added 2 research items
Competency-based medical education (CBME) aims to bring about the sequential acquisition of competencies required for practice. Although it is being adopted in centers of medical education around the globe, there is little evidence concerning whether, in comparison with traditional methods, CBME produces physicians who are better prepared for the practice environment and contributes to improved patient outcomes. Consequently, the authors, an international group of collaborators, wrote this article to provide guidance regarding the evaluation of CBME programs. CBME is a complex service intervention consisting of multiple activities that contribute to the achievement of a variety of outcomes over time. For this reason, it is difficult to apply traditional methods of program evaluation, which require conditions of control and predictability, to CBME. To address this challenge, the authors describe an approach that makes explicit the multiple potential linkages between program activities and outcomes. Referred to as contribution analysis (CA), this theory-based approach to program evaluation provides a systematic way to make credible causal claims under conditions of complexity. Although CA has yet to be applied to medical education, the authors describe how a six-step model and a postulated theory of change could be used to examine the link between CBME, physicians' preparation for practice, and patient care outcomes. The authors argue that adopting the methods of CA, particularly the rigor in thinking required to link program activities, outcomes, and theory, will serve to strengthen understanding of the impact of CBME over time. © 2016 by the Association of American Medical Colleges
Although competency-based medical education (CBME) has attracted renewed interest in recent years among educators and policy-makers in the health care professions, there is little agreement on many aspects of this paradigm. We convened a unique partnership - the International CBME Collaborators - to examine conceptual issues and current debates in CBME. We engaged in a multi-stage group process and held a consensus conference with the aim of reviewing the scholarly literature of competency-based medical education, identifying controversies in need of clarification, proposing definitions and concepts that could be useful to educators across many jurisdictions, and exploring future directions for this approach to preparing health professionals. In this paper, we describe the evolution of CBME from the outcomes movement in the 20th century to a renewed approach that, focused on accountability and curricular outcomes and organized around competencies, promotes greater learner-centredness and de-emphasizes time-based curricular design. In this paradigm, competence and related terms are redefined to emphasize their multi-dimensional, dynamic, developmental, and contextual nature. CBME therefore has significant implications for the planning of medical curricula and will have an important impact in reshaping the enterprise of medical education. We elaborate on this emerging CBME approach and its related concepts, and invite medical educators everywhere to enter into further dialogue about the promise and the potential perils of competency-based medical curricula for the 21st century.