The role of assessment in competency-based medical education
Full-text · DOI available · Available from: Jason R Frank, John Burkhardt
- "Competence is a holistic judgment that incorporates knowledge, skills and attitudes that are demonstrated in the context of practice and are influenced by the context of practice and learning. The complexities of CBE assessment and the practical and theoretical implications are receiving greater attention from educators, researchers and psychometricians (Rethans et al. 2002; Norcini 2005; Holmboe et al. 2010; Schuwirth & Ash 2013). From these analyses, several assessment issues emerge that should be carefully considered by any CBE program: Assessment data need to be collected frequently, even continuously. "
ABSTRACT: There is a growing demand for health sciences faculty with formal training in education. Addressing this need, the University of Michigan Medical School created a Master in Health Professions Education (UM-MHPE). The UM-MHPE is a competency-based education (CBE) program targeting professionals. The program is individualized and adapts to each learner's situation through personal mentoring. Critical to CBE is an assessment process that accurately and reliably determines a learner's competence in educational domains. The program's assessment method has two principal components: an independent assessment committee and a learner repository. Learners submit evidence of competence that is evaluated by three independent assessors. The assessments are presented to an Assessment Committee, which determines whether the submission provides evidence of competence. The learner receives feedback on the submission and, if needed, the actions required to reach competency. During the program's first year, six learners presented 10 submissions for review. Assessing learners in a competency-based program has created challenges; setting standards that are not readily quantifiable is difficult. However, we argue that this is a more genuine form of assessment and that the process could be adapted for use within most competency-based formats. While our approach is demanding, we document practical learning outcomes that assess competence. Medical Teacher 06/2015; DOI:10.3109/0142159X.2015.1047754
- "In order to successfully implement a competency-based model of education, one must ensure that robust assessment modalities are available to assess trainees and provide them with feedback to promote continuous improvement. In 2010, a paper by the International CBME Collaborators outlined suggestions for the role of assessment in CBME (Holmboe et al. 2010). They highlighted the need for the use of quality assessment tools, meaning that the tools should be reliable, have an educational impact, be acceptable, be cost effective, and should provide evidence for the validity of scores obtained from their use. "
ABSTRACT: The purpose of this study was to explore the use of an objective structured clinical examination for Internal Medicine residents (IM-OSCE) as a progress test for clinical skills. Data from eight administrations of an IM-OSCE were analyzed retrospectively. Data were scaled to a mean of 500 and standard deviation (SD) of 100. A time-based comparison, treating post-graduate year (PGY) as a repeated-measures factor, was used to determine how residents' performance progressed over time. Residents' total IM-OSCE scores (n = 244) increased over training from a mean of 445 (SD = 84) in PGY-1 to 534 (SD = 71) in PGY-3 (p < 0.001). In an analysis of sub-scores, including only those who participated in the IM-OSCE for all three years of training (n = 46), mean structured oral scores increased from 464 (SD = 92) to 533 (SD = 83) (p < 0.001), physical examination scores increased from 464 (SD = 82) to 520 (SD = 75) (p < 0.001), and procedural skills scores increased from 495 (SD = 99) to 555 (SD = 67) (p = 0.033). There was no significant change in communication scores (p = 0.97). The IM-OSCE can be used to demonstrate progression of clinical skills throughout residency training. Although most of the clinical skills assessed improved as residents progressed through their training, communication skills did not appear to change. Medical Teacher 04/2015; DOI:10.3109/0142159X.2015.1029895
- "However, our experience shows that students can be more dynamic, active, demanding, flexible, autonomous, critical, and responsible when they are supported by an appropriate learning tool. It is expected that further development of the e-Portfolio will improve the achievement of competence by use of this unique combination of quantitative and formative assessments [36,37]. "
ABSTRACT: Background We evaluated a newly designed electronic portfolio (e-Portfolio) that provided quantitative evaluation of surgical skills. Medical students at the University of Seville used the e-Portfolio on a voluntary basis for evaluation of their performance in undergraduate surgical subjects. Methods Our new web-based e-Portfolio was designed to evaluate surgical practical knowledge and skills targets. Students recorded each activity on a form, attached evidence, and added their reflections. Students self-assessed their practical knowledge using qualitative criteria (yes/no), and graded their skills according to complexity (basic/advanced) and participation (observer/assistant/independent). A numerical value was assigned to each activity, and the values of all activities were summed to obtain the total score. The application automatically displayed quantitative feedback. We performed qualitative evaluation of the perceived usefulness of the e-Portfolio and quantitative evaluation of the targets achieved. Results Thirty-seven of 112 students (33%) used the e-Portfolio, of whom 87% reported that they understood the methodology of the portfolio. All students reported an improved understanding of their learning objectives resulting from the numerical visualization of progress, all students reported that the quantitative feedback encouraged their learning, and 79% of students felt that their teachers were more available because they were using the e-Portfolio. Only 51.3% of students reported that the reflective aspects of learning were useful. Individual students achieved a maximum of 65% of the total targets and 87% of the skills targets. The mean total score was 345 ± 38 points. For basic skills, 92% of students achieved the maximum score for participation as an independent operator, and all achieved the maximum scores for participation as an observer and assistant. For complex skills, 62% of students achieved the maximum score for participation as an independent operator, and 98% achieved the maximum scores for participation as an observer or assistant. Conclusions Medical students reported that use of an electronic portfolio that provided quantitative feedback on their progress was useful when the number and complexity of targets were appropriate, but not when the portfolio offered only formative evaluations based on reflection. Students felt that use of the e-Portfolio guided their learning process by indicating knowledge gaps to themselves and their teachers. BMC Medical Education 05/2013; 13(1):65. DOI:10.1186/1472-6920-13-65
Questions & Answers about this publication
- Could anyone refer me to some interesting empirical studies on the use of competency-based curricula in a university setting?
Any other empirical studies related to this issue would also be of interest.