Norcini J, Burch V. Workplace-based assessment as an educational tool: AMEE Guide No. 31

Foundation for Advancement of International Medical Education and Research, Philadelphia 19104, USA.
Medical Teacher (Impact Factor: 1.68). 12/2007; 29(9):855-71. DOI: 10.1080/01421590701775453
Source: PubMed


Background: There has been concern that trainees are seldom observed, assessed, and given feedback during their workplace-based education. This has led to increasing interest in a variety of formative assessment methods that require observation and offer the opportunity for feedback.
Aims: To review the literature on the efficacy and prevalence of formative feedback, describe the common formative assessment methods, characterize the nature of feedback, examine the effect of faculty development on its quality, and summarize the challenges still faced.
Results: The research literature on formative assessment and feedback suggests that it is a powerful means of changing trainee behaviour. Several methods of assessment have been developed, and there is preliminary evidence of their reliability and validity. A variety of factors enhance the efficacy of workplace-based assessment, including the provision of feedback that is consistent with the needs of the learner and focused on important aspects of the performance. Faculty play a critical role, and successful implementation requires that they receive training.
Conclusions: There is a need for formative assessment that offers trainees the opportunity for feedback. Several good methods exist, and feedback has been shown to have a major influence on learning. The critical role of faculty is highlighted, as is the need for strategies to enhance their participation and training.

    • "A structured interview is used to elicit the trainee's rationalization and determinants for data gathering, problem solving, and patient management. Work-based assessment using the patient's chart attends to the base of Miller's pyramid: ''knows'' and ''knows how'', which conceptualizes the recall of knowledge and the application of the knowledge to problem solving and clinical decisions, respectively (Miller 1990; Norcini & Burch 2007). "
    ABSTRACT: Objectives: The objective of this review is to summarize and critically appraise existing evidence on the use of chart-stimulated recall (CSR) and case-based discussion (CBD) as assessment tools for medical trainees. Methods: Medline, Embase, CINAHL, PsycINFO, the Educational Resources Information Centre (ERIC), Web of Science, and the Cochrane Central Register of Controlled Trials were searched for original articles on the use of CSR or CBD as an assessment method for trainees in all medical specialties. Results: Four qualitative and three observational non-comparative studies were eligible for this review. The number of patient-chart encounters needed to achieve sufficient reliability varied across studies. None of the included studies evaluated the content validity of the tool. Both trainees and assessors expressed a high level of satisfaction with the tool; however, inadequate training, differing interpretations of the scoring scales, and the skills needed to give feedback were identified as limitations in conducting the assessment. Conclusion: There is still no compelling evidence for the use of the patient's chart to evaluate medical trainees in the workplace. A body of evidence that is valid and reliable, and that documents the educational effect, is needed to support the use of patients' charts to assess medical trainees.
    Medical Teacher 02/2015; 37(S1):1-6. DOI:10.3109/0142159X.2015.1006599
    • "… assessment by linking feedback and reflections collected in the feedback unit to concrete and observable learning outcomes in the checklist, which can be discussed during assessment meetings at the midpoint and end of the internship. As such, the checklist also helps to ground summative assessment in indicators of observed student performance (Norcini and Burch, 2007; Palmer and Devitt, 2008). Having explored students' perceptions of the MAFI in an earlier study, we conducted the present study to explore the relevance of the four goals of the MAFI for facilitating self-directed learning in the clinical workplace from the perspective of the supervisors. "
    ABSTRACT: Background Self-directed learning is an educational concept that has received increasing attention. The recent workplace literature, however, reports problems with the facilitation of self-directed learning in clinical practice. We developed the Midwifery Assessment and Feedback Instrument (MAFI) as a framework to facilitate self-directed learning. In the present study, we sought clinical supervisors' perceptions of the usefulness of the MAFI. Methods Interviews with fifteen clinical supervisors were audiotaped, transcribed verbatim, and analyzed thematically using Atlas-Ti software for qualitative data analysis. Results Four themes emerged from the analysis: (1) the competency-based educational structure promotes the setting of realistic learning outcomes and a focus on competency development, (2) instructing students to write reflections facilitates student-centred supervision, (3) creating a feedback culture is necessary to achieve continuity in supervision, and (4) integrating feedback and assessment may facilitate competency development, on the condition that evidence is discussed during assessment meetings. Supervisors stressed the need for direct observation and for instruction on how to facilitate a self-directed learning process. Conclusion The MAFI appears to be a useful framework for promoting self-directed learning in clinical practice. Its effect can be advanced by creating a feedback and assessment culture in which learners and supervisors share responsibility for developing self-directed learning.
    Nurse Education in Practice 08/2014; 14(4). DOI:10.1016/j.nepr.2014.01.015
    • "Research seems to indicate improved learning in academic results, enhanced patient-centredness, greater exposure to normal conditions, and more meaningful relationships with patients and academic mentors [47]. Another example of where sociocultural learning theory is influential is the recent attention to work-based assessment [48]. The intent is to provide more meaningful feedback to learners from the professionals involved in the workplace [49], and to drive self-directed learning through reflection and dialogue with respected supervisors in enduring relationships that guide learning [13]. "
    ABSTRACT: Educational practice and educational research are not aligned with each other. Current educational practice relies heavily on information transmission, or content delivery, to learners, yet evidence shows that delivery is only a minor part of learning. To illustrate the directions we might take to find better educational strategies, six areas of educational evidence are briefly reviewed. The flipped classroom idea is proposed to shift our expenditure and focus in education: all information delivery could be web-distributed, creating more time for other, more expensive educational strategies that support the learner. In research, our focus should shift from comparing one curriculum with another to research that explains why things work in education and under which conditions. This may generate ideas for creative designers to develop new educational strategies. These best practices should be shared and further researched. At the same time, attention should be paid to implementation and to the realization that teachers learn in a way very similar to the people they teach. If we take the evidence seriously, our educational practice will look quite different to the way it does now.
    06/2014; 3(3). DOI:10.1007/s40037-014-0129-9
