Efficacy of high-fidelity simulation debriefing on the performance of practicing anaesthetists in simulated scenarios.

Department of Anesthesia, Women's College Hospital, 76 Grenville St., Toronto, ON, Canada.
BJA British Journal of Anaesthesia (Impact Factor: 4.24). 08/2009; 103(4):531-7. DOI: 10.1093/bja/aep222
Source: PubMed

ABSTRACT: Research into adverse events in hospitalized patients suggests that a significant number are preventable. The purpose of this randomized, controlled study was to determine whether simulation-based debriefing improved the performance of practicing anaesthetists managing high-fidelity simulation scenarios.
The anaesthetists were randomly allocated to Group A (simulation debriefing), Group B (home study), or Group C (no intervention), with a secondary randomization to one of two scenarios. Six to nine months later, subjects returned to manage the alternate scenario. Facilitators blinded to study group allocation completed a dichotomously scored performance checklist (DSC) and a Global Rating Scale of Performance (GRS). Two non-expert raters were trained and assessed all videotaped performances.
An interim analysis indicated no difference between Groups B and C, which were therefore merged into a single control group. Seventy-four subjects were recruited, with 58 complete data sets available. There was no significant effect of group on pre-test scores. A significant improvement was seen between pre- and post-tests on the DSC in debriefed subjects (pre-test 66.8%, post-test 70.3%; F(1,57)=4.18, P=0.046). Both groups showed significant improvement in the GRS over time (F(1,57)=5.94, P=0.018), but with no significant difference between the groups.
We found a modest improvement in performance on a DSC in the debriefed group and overall improvement in both control and debriefed groups using a GRS. Whether this improvement translates into clinical practice has yet to be determined.

  •
    ABSTRACT: Simulation has long been integrated in anaesthesiology training, yet a comprehensive review of its effectiveness is presently lacking. Using meta-analysis and critical narrative analysis, we synthesized the evidence for the effectiveness of simulation-based anaesthesiology training. We searched MEDLINE, ERIC, and SCOPUS through May 2011 and included studies using simulation to train health professional learners. Data were abstracted independently and in duplicate. We included 77 studies (6066 participants). Compared with no intervention (52 studies), simulation was associated with moderate to large pooled effect sizes (ESs) for all outcomes (ES range 0.60-1.05) except for patient effects (ES -0.39). Compared with non-simulation instruction (11 studies), simulation was associated with moderate effects for satisfaction and skills (ES 0.39 and 0.42, respectively), large effect for behaviours (1.77), and small effects for time, knowledge, and patient effects (-0.18 to 0.23). In 17 studies comparing alternative simulation interventions, training in non-technical skills (e.g. communication) and medical management compared with training in medical management alone was associated with negligible effects for knowledge and skills (four studies, ES range 0.14-0.15). Debriefing using multiple vs single information sources was associated with negligible effects for time and skills (three studies, ES range -0.07 to 0.09). Our critical analysis showed inconsistency in measurement of non-technical skills and consistency in the (ineffective) design of debriefing. Simulation in anaesthesiology appears to be more effective than no intervention (except for patient outcomes) and non-inferior to non-simulation instruction. Few studies have clarified the key instructional designs for simulation-based anaesthesiology training.
    British Journal of Anaesthesia (Impact Factor: 4.24). 12/2013.
  •
    ABSTRACT: Background: Debriefing as part of the simulation experience is regarded as essential for learning. Evidence concerning best debriefing practices from the standpoint of a student nurse participant is minimal, particularly when comparing debriefing types. This study evaluated the differences in the student experience between two debriefing types: debriefing with video and debriefing without video (debriefing alone). Method: Nursing students participating in an intensive care simulation were randomized into one of the two debriefing types, debriefing with video (n = 32) or debriefing alone (n = 32), following simulation completion. After debriefing, students completed a debriefing experience scale designed to evaluate the nursing student experience during debriefing. Results: Statistically significant differences were found in only 3 of 20 items on the Debriefing Experience Scale. Debriefing with video had higher means on two items, "Debriefing helped me to make connections between theory and real-life situations" (p = .007) and "I had enough time to debrief thoroughly" (p = .039). Debriefing alone had a higher mean on one item, "The debriefing session facilitator was an expert in the content area" (p = .006). Conclusion: Students identified learning as part of their experience with both debriefing types. Although a few differences exist, nursing students reported overall that their experiences were minimally different between debriefing with video and debriefing alone.
    Clinical Simulation in Nursing 12/2013; 9(12):e585–e591.
  •
    ABSTRACT: The objective of this review was to identify, appraise and synthesise the best available evidence for the effectiveness of debriefing as it relates to simulation-based learning for health professionals. Simulation is defined as a technique used to replace or amplify real experiences with guided experiences that evoke or replace substantial aspects of the real world in a fully interactive manner. The use of simulation for health professional education began decades ago with the use of low-fidelity simulations and has evolved at an unprecedented pace. Debriefing is considered by many to be an integral and critical part of the simulation process. However, different debriefing approaches have developed with little objective evidence of their effectiveness. Studies that evaluated the use of debriefing for the purpose of simulation-based learning for health professionals were included. Simulation studies not involving health professionals and those conducted in other settings, such as military or aviation, were excluded. A review protocol outlining the inclusion and exclusion criteria was submitted, peer reviewed by the Joanna Briggs Institute (JBI) for Evidence Based Practice, and approved prior to undertaking the review. A comprehensive search of studies published between January 2000 and September 2011 was conducted across ten electronic databases. Two independent reviewers assessed each paper prior to inclusion or exclusion using the standardised critical appraisal instruments for evidence of effectiveness developed by the Joanna Briggs Institute. Ten randomised controlled trials involving various debriefing methods were included in the review. Meta-analysis was not possible because of the different outcomes, control groups and interventions in the selected studies. The methods of debriefing included: post-simulation debriefing, in-simulation debriefing, instructor-facilitated debriefing and video-assisted instructor debriefing.
In the included studies there was a statistically significant improvement from pre-test to post-test in the performance of technical and non-technical skills such as vital signs assessment, psychomotor skills, cardiopulmonary resuscitation, task management, team working, and situational awareness, regardless of the type of debriefing conducted. Additionally, only one study reported consistent improvement in these outcomes with the use of video playback during debriefing. In two studies the effect of the debrief was evident months after the initial simulation experiences. These results support the widely held assumption that debriefing is an important component of simulation. It is recommended therefore that debriefing remain an integral component of all simulation-based learning experiences. However, the fact that there were no clinical or practical differences in outcomes when instructor-facilitated debriefing was enhanced by video playback is an important finding, since this approach is currently considered to be the 'gold standard' for debriefing. This finding therefore warrants further research.
    Nurse Education Today (Impact Factor: 0.91). 10/2013.