Article

Faculty Development for Simulation Programs: Five Issues for the Future of Debriefing Training

Author affiliation:
  • Alberta Children's Hospital, University of Calgary

Abstract

Statement: Debriefing is widely recognized as a critically important element of simulation-based education. Simulation educators obtain and/or seek debriefing training from various sources, including workshops at conferences, simulation educator courses, formal fellowships in debriefing, or advanced degrees. Although there are many options available for debriefing training, little is known about how faculty development opportunities should be structured to maintain and enhance the quality of debriefing within simulation programs. In this article, we discuss 5 key issues to help shape the future of debriefing training for simulation educators, specifically the following: (1) Are we teaching the appropriate debriefing methods? (2) Are we using the appropriate methods to teach debriefing skills? (3) How can we best assess debriefing effectiveness? (4) How can peer feedback of debriefing be used to improve debriefing quality within programs? (5) How can we individualize debriefing training opportunities to the learning needs of our educators?




... Without formalized training and resources to develop educator skill in SBE, variable methods and inconsistent quality of debriefings can occur. 13,14 Paige et al 15 surveyed faculty from the American College of Surgeons and found that time and funding were the largest barriers to educator training in the use of SBE. Almeida et al 16 reviewed trends in simulation research and noted that studies mention the importance of trained educators but found a lack of research investigating what constitutes educator training. ...
... However, the types of debriefing approaches taught and the frequency of training remain unclear. Cheng et al 13,14 suggested teaching various debriefing approaches in a blended format to allow the educator to adjust the debriefing based on the needs of the learners. Bradley 38 investigated retention of debriefing skills after training and found that educators applied only 65% of the skills learned when debriefing. ...
... Explicitly defining and clearly communicating expectations, such as timing and format, and ensuring confidentiality of all peer feedback reinforced trust and transparency. 13,69 To establish peer feedback processes, it was recommended that peer coaches have equal levels of training and experience. Use of peer feedback to evaluate debriefing skill was frequently reported in the literature. ...
Statement: Formal training for educators who use simulation-based education (SBE) is required by standards of best practice, simulation guidelines, and regulatory and accrediting bodies. Training efforts to establish educator competency for SBE are being offered; however, a systematic review of this body of literature has yet to be conducted. The purpose of this integrative review was to appraise formal training efforts for educators who use SBE. The aims were to summarize the training topics, describe the structure of training programs, and explore methods of evaluating educators. The New World Kirkpatrick Model guided the review. A PRISMA search approach yielded 2007 citations, of which 38 met inclusion criteria. Analysis supports a formalized training process that uses a combination of didactic material, time for repetitive practice, and ongoing feedback with longitudinal and scaffolded delivery approaches. An identified gap in the literature is threshold levels for determining competency of educators. Recommendations for planning simulation training programs are provided.
... As a result, they are useful for illustrating concepts to novices and providing a shared mental model of what a good debriefing looks like. However, they are not easily integrated into debriefer feedback, mentoring or coaching [11]. While these tools appear to be widely adopted in the training of debriefers, validation studies were limited to analysis of delayed reviews of recorded debriefings [8-10, 12]. ...
... Current faculty development programmes often use the tools listed above as an aid to achieve improved debriefings [11]. In many programmes, feedback to new debriefers follows direct observation (or video review) by more experienced colleagues. ...
... In the context of the current remote learning climate, a recent study recognised that current methods of faculty development lack a structured approach [30]. We agree that structure is clearly an important factor that faculty development programmes might lack [11]. The quantitative approaches described in our work may assist with providing this structure at the point of care by allocating our attention to observing debriefings in a focused manner. ...
Article
Full-text available
Background Debriefing is an essential skill for simulation educators and feedback for debriefers is recognised as important in progression to mastery. Existing assessment tools, such as the Debriefing Assessment for Simulation in Healthcare (DASH), may assist in rating performance but their utility is limited by subjectivity and complexity. Use of quantitative data measurements for feedback has been shown to improve performance of clinicians but has not been studied as a focus for debriefer feedback. Methods A multi-centre sample of interdisciplinary debriefings was observed. Total debriefing time, length of individual contributions and demographics were recorded. DASH scores from simulation participants, debriefers and supervising faculty were collected after each event. Conversational diagrams were drawn in real time by supervising faculty using an approach described by Dieckmann. For each debriefing, the data points listed above were compiled on a single page and then used as a focus for feedback to the debriefer. Results Twelve debriefings were included (µ = 6.5 simulation participants per event). Debriefers receiving feedback from supervising faculty were physicians or nurses with a range of experience (n = 7). In 9/12 cases the ratio of debriefer to simulation participant contribution length was ≥ 1:1. The diagrams for these debriefings typically resembled a fan shape. Debriefings (n = 3) with a ratio < 1:1 received higher DASH ratings compared with the ≥ 1:1 group (p = 0.038). These debriefings generated star-shaped diagrams. Debriefer self-rated DASH scores (µ = 5.08/7.0) were lower than simulation participant scores (µ = 6.50/7.0). The differences reached statistical significance for all 6 DASH elements. Debriefers evaluated the ‘usefulness’ of feedback and rated it ‘highly’ (µ = 4.6/5). Conclusion Basic quantitative data measures collected during debriefings may represent a useful focus for immediate debriefer feedback in a healthcare simulation setting.
... As the need for debriefing training grows, a key consideration also emerges: how to ensure debriefing training programs are tailored to fit the specific needs of program participants (Cheng et al., 2015; Peterson et al., 2017). Our findings provide additional empirical evidence of these specific needs. ...
... For both instructors and participants beyond the novice stage, understanding the common perceptions and challenges of novices is also important. Having this knowledge can strengthen peer-feedback and coaching efforts (Cheng et al., 2015, 2017) used in faculty development programs; participants with more experience can tailor their feedback to novices with these themes in mind. ...
... Our findings showed that novice debriefers in our study were generally unaware that debriefing quality benchmarks exist and instead used their own personal criteria to determine debriefing success. Beyond learning the relevant debriefing methods (Cheng et al., 2015), novice debriefers also need to gain knowledge of published standards of best practice (The INACSL Standards Committee, 2016) and be able to use this knowledge to benchmark their own debriefing successes. ...
Article
Full-text available
Introduction Effective debriefing of simulation-based experiences is critical for learning. Approximately 33% of health professions instructors are debriefing novices. However, the specific faculty development needs of novice debriefers have not been studied. This study examines how health professions instructors approach debriefing when they are new to debriefing simulation-based experiences. Methods This pilot qualitative study used a thematic analysis approach to explore novice debriefers’ experiences in conducting post-simulation debriefings. Eligible participants engaged in one-hour semi-structured interviews. Recruitment continued until data saturation was reached. We reviewed verbatim interview transcripts, hand-coded the data, and formed codes into themes. Results Nine novice debriefers participated. The overarching theme, “I’m on my own…and they’re on their own,” reflects debriefers’ view that they are on their own, without resources, and their belief that learners should identify their own errors. Three main themes emerged: “Deep divide between me and the learners” portrays a separation between debriefers and learners in terms of expectations, roles, and responsibilities. “Winging it” depicts debriefers making up their own debriefing approaches. “Debriefing quality: missing pieces of the puzzle” portrays novice debriefers as unaware of criteria for effective debriefing. Conclusions Novice debriefers in this study perceived that they were on their own, having little to no debriefing training and mentorship. Study participants expressed debriefing struggles in several areas, including discussing errors, facilitating learner participation, and assessing debriefing quality. Our findings shed light on simulation as a growing specialty for health professions educators; it is critical that resources be devoted to faculty development for debriefing skill acquisition. These findings can serve as a basis for future studies on debriefer skill acquisition.
... 12,13 The creation of validated methods of assessing simulation educators in various skills related to simulation is a necessary next step in the growth of this teaching methodology. 14-16 While some validated tools exist for evaluation of debriefing skills specifically, 17,18 methods of assessing other skills pertaining to simulation are very limited. Currently, there is an 'unmet need for a uniform mechanism to [. . ...
... This is surprising, given that debriefing facilitates reflective learning and is widely used and recognised as a critical part of simulation. 15 As such, we would have expected a stronger level of agreement. It is possible that the experts felt that outlining a specific plan for debriefing is not a priority when developing a written scenario as debriefing points can be generated by actual observed spontaneous actions and learner performance rather than being preplanned. ...
... Given the limited number of valid assessments available for these purposes, further development of such tools is a necessary step to advance the field as well as the quality of simulation training programmes. 13-16 The SSET aids in defining a common understanding between simulation educators and trainees regarding the expectations for the development of quality scenarios. Future studies are needed to further assess inter-rater reliability as well as the psychometric properties of this tool to measure competency. ...
Article
Introduction One critical aspect of successful simulation facilitation is the development of written scenarios. However, no validated assessment tools dedicated to the evaluation of written simulation scenarios are available. Our aim was to develop a tool with demonstrated content validity to evaluate the quality of written simulation scenarios. Methods A comprehensive literature search did not yield a validated assessment tool dedicated to the evaluation of written simulation scenarios. A subsequent search yielded six templates published for written simulation scenario design. From these templates, critical scenario elements were identified to create an evaluation instrument with six components of scenario quality and corresponding anchors and rating scale. Subsequently, a national group of simulation experts was engaged via survey methodology to rate the content of the proposed instrument. Ultimately, a modified two-round Delphi approach was implemented to demonstrate consensus on the final assessment tool. Results 38 responses were obtained in round 1, while 22 complete responses were obtained in round 2. Round 1 kappa values ranged from 0.44 to 1.0, indicating moderate to almost perfect rater agreement for inclusion of the six proposed components. Kappa values specifically regarding the scale and anchors ranged from 0 to 0.49. After revisions, there was a significant level of agreement (p<0.05) on all items of the proposed assessment tool in the second-round survey except for item 10. Of note, all initial respondents indicated that they had never evaluated written scenarios with an assessment tool. Conclusions The Simulation Scenario Evaluation Tool, developed using a national consensus of content experts, is an instrument demonstrating content validity that assesses the quality of written simulation scenarios. This tool provides a basis to guide structured feedback regarding the quality of written simulation scenarios.
... Faculty development has been identified as a crucial component of any system to assist staff members in preparing for challenges and ever-changing responsibilities (Cheng, Grant, et al., 2015). From the literature, faculty development can be defined as a 'range of activities that institutions use to renew or assist faculty in their roles and can include workshops, seminar series, courses, fellowships, and individualized feedback' (Cheng, Grant, et al., 2015; Leslie et al., 2013, p. 7). ...
... The literature surrounding faculty development for implementing simulation programmes has been very sparse (Cheng, Grant, et al., 2015; Shaddeau & Deering, 2016); however, from the limited literature on the topic, it is clear that successful implementation requires multiple factors and stakeholder involvement (Haji et al., 2014). ...
Thesis
Full-text available
The thesis reports on the role of medical simulation in developing undergraduate acute care clinical skills competencies in South Africa. The study aim was to explore the role, including the limits and possibilities, of medical simulation as a pedagogical method in an undergraduate acute care clinical skills curriculum within a South African tertiary education environment. The study consisted of three phases. During phase one, I conducted a modified Delphi study to identify the acute care clinical skills competencies undergraduate medical students need to acquire to prepare them, in the role of newly qualified clinicians, for managing acute care cases within a South African in-hospital environment. Phase two explored which acute care clinical competencies would lend themselves to a medical simulation modality within a South African tertiary education environment by conducting a qualitative focus group discussion with curriculum experts. Phase three explored the role, including the limits and possibilities, of medical simulation as an educational modality in developing an acute care clinical skills curriculum within a South African tertiary education environment through semi-structured interviews with simulation experts within South Africa. The findings of the phase one Delphi study contributed to developing a comprehensive list of undergraduate acute care clinical skills competencies required for the South African environment. The findings of the phase two focus group discussion challenged how medical simulation was being implemented in South Africa by proposing a framework within which medical simulation pedagogy can be used to develop competencies under the following themes: foundational competencies, contextually standardised competencies, assessments, and teamwork.
The third phase of semi-structured interviews with South African simulation experts sought to explore the role, limits, and possibilities of simulation within South Africa as a resource-constrained environment. A major finding was the value of conceptualising medical simulation as a pedagogy rather than a technology. The social and cognitive constructivist theories, and key related concepts informing this study, supported this new way of framing medical simulation. A further key finding in this phase was the importance of staff development in simulation pedagogy and less emphasis on the acquisition of expensive equipment in resource-constrained environments. Adopting this view has the potential to enable an earlier introduction and smoother transition of medical students into the clinical environment, which may leave them better prepared for clinical practice, with resultant reduced morbidity and improved patient safety.
... Dedicated debriefing workshops should provide essential knowledge, debriefing practice opportunities and real-time feedback on acquired debriefing skills to achieve competency. Despite the prevalence of these workshops and courses, little has been published about their effectiveness in enhancing debriefing quality and the transfer of debriefing skills within simulation programs [20]. Most of the available evidence consists of subjective, self-reported assessments of instructors' comfort in conducting debriefing following these courses [21]. ...
... This provided learners with experiential opportunities to apply learning from both didactic and video sessions, thereby accelerating debriefing skill acquisition. This approach has been shown to be effective in other educational contexts [20, 26, 27]. Furthermore, coupling full-length debriefing with formative feedback and repetitive role-play amongst facilitators has been proposed as a model for faculty debriefing training [20]. ...
Article
Introduction Effective debriefing during simulation-based training (SBT) is critical to promote learning outcomes. Despite debriefing's central role in learning and various published debriefing methods and techniques, little is known about faculty development structure for debriefing training among novice facilitators. Continuing medical education courses often use simulation-based methods but provide minimal training in debriefing techniques to novice facilitators. We describe the development, implementation, and evaluation of a structured debriefing training workshop for novice facilitators. Methods Designed and conducted by simulation debriefing experts, a debriefing workshop was provided to novice facilitators serving as faculty during the simulation-based Sedation Provider Course (PC) at the 2018 Society of Pediatric Sedation conference. Emphasizing evidence-based key elements of effective debriefing, the workshop was divided into three components: 1) a 30-minute introductory didactic, 2) 75 minutes of role modeling of simulated effective and ineffective debriefing, and 3) 120 minutes of repetitive deliberate practice sessions with summative and formative feedback. Effective transfer of learned debriefing skills was assessed during facilitators' PC debriefings using the Objective Structured Assessment of Debriefing (OSAD) tool, facilitators' self-efficacy, and PC student learners' evaluations of facilitator debriefings during the PC. Results Sixteen facilitators participated in the 4-h workshop and the next day served as PC faculty. The median OSAD score was 31 (13-40) for all facilitators. The OSAD components with the lowest and highest performance were "Establishing Learning Environment," with a median score of 1 (1-5), and "Engagement of Learners," with a median score of 4.75 (2.5-5). Facilitators' self-assessment of debriefing significantly improved on the 5-point Likert scale from pre- to post-workshop.
PC student learners' evaluations revealed high degrees of satisfaction with debriefing quality. Conclusions A proposed model integrating full-length debriefing and repetitive practice paired with summative and formative feedback provides a feasible and effective approach for debriefing training of novice facilitators for simulation-based educational courses.
... Immediately after the simulation activity, various aspects can be explored with learners during the debriefing or feedback session, an important phase that should not be dissociated from any SBE activity [13,20,21]. Although under-reported in the simulation literature, the learners' potential feeling of having been deceived is something they may indirectly report during the reaction phase of the debriefing. ...
... At the very latest, if the aspect of deception does not emerge during the debriefing reaction phase, it should be discussed during the analytical phase [13] in a very tactful manner to ensure psychological safety [62,65], justifying the approach used so that educators retain the learners' trust and learners maintain faith in the educational technique adopted. It is hence strongly advised that educators receive formal training in relation to SBE and debriefing [20,62]. ...
Article
Full-text available
The level of performance of every clinician and of the overall multiprofessional team relies on the skills and expertise they have individually and collectively acquired through education, training, self-directed learning, and reflection. Simulation-based education (SBE) is playing an increasingly important role in that respect, and it is sometimes said that it is an art to facilitate. Many explanations can justify this assertion. Although there is generally an emphasis on making everything as realistic or “high-fidelity” as possible, it is often futile and this is where the art of simulation comes into play with an element of modulation of realism linked to the intended learning objectives. The atmosphere created by the educators; how the learners are made to engage and interact; how physical, technical, and contextual elements are simulated or represented; and what type of technology is used need to be appropriately adapted to contribute to the immersiveness of any SBE activity. Although it inevitably carries a negative connotation, some form of “deception” is more commonly used than one may think for the benefit of learners during SBE. High levels of realism are sometimes achieved by making learners believe something works or reacts as would be expected in real life, whereas it is achieved in a totally different manner. Learners do not need to know, see, or understand these “tricks of the trade”, shortcuts, or artistic or technological aspects, and this can be considered a form of benevolent deception. Similarly, information may be withheld to recreate a realistic situation and push learners to demonstrate specific learning outcomes, but it needs to be practised with caution and be justifiable. 
These forms of “positive” deception are part of most SBE activities and are used to help learners bridge the reality gap so they can suspend disbelief more easily, exercise critical thinking, and treat the simulation more realistically without damaging the trust they place in their educators. This article will discuss how aspects of SBE activities are often manipulated, modified, or hidden from learners to facilitate the learning experience and present a simulation fidelity model encompassing the environmental, patient, semantical, and phenomenal dimensions.
... Because the context and objectives of simulation vary, the educational value and impact of debriefing depend on this intentionality, and specific methods may be useful depending on the proposed situation, which may indicate that one method is not superior to another but rather more effective for achieving a given goal (46). In exploring the category of the most common debriefing techniques for nursing simulation, the following techniques were identified: self-debriefing, eye-tracking debriefing, written debriefing, oral debriefing, and video-assisted debriefing. ...
... Oral debriefing combined with video-assisted debriefing stood out. Two studies, conducted with nursing students (47) and health professionals (48), found no statistically significant difference between the two techniques regarding participants' performance, suggesting that nursing educators can use either technique to achieve learning outcomes in simulation (46). ...
Article
Full-text available
Objective: To identify in the literature the methods and techniques of debriefing used in teaching and learning in nursing simulation. Methods: Integrative review of the PubMed/MEDLINE®, LILACS, Scopus and CINAHL® databases, with the descriptors "nursing" and "nursing education", their respective terms in English and Spanish, and the keyword "debriefing". Twelve primary studies, in Portuguese, English and Spanish, from January 2008 to December 2018 were included. Qualitative analysis was used to categorize the domains. Results: The meaningful learning methods identified were based on principles of transfer of learning; models of clinical reasoning, interprofessional debriefing, and debriefing with good judgment; and structured, self-debriefing, oral, eye-tracking, video-assisted and written debriefing techniques. Conclusions: The structured method and the oral technique with video stood out. One method was not superior to another but was effective for a particular purpose. No national studies were found. Research should be conducted on the effectiveness of nursing debriefing methods and techniques.
... However, it needs to be interpreted in light of the student's experience with debriefing. Self-assessment can significantly help improve debriefing skills if accompanied by feedback from an external person 22,23 or a faculty debriefer. 14 The perspectives of different experienced evaluators can better enhance the skills of the novice debriefer because of a higher capability to ensure the quality of a debriefing session. ...
... 14 The perspectives of different experienced evaluators can better enhance the skills of the novice debriefer because of a higher capability to ensure the quality of a debriefing session. 22,23 Feedback by senior trainers is considered to create a shared mental model of the debriefing methodology. Using all 3 perspectives could provide a clearer understanding of the instructor's debriefing competence and help improve specific abilities. ...
Article
Introduction: Virtual debriefing is a cardinal element in achieving the effectiveness of telesimulation. There are different instruments to assess face-to-face debriefing and determine its degree of effectiveness; the Debriefing Assessment for Simulation in Healthcare (DASH) instrument is one of them. This study aims to describe and compare the evaluations of raters, instructors, and students during a virtual debriefing using the DASH. Methods: A cross-sectional study was performed evaluating the virtual debriefing of 30 instructors after a heart failure telesimulation scenario. The evaluation was conducted by 30 instructors, 338 undergraduate students in the seventh semester, and 7 simulation raters. The 3 versions of the DASH instrument in Spanish were applied, respectively. Results: Two comparisons were made, student versus instructor and rater versus instructor. Cronbach α was 0.97 for each version. The averages of the results on the DASH instrument were 6.61 (3.34-7.0), 5.95 (4.65-7.0), and 4.84 (2.68-6.02) for the student, rater, and instructor versions, respectively. The effect size between student and debriefer perspectives was 0.42. In contrast, the effect size between instructor and rater was 0.72. All differences were significant. Conclusions: Ratings differ among the different groups who use the DASH. In this study, the difference between the instructor and rater perspectives was 1 point with a wide range, in contrast with the difference between instructor and student, which was relatively minor. It is necessary to consider the perspectives of subject-matter experts to achieve a high-quality virtual debriefing and to improve debriefing by using the DASH.
... Feedback, in itself, is not a teaching strategy; rather, it is a communication strategy that can be used intermittently, does not require learners to think deeply, and includes learning objectives. 9 Directive feedback is a result-centered, didactic report provided to the students after the simulation. It also involves active communication of corrective information to the students based on the evaluation of the action, event, or process. ...
... It is often the facilitator that closes the performance gaps, and thus, the method does not encourage self-reflection of learners. 9,29 After-Action Review: It is a modified plus/delta method that contains the reasons for the actions. First, the facts are discovered by two questions: "What was supposed to happen?" and "What really happened?" ...
Article
Full-text available
Simulation-based education provides the transfer of the theoretical and practical knowledge gained in the learning environment to the clinical environment. Learning in a clinical simulation takes place in debriefing. Accordingly, debriefing is the main component of clinical simulation. Debriefing is conducted under the management of a facilitator and is an activity performed after the simulation. Debriefing facilitates participants’ critical thinking through reflective thinking. The primary goal in performing a debriefing is to allow facilitators and participants to participate in and reflect on the simulated clinical experience. There are various models used to facilitate the information-sharing process in a debriefing. Several models have been proposed for discussions after simulation, which have been categorized as three-phase, multiphase, and other debriefing models and frameworks. In this review, the importance and methods of debriefing in simulation-based education are discussed. Keywords: Simulation, debriefing, methods, nursing education, student
... There is still a lack of knowledge in medical simulation about debriefing sessions, both in understanding the learning process they provide and in drawing generalizable conclusions from studies to identify the best evidence of the effectiveness of debriefing (Cheng et al., 2015; Levett-Jones & Lapkin, 2014). However, a structured debriefing seemed to offer a better opportunity for reflection and increased student activity (Decker, Fey, Sideras, Caballero, Rockstraw, et al., 2013; Neill & Wotton, 2011). ...
... However, development of appropriate frameworks and ways of structuring debriefing adapted to the participants' needs and the learning outcomes is an ongoing process. There is a need for studies that explore how to structure debriefing to maintain and enhance the quality of debriefing within simulation programs, including how the use of peer feedback can contribute to quality (Cheng et al., 2015), and how debriefing structures can help the facilitator to encourage student activity and students as self-regulated learners. ...
Article
Full-text available
Background. When simulation is used as a learning approach, a structured debriefing is important for students to achieve learning. The facilitator’s feedback style and facilitation abilities are crucial and have a great impact on the learning environment. Facilitators should promote student-active learning and provide helpful feedback that empowers students as self-regulated learners. The aim of the study was to explore the Steinwachs structure and the Critical Response Process structure when used in debriefing in medical simulation, and how each of them affected the facilitator’s role. Method. A multi-method, comparative quasi-experimental design was used. Results. Structuring debriefing in accordance with the Critical Response Process supported a facilitator role that coincided with factors highlighted in theory on facilitating student-active learning and developing self-regulated learners. Structuring debriefing in accordance with the Steinwachs structure revealed that debriefing seemed to be based more on the facilitator’s frames and dominance than on the students’ frames and involvement. Conclusion. The results of this study showed that the Critical Response Process (CRP) can be an appropriate structure to use in debriefing in medical simulation. It reduced the facilitator’s dominance and frames, consistent with what is emphasized in collaborative, active, and learner-centered learning.
... Importantly, these debriefings are not to be confused with stress debriefings as psychological intervention to prevent or treat traumatic experiences [61,62]. Addressing the gaps in our current understanding of what facilitates effective debriefing is important for developing and targeting debriefing faculty development efforts for clinical faculty [63][64][65]. It will also contribute to implementing and maintaining a culture of debriefing and open conversations in healthcare organizations. ...
... For the purpose of this study, a semi-structured interview guide with questions related to challenges and success criteria for debriefings in the clinical setting was designed. The data collection instrument was developed based on research in organizational behavior (with a particular focus on difficult conversations) [66][67][68][69][70], debriefings in healthcare [5,26,30,38,40,[63][64][65][71][72][73][74][75], and circular questioning [44,55]. Content areas were (1) experiences with debriefings in clinical and training settings; (2) characteristics of debriefings with respect to participants, place, duration, frequency, and organizational routines; (3) mental models with respect to effectiveness of debriefings and differences between debriefings and other kinds of conversations; (4) leadership in debriefings; (5) psychological safety; and (6) double-loop learning. ...
Article
Full-text available
Background The goal of this study was to identify taken-for-granted beliefs and assumptions about use, costs, and facilitation of post-event debriefing. These myths prevent the ubiquitous uptake of post-event debriefing in clinical units, and therefore the identification of process, teamwork, and latent safety threats that lead to medical error. By naming these false barriers and assumptions, the authors believe that clinical event debriefing can be implemented more broadly. Methods We interviewed an international sample of 37 clinicians, educators, scholars, researchers, and healthcare administrators from hospitals, universities, and healthcare organizations in Western Europe and the USA, who had a broad range of debriefing experience. We adopted a systemic-constructivist approach that aimed at exploring in-depth assumptions about debriefing beyond obvious constraints such as time and logistics and focused on interpersonal relationships within organizations. Using circular questions, we intended to uncover new and tacit knowledge about barriers and facilitators of regular clinical debriefings. All interviews were transcribed and analyzed following a comprehensive process of inductive open coding. Results In total, 1508.62 min of interviews (25 h, 9 min, and 2 s) were analyzed, and 1591 answers were categorized. Many implicit debriefing theories reflected current scientific evidence, particularly with respect to debriefing value and topics, the complexity and difficulty of facilitation, the importance of structuring the debriefing and engaging in reflective practice to advance debriefing skills. We also identified four debriefing myths which may prevent post-event debriefing from being implemented in clinical units. Conclusion The debriefing myths include (1) debriefing only when disaster strikes, (2) debriefing is a luxury, (3) senior clinicians should determine debriefing content, and (4) debriefers must be neutral and nonjudgmental. 
These myths offer valuable insights into why current debriefing practices are ad hoc and not embedded into daily unit practices. They may help ignite a renewed momentum into the implementation of post-event debriefing in clinical settings.
... 3 Train-the-trainer programs emphasize the importance of debriefing and feedback as a critical element of simulation quality. 4 Simulación Clínica 2021; 3 (1): 7-14 www.medigraphic.com/simulacionclinica ...
Article
Full-text available
Introduction: To date, the quality of simulation centers in Latin America is unknown, and no validated instrument exists for assessing it. The purpose of this study is to build a validated instrument for the self-perceived quality of simulation centers and programs in Latin American countries. Material and methods: A descriptive, cross-sectional, quantitative study with three sequential phases: development, reliability, and validity. A self-perceived quality instrument was developed in Spanish and culturally adapted to Portuguese, comprising six dimensions and 42 items. Results: 240 responses were obtained from 12 countries. Content validity (I-CVI: 1) and instrument consistency (Cronbach's alpha: 0.977) were established. Solid results were found in the exploratory and confirmatory factor analyses. Conclusions: This bilingual instrument has good psychometric properties across its six dimensions and could be considered in future studies to characterize the self-perceived quality of clinical simulation centers and programs in Spanish- and Portuguese-speaking Latin American countries.
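Internal consistency figures such as the Cronbach's alpha of 0.977 reported above can be reproduced directly from an item-response matrix. A minimal sketch in Python, assuming scores are arranged with one row per respondent and one column per item (the function and variable names are illustrative, not from the study):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)
```

An alpha near 1 indicates that the items vary together, i.e., they appear to measure a single underlying construct, which is the sense in which the instrument's 42 items are "consistent."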
... This can be obtained by participating in simulation-based training and taking an active role as a debriefing facilitator. If available, soliciting peer feedback on one's debriefing performance can be very helpful to improve debriefing quality [36]. ...
Chapter
Feedback and debriefing are the most important components of healthcare simulation. Feedback is the one-way conveyance of information to a learner on the gap between their performance and a standard, with suggestions on how the gap might be bridged. Debriefing is a bi-directional, interactive, and reflective discussion. In this chapter, we explore debriefing for extracorporeal membranous oxygenation (ECMO) simulations. We begin by examining debriefing timing and facilitation methods including the post-event debriefing and within-event debriefing. We then review two different conversational structures used in debriefing: (1) reaction, analysis, summary; and (2) gather, analyze, summarize. Next, we explore debriefing process elements including essential elements of debriefing, conversational techniques, educational strategies, and debriefing adjuncts. We then provide an analysis of special considerations for debriefing ECMO simulations. Finally, we conclude with a brief discussion of debriefing training and assessment. The goal of this chapter is to provide a general overview of debriefing techniques and strategies that simulation educators can use when debriefing ECMO simulations.
... 3 Post-simulation debriefing plays a vital role in maximising simulation-based learning in a systematic manner. 4 It is an intentional discussion that provides feedback to students and allows them to gain a clear understanding of their actions via a reflective learning process. 5 The reflective learning process comprises three stages: awareness, critical analysis, and new perspectives. ...
Article
Introduction Debriefing plays a vital role in effective simulation-based learning by providing feedback to students to foster the development of their critical thinking. Objectives We evaluated the effects of rubric-based debriefing on students’ critical thinking, level of confidence in performing tasks, and communication. Method This is a quasi-experimental study. Final-year nursing undergraduates (n=204) of a local university participated in the study. In the intervention group, students performed two 20 min simulation sessions individually with simulated patients, with a 15 min individual rubric-based debriefing between the two sessions and a 5 min debriefing at the end. In the control group, students performed the same simulation sessions followed by a single 20 min individual rubric-based debriefing at the end. The primary outcome was the comparison of students’ critical thinking between the two groups, assessed with The Northeastern Illinois University Critical Thinking Rubric. Qualitative data were collected with an open-ended question. Results Based on generalised estimating equation models, the intervention effect over time was statistically significant (β=2.06, 95% CI 1.04 to 3.08) in enhancing students’ critical thinking. No statistically significant differences were found in self-perceived confidence levels between the intervention and control groups. Qualitative data reflected positive feedback from students on the simulation activities. Conclusions This is the first study to provide evidence that rubric-based debriefing enhances students’ critical thinking in simulation learning.
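The group-by-time effect reported above (β=2.06) comes from a generalised estimating equation model; at its core, that interaction term contrasts the two groups' gains over time. A minimal difference-in-differences sketch in Python illustrates the contrast on group means (the scores and names below are hypothetical, not the study's data or analysis):

```python
import numpy as np

def diff_in_diff(int_pre, int_post, ctl_pre, ctl_post):
    """Gain of the intervention group minus gain of the control group."""
    gain_intervention = np.mean(int_post) - np.mean(int_pre)
    gain_control = np.mean(ctl_post) - np.mean(ctl_pre)
    return gain_intervention - gain_control

# Hypothetical rubric scores: the intervention group gains 4 points,
# the control group gains 2, so the group-by-time effect is 2.0.
effect = diff_in_diff([10, 12], [14, 16], [10, 12], [12, 14])
```

Unlike this plain contrast of means, a GEE adds a working correlation structure to account for repeated measurements on the same student, which is why the study reports a model-based coefficient with a confidence interval rather than raw mean differences.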
... 21,23,29 Unfortunately, there is little guidance in the literature on how simulation educators should be trained to promote the acquisition and retention of debriefing skills. 5 Debriefing assessment tools such as the OSAD 50 and Debriefing Assessment for Simulation in Healthcare 4 have shown promise in supporting feedback, whereas other programs describe benefits related to peer coaching and video review. 53,54 Given the lack of mentors and limited resources within the MUST simulation program, we did not feel that peer coaching was a viable option so early in the genesis of the program. ...
Introduction: Despite the importance of debriefing, little is known about the effectiveness of training programs designed to teach debriefing skills. In this study, we evaluated the effectiveness of a faculty development program for new simulation educators at Mbarara University of Science and Technology in Uganda, Africa. Methods: Healthcare professionals were recruited to attend a 2-day simulation educator faculty development course (Sim for Life: Foundations), covering principles of scenario design, scenario execution, prebriefing, and debriefing. Debriefing strategies were contextualized to local culture and focused on debriefing structure, conversational strategies, and learner centeredness. A debriefing worksheet was used to support debriefing practice. Trained simulation educators taught simulation sessions for 12 months. Debriefings were videotaped before and after initial training and before and after 1-day refresher training at 12 months. The quality of debriefing was measured at each time point using the Objective Structured Assessment of Debriefing (OSAD) tool by trained, calibrated, and blinded raters. Results: A total of 13 participants were recruited to the study. The mean (95% confidence interval) OSAD scores pretraining, posttraining, and at 12 months before and after refresher were 18.2 (14.3-22.1), 26.7 (22.8-30.6), 25.5 (21.2-29.9), and 27.0 (22.4-31.6), respectively. There was a significant improvement from pretraining to posttraining (P < 0.001), with no significant decay from posttraining to 12 months (P = 0.54). There was no significant difference in OSAD scores pre- versus post-refresher training at 12 months (P = 0.49). Conclusions: The Sim for Life Foundations program significantly improves debriefing skills with retention of debriefing skills at 12 months.
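The pre/post OSAD comparisons above rest on repeated measurements of the same educators, for which a paired test is the natural analysis. A paired t-statistic, sketched below in Python with illustrative scores (not the study's data), is the simplest version of such a comparison:

```python
import numpy as np

def paired_t(pre, post):
    """Paired t-statistic and degrees of freedom for pre/post scores
    measured on the same individuals."""
    d = np.asarray(post, dtype=float) - np.asarray(pre, dtype=float)
    n = d.size
    t = d.mean() / (d.std(ddof=1) / np.sqrt(n))
    return t, n - 1
```

The resulting statistic is compared against a t distribution with n-1 degrees of freedom to obtain a p-value such as the P < 0.001 reported for the pre- to post-training improvement.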
... The principles of simulation faculty development are currently guided by lessons learned from health care education in general. 39 However, literature on simulation facilitator development for SBE in resourcelimited settings is lacking. 5 This study explores VAST facilitators' opinions on the experiences and training requirements required for delivery of VAST courses or SBE in a cross-cultural, resource-limited environment. ...
Article
Background: Lack of access to safe and affordable anesthesia and surgical care is a major contributor to avoidable death and disability across the globe. Effective education initiatives are a viable mechanism to address critical skill and process gaps in perioperative teams. Vital Anaesthesia Simulation Training (VAST) aims to overcome barriers limiting widespread application of simulation-based education (SBE) in resource-limited environments, providing immersive, low-cost, multidisciplinary SBE and simulation facilitator training. There is a dearth of knowledge regarding the factors supporting effective simulation facilitation in resource-limited environments. Frameworks evaluating simulation facilitation in high-income countries (HICs) are unlikely to fully assess the range of skills required by simulation facilitators working in resource-limited environments. This study explores the qualities of effective VAST facilitators; knowledge gained will inform the design of a framework for assessing simulation facilitators working in resource-limited contexts and promote more effective simulation faculty development. Methods: This qualitative study used in-depth interviews to explore VAST facilitators' perspectives on attributes and practices of effective simulation in resource-limited settings. Twenty VAST facilitators were purposively sampled and consented to be interviewed. They represented 6 low- and middle-income countries (LMICs) and 3 HICs. Interviews were conducted using a semistructured interview guide. Data analysis involved open coding to inductively identify themes using labels taken from the words of study participants and those from the relevant literature. Results: Emergent themes centered on 4 categories: Persona, Principles, Performance and Progression. 
Effective VAST facilitators embody a set of traits, style, and personal attributes (Persona) and adhere to certain Principles to optimize the simulation environment, maximize learning, and enable effective VAST Course delivery. Performance describes specific practices that well-trained facilitators demonstrate while delivering VAST courses. Finally, to advance toward competency, facilitators must seek opportunities for skill Progression. Interwoven across categories was the finding that effective VAST facilitators must be cognizant of how context, culture, and language may impact delivery of SBE. The complexity of VAST Course delivery requires that facilitators have a sensitive approach and be flexible, adaptable, and open-minded. To progress toward competency, facilitators must be open to self-reflection, be mentored, and have opportunities for practice. Conclusions: The results from this study will help to develop a simulation facilitator evaluation tool that incorporates cultural sensitivity, flexibility, and a participant-focused educational model, with broad relevance across varied resource-limited environments.
... Most debriefing after an in situ session takes the form of a simplified discussion (55%) or a simplified summary with feedback combined with a review of the recorded simulation session (31%) [13]. It is also important to let a large proportion of the participants self-assess, which allows more topics to be covered in a limited period [16]. The minimum time spent on debriefing should equal the length of the scenario itself [17]. ...
Article
Full-text available
Medical simulation is the most dynamically developing field of medical education for preparing medical personnel to work with patients. Its main advantage is that learners can make mistakes, draw conclusions, and learn without fear for patient safety. Medical simulation is a tool that uses simulators, from simple trainers, through advanced mannequins (so-called patient simulators) and standardized patients (actors), to virtual training centers. A simulator is a device that allows clinical cases to be reproduced under safe conditions. Learning from experience requires the use of clinical scenarios and appropriate preparation of an environment in which the staff will be obliged to practice and validate procedures.
... Several studies have shown that debriefing preparation workshops are a feasible and effective method of faculty training. Debriefing experience can be gained by completing simulation instructor training courses provided by various simulation programs, attending seminars and conferences, and pursuing fellowship training or advanced degrees in simulation [4]. ...
Article
Despite having a good understanding of medicine, doctors lack clinical skills, problem-solving abilities, and the ability to apply knowledge to patient care, particularly in unanticipated circumstances. To overcome this, medical education has evolved into a system-oriented core curriculum with cognitive, psychomotor, and affective learning goals. With an emphasis on problem-based learning, the educator’s aim is to establish a long-term, predetermined improvement in the learner's behavior, acquired skills, and attitudes (Datta R, Upadhyay KK, Jaideep CN. Simulation and its role in medical education. Med J Armed Forces India. 2012;68(2):167–172. https://doi.org/10.1016/S0377-1237(12)60040-9). However, teaching these disciplines to real patients is almost impossible; this is where simulation comes in. This opinion paper will discuss the relevance and necessity of a simulation-based undergraduate curriculum in obstetrics and gynecology. What are the biggest obstacles that medical schools face in making the most of simulation-based learning, and how can they be overcome?
... [4][5][6] Despite these widely supported beliefs, training of simulation educators in debriefing has traditionally occurred in massed learning courses, as part of a comprehensive fellowship or advanced degree program, or informally without identifiable structure. [7][8][9][10][11] Recent literature has proposed multiple alternative mechanisms for more accessible and longitudinal faculty development and include the following: peer coaching, tiered development, and mentorship. [12][13][14] A consistent theme from these faculty development models is that improvement is dependent on purposeful experiences with expert feedback. ...
Introduction: The quality of healthcare simulation learning relies heavily on effective debriefers. Traditional methods of faculty development in debriefing lack a structured approach to achieve expertise via graduated and reflective practice. Methods: The Simulation Learning, Education and Research Network (SimLEARN) developed DebriefLive, a virtual teaching environment bringing together faculty and participant debriefers from across the Veterans Health Administration. Recorded simulation sessions were viewed, after which participant debriefers had the opportunity to debrief virtual learners. Participant debriefers were then provided structured and objective debriefings of their debriefings, with the added opportunity for immediate practice. Program evaluation data for the pilot sessions were collected via an electronic survey including a mix of Likert-scale questions and short-answer responses. Results: On a 7-point Likert scale, participant debriefers (n = 15) rated the content as effective (mean = 6.67, SD = 0.47) and appropriate to their level (mean = 6.47, SD = 0.47). The technology of video-based scenarios (mean = 6.6, SD = 0.61), followed by avatar-based debriefing sessions (mean = 6.6, SD = 0.8), was felt to be accurate and appropriate. All participants agreed or strongly agreed that they would recommend this training program to colleagues. Conclusions: Simulation instructors and fellows across the spectrum of the Veterans Health Administration found the innovative computer-based faculty development program DebriefLive acceptable as well as effective in increasing self-efficacy in debriefing. DebriefLive is an innovative and potentially disruptive tool, combining best practices in simulation theory and virtual technologies, for the training and assessment of debriefers.
... Initial reports highlight the role of self-debriefing or within-team debriefings (including learner self-assessment) focusing on nontechnical skills such as teamwork and communication, [173][174][175] although the generalizability to specific resuscitation and associated BLS and advanced life support remains unclear. 161,176 Additional Considerations Although cognitive load theory informs all aspects of instructional design in simulation, it also has specific relevance for feedback and debriefing. 177 When exploring performance gaps and diagnosing learning needs, 136 educators should consider the cognitive load of learners; during debriefings, they should attend to emotional reactions and provide their feedback in manageable chunks so that learners can process and act on it. ...
... In this regard, practice standards for simulation-based training state that debriefing facilitators should be trained and competent [2,11,12]. The limited scientific evidence that exists refers to skills for structuring the debriefing and promoting learners' active participation [13][14][15][16][17][18]. Yet facilitators' lack of skill is frequently cited as a barrier to integrating simulation into training programs [19]. ...
Article
Context: Debriefing in clinical simulation requires facilitation skills on the part of educators. Although validated assessment tools exist and could be used in their training, they are available only in English and are not adapted to the Quebec context. Aim: (1) To translate a rubric for assessing the quality of debriefing facilitation (Objective Structured Assessment of Debriefing) into French and adapt it to the Quebec context; and (2) to test its validity and reliability in a nursing education context. Methods: The rubric was translated and adapted by three independent translators. Simulation experts (n = 11) validated its content using a modified Delphi method. The translated rubric was piloted with simulation facilitators (n = 10) who rated the quality of video-recorded debriefings (n = 16). Results: According to the experts, the rubric shows strong content validity, and it shows strong concurrent validity when compared with an overall debriefing assessment score. The inter-rater and test-retest reliability results show variability in the scores assigned by debriefing facilitators. Conclusion: The rubric provides a common language and shared criteria for assessing the quality of debriefing facilitation. It could serve as a basis for training educators by specifying performance criteria and descriptors reflecting different levels of performance. The psychometric properties observed suggest, however, that more comprehensive training and changes to the descriptors would be needed before the rubric is used in a research context.
... In addition, the results of the current study revealed that the study group tended to obtain a higher total score on the Debriefing Assessment for Simulation in Healthcare (DASH) (Figure 2). This finding is consistent with the guide of Cheng et al (2015), who reported that DASH scores showed evidence of good reliability and preliminary evidence of validity [29]. The same finding is also congruent with the book of Chinara (2019), which indicated that the DASH is used to evaluate debriefing by learners, peers, and the educators themselves. ...
Article
Introduction: Nurse educators are constantly challenged to develop new teaching strategies to educate and train students to care for a diverse patient population in a rapidly changing health care environment [7]. Simulation is an important teaching and learning strategy used in undergraduate and postgraduate nursing programs. It consists of three phases: briefing, where the facilitator explains how the simulation session will be conducted, discusses the intended objectives, and assigns student roles; the scenario, where students have an experiential or observational experience with real cases; and debriefing, defined as a two-way communication process between the trainee and the trainer [10]. In debriefing, a retrospective assessment and discussion of the students' performance takes place. Little is known regarding the effectiveness of the debriefing strategy, and its potential for nursing education has not been fully recognized. Aim of the study: to determine the effect of a debriefing strategy on the retained knowledge and performance of nursing students. Research design: a non-randomized controlled clinical trial. Setting: the obstetric and gynecologic skill lab at the Faculty of Nursing, Damanhur University. Subjects: 160 nursing students, representing all those registered in the Obstetric and Gynecologic Nursing course in the 2nd semester of the 3rd academic year 2018-2019. Tools: five tools were used to collect the necessary data. Results: knowledge and clinical performance regarding breast and abdominal examination during pregnancy showed highly significant differences between the study and control groups in the post-test (P=0.000), with the study group obtaining a better total score than the control group. Additionally, the study group was more likely to achieve a high total satisfaction score for debriefing and a high total DASH score.
Conclusion: the debriefing strategy is effective in enhancing students' clinical knowledge and performance regarding breast and abdominal examination during pregnancy.
... Indeed, the emotions felt during the situation, particularly negative ones, can generally signal a sense of failure or a performance deficit. [Figure: three-phase debriefing structure. Description: expression of the participant's impressions and feelings. Analysis: identification of each participant's strengths and weaknesses. Synthesis: summary of the learning drawn from the situation.] Moreover, participants who do not verbalize their emotions will probably be more withdrawn during the debriefing (Cheng, Grant, Dieckmann, Arora, Robinson & Eppich, 2015). This review of reactions is an intermediate step before the in-depth analysis that will make it possible to understand what happened. ...
Thesis
The links between stress and health have been the subject of a great deal of research, particularly among students in health professions programs. University expectations can generate stress and have harmful consequences for students' physical and mental health (e.g., sleep problems, depression, eating disorders, risk behaviors, poor lifestyle habits, etc.). Recognized as particularly demanding and stressful, these programs have the particularity of confronting students with clinical sources of stress in addition to the academic sources of stress that any student may encounter. Yet little course content enables students to learn to cope with these difficulties. It therefore seems essential to prepare students to adapt to the complex work environment in which they will practice by providing them with tools and knowledge, particularly on the psychological and emotional levels. For this, the concept of emotional intelligence seems relevant. The acquisition of intra- and interpersonal skills such as managing, identifying, and expressing emotions is essential and a vector of well-being at work. It therefore seems clear to us that the acquisition of these skills has a place within health professions programs. Moreover, so-called active pedagogies, which engage students to reflect and adapt by facing real situations, appear more effective and could be relevant, particularly for acquiring these emotional skills. In this context, activities such as simulation or physical activity appear to be relevant tools for engaging students' emotions. Indeed, several studies highlight the psychological and emotional benefits of these activities.
However, research in this area remains underdeveloped and the intervention modalities quite varied (i.e., short programs based on lectures or interviews, or long programs combining theory and practice). This doctoral thesis therefore aims to demonstrate the influence of intervention programs on the emotional intelligence level of students enrolled in health professions programs. The research consists of comparing different intervention programs in order to determine the most effective modalities so that they can be included in training.
... 16 Expert simulation educators often do not ascribe to any specific debriefing method but rather use the method most relevant to the learning objectives, scenario, location, and overall situation. 17 With only 40% of survey respondents indicating that faculty receive professional development in simulation debriefing, it is not surprising that 60% do not use any structured debriefing method. Without training or mentorship, faculty may be unaware of structured debriefing methods or their proper use, or may lack confidence in their debriefing facilitation skills. ...
Article
Full-text available
Background: Teaching and learning using simulation-based methods is increasing in health professions education; however, the prevalence of simulation use in respiratory care programs has not been explored to date. Methods: All 412 Commission on Accreditation for Respiratory Care (CoARC)-accredited entry-into-practice respiratory care programs were e-mailed a survey inquiring about simulation use as an educational tool in their programs. Results: Of the initial 412 programs contacted, 124 returned the survey, for a 30% response rate. More than three-quarters of programs reported using simulation, including 87% of associate degree programs, 75% of bachelor's degree programs, and 100% of master's degree programs. Simulation modalities differed by course and program, as did the length of simulation activities and debriefings. Simulation hours may not be substituted for learners' clinical time under CoARC guidelines, and 69% of respondents agreed with this stance; however, 66% of responding programs have mandatory simulation learning activities, and 68% believe the amount of simulation should be increased. The survey also revealed respiratory care faculty have limited training in the use of simulation. Conclusions: Simulation-based teaching and learning is widespread and varied, but there is a lack of faculty development in its use among respiratory care programs.
Article
Simulation-based learning (SBL) is becoming an accepted part of health education. Providing high-quality simulation-based education depends more on skilled facilitators than on elaborate simulator equipment. In the last six years, a cross-professional facilitator course has been developed to train interprofessional staff and faculty from health educational institutions in Bergen. The course starts with two days of traditional simulation theory and practice followed by a third day five weeks later. During the third day, participants present their own experiences from practising as facilitators in their own workplaces. In this paper, we present the course content and the participants’ evaluation of the course based on the qualitative content analysis of their answers to open-ended questions. The main findings were that the course format and primary focus on practice were appreciated and that the follow-up day was especially useful to broaden the learning experience.
Chapter
Faculty development in the health professions historically involves knowledge and skill improvement in at least 5 domains: (a) professional development, (b) health professions education, (c) clinical and educational research, (d) professional communications, and (e) ethics—teaching and patient care. This chapter focuses on the singular domain of health professions education and uses simulation-based mastery learning as a platform to explore key concepts. Expert educator-coaches are integral to successful mastery learning (ML) curricula and require highly specific skills. Inadequate educator preparation jeopardizes the demonstrated benefits of mastery learning. This chapter presents a discussion about approaches tailored to prepare simulation educators to use a ML framework. We focus on three main areas: (a) healthcare simulation educator development in general; (b) specific learning needs for ML simulation educators including the ML knowledge, skills, and attitudes (KSAs) educators require; and (c) strategies that support educator development for ML.
Chapter
The demand to implement interprofessional education (IPE) simulation programs has increased over the past decade because IPE assists with breaking down silos, improving communication among professions, fostering understanding related to scope of practice, and increasing teamwork skills. However, implementing IPE simulation programs can be challenging due to barriers such as time, space, content, resources, the number of learners, costs, assessment, stereotypes, and evaluation. Based upon current literature and the authors' own experiences of creating an IPE simulation program, this chapter presents how these barriers can be addressed.
Article
Full-text available
Purpose Individuals with intellectual disabilities (ID) experience comorbid physical and mental health needs and have poorer outcomes, resulting in early mortality. Many ID-focused training provisions currently exist; however, limited research supports their effectiveness. High-fidelity simulation is an innovative training mechanism with promising preliminary results. This study aims to evaluate the longitudinal impact of simulation training on clinical practice in ID. Design/methodology/approach A mixed-method approach was used in this study. A one-day simulation course using actors who had ID was delivered to 39 health-care professionals from across London hospitals. Nine semi-structured interviews were conducted 12–18 months post training. Findings High-fidelity simulation training is an effective training modality with a sustainable impact on participants, their clinical practice, and patients. Core features of the training, including debriefing, the use and type of actors, scenario design, and the facilitators, are crucial learning mechanisms that impact learning outcomes and change behaviour in clinical practice and settings. Originality/value To the best of the authors' knowledge, this study is the first to longitudinally evaluate high-fidelity simulation training designed to improve the physical and mental health needs of those with ID. The research begins to bridge an important gap in the current literature, with a need for more research.
Article
Even with insufficient evidence in the literature regarding the impact of faculty development on simulation education, faculty development is critical to building a successful simulation program. Despite simulation being ubiquitous in nursing education across the United States (US), there is little investigation of the impact of education and training on job satisfaction, skill development, and overall confidence when using simulation to teach nursing students. The purpose of this qualitative study was to investigate how simulation education and training has influenced faculty career fulfillment, acquisition of knowledge, overall self-confidence, and competence. Themes that emerged included the following: competency, professional role change, role acquisition, course learner needs, course impact, skill development, and overall ability and confidence. The study results confirmed that a gap exists: faculty without formal education may not be aware of their lack of competence or of what training they need to be effective in facilitating clinical simulation.
Article
To better support the development of domain expertise, there has been considerable interest in the instructional strategies that afford opportunities for problem solving. Simulation-based learning includes case-based activities that support learners' transition into practice through experience and reflective debriefing. Simulation-based learning serves as a means for learners to practice psychomotor, affective, and cognitive skills. Although the literature suggests this is an effective strategy for supporting problem-solving skills, much of the literature is focused on measuring cognitive outcomes within a single case. As such, very few studies have simultaneously investigated (a) how debriefing in simulation-based education supports problem solving and (b) the degree to which cognitive and affective learning outcomes are variable throughout the problem-solving process. To address this gap, this repeated-measures study examined the role of debriefing strategies in simulation-based learning experiences, which was designed to practice both cognitive and affective skills. In the control condition, participants were given 2 simulation-based laboratory instructional sessions and then assessed on cognitive (problem representation and solution generation) and affective learning (perceived self-efficacy) outcomes. In the comparison condition, participants engaged in debriefing sessions after each of the 2 simulation-based laboratory instructional sessions. For the cognitive outcomes, participants were assessed using the Think-Aloud Standardized Patient Examination instrument, which measured problem representation and solution generation. Acute care confidence survey was used to measure perceived self-efficacy. The study found statistically significant gaps in early-stage measures of cognitive problem representation, but no difference was found in solution generation or perceived self-efficacy measures. 
This study reinforces the importance of debriefing in simulation-based education. Specifically, this study suggests that cognitive learning outcomes are variable during the problem-solving process, but affective learning outcomes are more fixed and may need more targeted debriefing.
Article
Aim For successful simulation‐based learning (SBL), a structured, interactive, and bidirectional debriefing should be a prerequisite. The purpose of this study is to identify the effects of instructor‐led hot debriefing (debriefing immediately after simulation) and cold debriefing (debriefing occurring after a certain period following simulation) in simulation with case‐based learning (CBL). Method This study used a nonequivalent control group pretest‐posttest design. A sample of 59 fourth‐year nursing students in South Korea were invited and randomly divided into two groups: post‐simulation hot debriefing (PSHD, male = 4, female = 26) and post‐simulation cold debriefing (PSCD, male = 3, female = 26). We used clinical performance competency, satisfaction with CBL and SBL, and debriefing tools. The study period was from October to December of 2019. We analyzed the data with SPSS 23.0 software, using descriptive statistics and the t test. Results Mean clinical performance competency scores of both groups were significantly improved in the posttest (PSHD = 33.13 ± 5.11, PSCD = 34.10 ± 4.15) compared to the pretest (t = −7.010, p < .001). The knowledge (t = −12.689, p < .001) and skill (t = −5.338, p = .001) scores of clinical performance competency in the PSCD group were higher than those in the PSHD group. The mean satisfaction scores of the PSHD group with CBL (4.53 ± 0.60) and debriefing (4.66 ± 0.55) were higher than those of the PSCD group. Conclusion As a result of this study, both PSHD and PSCD led by an instructor improved student clinical performance competency. The PSHD method, in particular, might have a positive influence on learner satisfaction with CBL, SBL, and debriefing.
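The pre/post comparisons above rest on paired t statistics. As a minimal sketch of the underlying arithmetic (the scores below are hypothetical, not the study's data):

```python
import math

def paired_t(pre, post):
    """Paired t statistic for pre/post scores of the same learners.

    t = mean(d) / (sd(d) / sqrt(n)), where d are the paired differences.
    """
    assert len(pre) == len(post) and len(pre) > 1
    d = [b - a for a, b in zip(pre, post)]        # post minus pre
    n = len(d)
    mean_d = sum(d) / n
    var_d = sum((x - mean_d) ** 2 for x in d) / (n - 1)  # sample variance
    return mean_d / (math.sqrt(var_d) / math.sqrt(n))

# Hypothetical pre/post competency scores for five learners
pre = [28, 30, 27, 31, 29]
post = [33, 34, 32, 35, 33]
t = paired_t(pre, post)
```

With differences taken as post minus pre, improvement yields a positive t; the negative values reported in the abstract simply reflect the opposite ordering of the subtraction.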
Chapter
You could have the largest, most modern simulation center in the land, but size and technology do not matter if what is going on inside your walls is not educationally sound. Quality curriculum is less about the high technology simulators, fancy manikins, and the state-of-the-art learning environment than it is about the instructors and personnel coordinating, developing, and delivering content. A reputation for quality programming taking place within your center will attract national courses, industry sponsorship, and further investment while keeping your stakeholders satisfied. Attention to faculty and staff development is the key to ensuring that what is happening within your center is educationally sound. At academic centers, research mentorship is also required to promote scholarship within education.
Article
Context: Debriefing in healthcare is often considered a learner-centred reflective process necessary for knowledge construction, in which the trainer acts as a facilitator. However, several studies emphasize the importance of explicit debriefing for novice learners. This article examines the effectiveness of different types of post-simulation debriefing for experienced learners working in emergency teams. Objective: This study aims to verify whether explicit debriefing is as effective for experienced professionals as it has been shown to be for novice learners. Methods: This single-centre prospective randomized study was conducted within a simulation program based on emergency interventions during cardiac arrest. Each participant's declarative knowledge, self-efficacy, and team effectiveness were measured before and after the simulation training. Results: The results show a beneficial effect on knowledge and self-efficacy for both types of debriefing. Patient care was also optimized with both types of debriefing. Conclusion: Both types of post-simulation debriefing are effective for experienced emergency teams.
Book
Full-text available
Review of ten years of activities at the Medical Simulation Centre Ljubljana
Article
This article shares one university’s experience in deploying the Community Action Poverty Simulation (CAPS) for interprofessional learning. Little guidance exists for implementing large-scale interprofessional simulations in higher education, especially with students from various healthcare professions. We will present our efforts implementing CAPS with students from a variety of healthcare majors. First, we provide background about our institution and CAPS in an effort to describe our implementation initiative including learners and facilitators. Second, we cover the adaptation of CAPS to include best practices in simulation. Specifically, we review prebriefing and debriefing, as well as logistics in implementing the poverty simulation as an interprofessional activity at a large university. Finally, we discuss learning and institutional outcomes related to CAPS.
Article
Background Debriefing after simulation facilitates reflection and the ability to transfer learning into clinical practice. Aim To evaluate faculty debriefing to inform and guide the development of ongoing resources to support adjunct faculty members and to maximise learning potential for students. Method Faculty debriefing styles were evaluated following a high-fidelity simulation involving 720 first year undergraduate nursing students. One audio recording of a debrief was randomly selected from each of 12 faculty members. If a pre-brief was completed prior to the simulation, this was also selected, as it is a component of the debrief and sets the tone prior to the simulation. Initial analysis was undertaken using the validated Observational Structured Assessment of Debriefing (OSAD) tool to briefly highlight different debriefing styles. This facilitated thematic analysis of the data using Braun and Clarke's (2006) six-step process. Results Nursing faculty used open-ended questions, silence, and clinical feedback with varying levels of expertise and success. Conclusions Further research into the variation in debriefing styles, and its related impact on student learning outcomes, is needed.
Article
The value of simulation-based education can be lost without a structured and purposeful guided debrief where nursing students and health care professionals are able to think critically and reflect on the experiential learning. Debriefing enhances peer-to-peer learning and aids the nursing student in formulating best practice for the next time when this encounter may occur in the clinical setting. Debriefing should be led by a trained facilitator using evidence-based methods to ensure a safe learning environment for nursing students. Debriefing is an essential learning tool that should be considered for application in the classroom, clinical, and laboratory settings.
Article
Full-text available
Specialists’ level of professional training is especially significant under modern economic conditions. This problem is also relevant to organizations in the real sector of the economy. The present article examines problems of training personnel for various economic sectors related to the competency approach adopted in the education system. Raising the level of training of managerial personnel, while accounting for industry characteristics and specialization profiles, is noted as a priority direction of work. The study is based on Russian practice in developing the country’s production potential and on the scientific, technical, and economic development practices of foreign countries, particularly Japan and China. Adequate quantitative and qualitative staffing is an indispensable condition for the effective functioning and development of enterprises and organizations across all sectors and spheres of the national economy. In this regard, the goal of the present study is to identify, analyze, and summarize the positive and negative aspects of the competency approach in educational practice. Attainment of this goal rests on methodological approaches to assessing the existing educational system and on comparative analysis of the general economic consequences of such changes from the employer’s perspective. The scientific result of the study is the authors’ suggested ways of resolving the identified problems, from the perspective of both employers and the professional activity of higher and secondary specialized educational institutions. In particular, it is suggested that large organizations create corporate training centers whose ideology would be based on improving employees’ professional level in accordance with established qualification requirements.
Article
Increasingly, clinicians and clinical educators are using simulation-based education (SBE) as a valued educational modality within healthcare organizations. Typically, these facilitators have minimal time to dedicate to uninterrupted formal education and development. To enhance effective use of SBE, facilitators need learner-centered opportunities that promote the development and sustainment of expert SBE skills, knowledge, attitudes, and behaviors. Implementing a mentorship approach for facilitator development is a feasible option. This article proposes that the Cognitive Apprenticeship model (Collins et al., 1991) can be adapted and utilized for SBE facilitator mentorship.
Chapter
The chapter discusses the issue of medical management of incidents caused by fire and toxic gas, hazardous material, chemicals, and irradiation. It presents the classification of injuries and the requirements for the medical care during decontamination of injured persons. It also presents decontamination patterns and maps and the necessary resources and equipment.
Chapter
Disaster management is teamwork under rapidly changing, dynamic, and often unsafe conditions. This work requires skilled professionals who must have appropriate individual knowledge, a broad spectrum of technical and nontechnical skills, and excellent personal abilities, which enable them to collaborate as a team. Collaboration and coordination cannot be trained in a classroom or during a lecture, and therefore the need for interactive learning methods with the provision of feedback is high. Simulation is an excellent teaching method for developing competencies that enables team members to implement the know-how, what needs to be done, into effective team action, even under the most adverse and confusing situation of a medical emergency.
Article
Background Best practices in simulation-based education indicate that sessions should be facilitated by a trained simulation instructor. A large academic Canadian hospital required instructors to facilitate simulation-based education in a corporate nursing orientation program. Method The Plan-Do-Study-Act framework was used to develop a simulation facilitation curriculum using mixed methods to evaluate the program. Results The mean scores on the Observational Structured Assessment of Debriefing tool were greater than the target of three out of five. Themes identified from the qualitative analysis focused on a safe learning environment and use of a framework for debriefing. Conclusion A simulation faculty development curriculum for nurse educators with support from peers improved the quality of debriefing in a corporate nursing orientation program.
Article
Background The expertise of the simulation instructor is a key factor in the success of simulation-based education. This study aimed to identify the perceived competence and training priorities of Korean nursing simulation instructors using importance-performance analysis. Methods A cross-sectional descriptive survey design was used. A Web-based survey was conducted with 100 nursing simulation instructors from seven provinces of the Republic of Korea. A 26-item simulation instructor competence importance-performance scale was developed and used to identify the instructors' perceived competence and training priorities. Results Exploratory factor analysis of the scale comprised five factors that explained 66.5% of the total variance: (a) professional foundations, (b) assessment and evaluation, (c) simulation implementation, (d) reflection and facilitation, and (e) principles of simulation education. There were significant mean differences between importance and performance overall and for all five factors, with reflection and facilitation showing the largest difference. Regardless of instructor expertise level, an in-depth discussion during the debriefing was the top training priority for novice, advanced beginner, and competent instructors in the importance-performance analysis. Conclusions Structured simulation instructor training should be developed and implemented to target simulation instructor competence improvement according to training priorities customized by expertise level.
Article
Full-text available
Several recent literature reviews have been published with the aim to determine how to optimise a debriefing. A main element found in these reviews was the importance of structuring the debriefing. Within the steps usually outlined in the debriefing, the description phase allows participants to describe their recollections and establish a shared mental model of what happened during the simulation. The description phase is used in many debriefing models but how to realise this description remains unclear. We provide an original tool to ensure a highly structured description phase: the “Timeline Debriefing Tool”. The Timeline Debriefing Tool, or TDT, is constructed on visual support such as a whiteboard or a flipchart. It allows for a clear description phase, makes the process more dynamic, promotes exchanges between participants and establishes a clear and shared vision of the simulation in visual support which can be used by the instructor in the analysis phase. Moreover, the timeline allows participants to discover their performance gaps by themselves, thus beginning deeper cognitive processing in the participants’ mind and promoting reflection in the analysis phase.
Article
Full-text available
Background Simulation is an important educational tool to improve medical training and patient safety. Debriefing after simulation is crucial to maximise learning and to translate the lessons learnt to improve real clinical performance, and thus to reduce medical error. Currently there are few tools to improve performance debriefing and learning after simulations of serious paediatric situations. Purpose The purpose of this study was to develop a tool to guide and assess debriefings after simulations of serious paediatric situations, applying the current evidence base and user-based research. Study design A literature review and semistructured interviews (performed in 2010) to identify important features of a paediatric simulation debriefing. Emergent theme analysis was used to identify key components of an effective debriefing which could be used as a tool for assessing debriefing effectiveness. Results The literature review identified 34 relevant studies. Interviews were carried out with 16 paediatricians, both debriefing facilitators and learners. In total, 307 features of a debriefing were identified. These were grouped into eight dimensions representing the key components of a paediatric debriefing: the facilitator's approach, learning environment, engagement of learners, reaction, descriptive reflection, analysis, diagnosis and application. These eight dimensions were used to create a tool, the Objective Structured Assessment of Debriefing (OSAD). Each dimension can be scored on a five-point Likert scale containing descriptions for scores 1, 3 and 5 to serve as anchors and aid scoring. Conclusions The study identified the important features of a paediatric simulation debriefing, which were developed into the OSAD tool. OSAD offers a structured approach to paediatric simulation debriefing, and is based on evidence from published literature and views of simulation facilitators and learners. 
OSAD may be used as a guide or assessment tool to improve the quality of debriefing after paediatric simulation.
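The OSAD rates eight dimensions on a five-point Likert scale, which maps naturally onto a small data structure. A minimal sketch (the dimension names are taken from the abstract; the summing logic is an illustrative assumption, not the published scoring guidance):

```python
# The eight OSAD dimensions named in the abstract
OSAD_DIMENSIONS = [
    "facilitator's approach", "learning environment", "engagement of learners",
    "reaction", "descriptive reflection", "analysis", "diagnosis", "application",
]

def score_debriefing(ratings):
    """Sum 1-5 Likert ratings across all eight dimensions (range 8-40).

    Totalling is a hypothetical convenience, not the published tool's guidance.
    """
    assert set(ratings) == set(OSAD_DIMENSIONS), "rate every dimension"
    assert all(1 <= v <= 5 for v in ratings.values()), "Likert scale is 1-5"
    return sum(ratings.values())

ratings = {d: 4 for d in OSAD_DIMENSIONS}  # hypothetical uniform ratings
total = score_debriefing(ratings)
```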
Article
Full-text available
Over the past two decades, there has been an exponential and enthusiastic adoption of simulation in healthcare education internationally. Medicine has learned much from professions that have established programs in simulation for training, such as aviation, the military and space exploration. Increased demands on training hours, limited patient encounters, and a focus on patient safety have led to a new paradigm of education in healthcare that increasingly involves technology and innovative ways to provide a standardized curriculum. A robust body of literature is growing, seeking to answer the question of how best to use simulation in healthcare education. Building on the groundwork of the Best Evidence in Medical Education (BEME) Guide on the features of simulators that lead to effective learning, this current Guide provides practical guidance to aid educators in effectively using simulation for training. It is a selective review to describe best practices and illustrative case studies. This Guide is the second part of a two-part AMEE Guide on simulation in healthcare education. The first Guide focuses on building a simulation program, and discusses more operational topics such as types of simulators, simulation center structure and set-up, fidelity management, and scenario engineering, as well as faculty preparation. This Guide will focus on the educational principles that lead to effective learning, and include topics such as feedback and debriefing, deliberate practice, and curriculum integration - all central to simulation efficacy. The important subjects of mastery learning, range of difficulty, capturing clinical variation, and individualized learning are also examined. Finally, we discuss approaches to team training and suggest future directions. Each section follows a framework of background and definition, its importance to effective use of simulation, practical points with examples, and challenges generally encountered. 
Simulation-based healthcare education has great potential for use throughout the healthcare education continuum, from undergraduate to continuing education. It can also be used to train a variety of healthcare providers in different disciplines from novices to experts. This Guide aims to equip healthcare educators with the tools to use this learning modality to its full capability.
Article
Full-text available
Background: This pilot study evaluated the effect of videotape-facilitated human patient simulator (HPS) practice and guidance on clinical performance indicators. Method: Nursing and nurse anesthetist students in the treatment group (n = 20) participated in HPS practice and guidance using videotape-facilitated debriefing, and the control group (n = 20) participated in HPS practice and guidance using oral debriefing alone. Results: Students in the intervention group were significantly more likely to demonstrate desirable behaviors concerning patient identification, team communication, and vital signs. The role students played in the simulation significantly impacted their performance. When scores of both the intervention and control groups were combined, team leaders, airway managers, and nurse anesthetists had higher mean total performance scores than crash cart managers, recorders, or medication nurses. Conclusion: Video-facilitated simulation feedback is potentially a useful tool in increasing desirable clinical behaviors in a simulated environment.
Article
Full-text available
Although feedback has been identified as a key instructional feature in simulation based medical education (SBME), we remain uncertain as to the magnitude of its effectiveness and the mechanisms by which it may be effective. We employed a meta-analysis and critical narrative synthesis to examine the effectiveness of feedback for SBME procedural skills training and to examine how it works in this context. Our results demonstrate that feedback is moderately effective during procedural skills training in SBME, with a pooled effect size favoring feedback for skill outcomes of 0.74 (95 % CI 0.38-1.09; p < .001). Terminal feedback appears more effective than concurrent feedback for novice learners' skill retention. Multiple sources of feedback, including instructor feedback, lead to short-term performance gains although data on long-term effects is lacking. The mechanism by which feedback may be operating is consistent with the guidance hypothesis, with more research needed to examine other mechanisms such as cognitive load theory and social development theory.
Article
Full-text available
Background: Among nonmedical educators, longitudinal faculty development programs most effectively increase teachers' abilities. Despite most medical settings providing little reinforcement for new instructional skills, teaching improvement programs infrequently have explicit ongoing activities. We carried out a longitudinal program for medical educators designed to reinforce and firmly establish new teaching skills. Description: We conducted a longitudinal (18 months of biweekly 1-hr meetings) faculty development program. Its activities followed an initial structured seminar series, and the ongoing meetings involved reviewing videotapes of participants' teaching, enacting and debriefing role-plays, and a modified Balint group for medical educators. Evaluation: We assessed the program's process and outcomes using attendance, self-reported teaching behaviors, perceived program usefulness, educational administrative responsibilities, and qualitative analysis of audiotapes and session notes. Conclusions: Participants maintained high attendance during 18 months of meetings. Ratings of usefulness were high, comparable to other faculty development activities with established utility, and qualitative data support unique benefits of the ongoing meetings. The longitudinal component built on the initial seminar series; it seemed to enhance collegial support, to allow observation of instructional outcomes to reinforce new instructional skills, and to foster greater involvement in the institution's teaching activities. Teaching improvement programs for physician educators have taken several forms, from workshops and expert consultations to month-long minisabbaticals. However, most are single or sequenced interventions, without an explicit long-term component. We present the structure of an 18-month program for clinician teachers and report observations supporting the program's utility.
We hope that this descriptive study promotes greater attention to and subsequent prospective research of longitudinal faculty development.
Article
Full-text available
Background: Although technology-enhanced simulation is increasingly used in health professions education, features of effective simulation-based instructional design remain uncertain. Aims: Evaluate the effectiveness of instructional design features through a systematic review of studies comparing different simulation-based interventions. Methods: We systematically searched MEDLINE, EMBASE, CINAHL, ERIC, PsycINFO, Scopus, key journals, and previous review bibliographies through May 2011. We included original research studies that compared one simulation intervention with another and involved health professions learners. Working in duplicate, we evaluated study quality and abstracted information on learners, outcomes, and instructional design features. We pooled results using random effects meta-analysis. Results: From a pool of 10 903 articles we identified 289 eligible studies enrolling 18 971 trainees, including 208 randomized trials. Inconsistency was usually large (I² > 50%). For skills outcomes, pooled effect sizes (positive numbers favoring the instructional design feature) were 0.68 for range of difficulty (20 studies; p < 0.001), 0.68 for repetitive practice (7 studies; p = 0.06), 0.66 for distributed practice (6 studies; p = 0.03), 0.65 for interactivity (89 studies; p < 0.001), 0.62 for multiple learning strategies (70 studies; p < 0.001), 0.52 for individualized learning (59 studies; p < 0.001), 0.45 for mastery learning (3 studies; p = 0.57), 0.44 for feedback (80 studies; p < 0.001), 0.34 for longer time (23 studies; p = 0.005), 0.20 for clinical variation (16 studies; p = 0.24), and -0.22 for group training (8 studies; p = 0.09). Conclusions: These results confirm quantitatively the effectiveness of several instructional design features in simulation-based education.
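The pooled effect sizes and the I² inconsistency statistic above come from standard random-effects meta-analysis. A minimal sketch of inverse-variance pooling with the DerSimonian-Laird between-study variance estimate (the per-study effects and variances below are hypothetical, not the review's data):

```python
def random_effects_pool(effects, variances):
    """DerSimonian-Laird random-effects pooled effect and I-squared (%)."""
    w = [1 / v for v in variances]                     # fixed-effect weights
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    # Cochran's Q measures excess between-study variation
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
    k = len(effects)
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)                 # between-study variance
    w_re = [1 / (v + tau2) for v in variances]         # random-effects weights
    pooled = sum(wi * e for wi, e in zip(w_re, effects)) / sum(w_re)
    i2 = max(0.0, (q - (k - 1)) / q) * 100 if q > 0 else 0.0
    return pooled, i2

effects = [0.9, 0.4, 0.7, 0.2]       # hypothetical standardized mean differences
variances = [0.05, 0.04, 0.06, 0.05]
pooled, i2 = random_effects_pool(effects, variances)
```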
Article
Full-text available
INTRODUCTION: This study examined the reliability of the scores of an assessment instrument, the Debriefing Assessment for Simulation in Healthcare (DASH), in evaluating the quality of health care simulation debriefings. The secondary objective was to evaluate whether the instrument's scores demonstrate evidence of validity. METHODS: Two aspects of reliability were examined, interrater reliability and internal consistency. To assess interrater reliability, intraclass correlations were calculated for 114 simulation instructors enrolled in webinar training courses in the use of the DASH. The instructors reviewed a series of 3 standardized debriefing sessions. To assess internal consistency, Cronbach α was calculated for this cohort. Finally, 1 measure of validity was examined by comparing the scores across 3 debriefings of different quality. RESULTS: Intraclass correlation coefficients for the individual elements were predominantly greater than 0.6. The overall intraclass correlation coefficient for the combined elements was 0.74. Cronbach α was 0.89 across the webinar raters. There were statistically significant differences among the ratings for the 3 standardized debriefings (P < 0.001). CONCLUSIONS: The DASH scores showed evidence of good reliability and preliminary evidence of validity. Additional work will be needed to assess the generalizability of the DASH based on the psychometrics of DASH data from other settings.
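The internal-consistency result above (Cronbach α = 0.89) follows the standard formula α = k/(k−1) · (1 − Σ item variances / variance of total scores). A minimal sketch with hypothetical rater-by-element scores (not the study's DASH data):

```python
def cronbach_alpha(scores):
    """Cronbach's alpha for a raters-by-items score matrix."""
    k = len(scores[0])                       # number of items (e.g. DASH elements)

    def var(xs):                             # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_vars = [var([row[j] for row in scores]) for j in range(k)]
    total_var = var([sum(row) for row in scores])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

# Hypothetical ratings: 4 raters x 3 elements on a 7-point scale
scores = [[6, 5, 6], [4, 4, 5], [7, 6, 6], [3, 4, 4]]
alpha = cronbach_alpha(scores)
```

Higher alpha indicates that the elements vary together across raters, which is the sense in which the DASH elements were internally consistent.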
Article
Peer observation of teaching (PoT) involves observers providing descriptive feedback to their peers on learning and teaching practice as a means of improving the quality of teaching. This study employed and assessed peer observation as a constructive, developmental process for members of a Pediatric Teaching Faculty. It describes how peer observation was implemented as part of a teaching faculty development program and how it was perceived by teachers. The PoT process was divided into 4 stages: pre-observation meeting, observation, post-observation feedback, and reflection. Particular care was taken to ensure that teachers understood that the observation and feedback were a developmental and not an evaluative process. Twenty teachers had their teaching peer observed by trained faculty members and gave an e-mail 'sound-bite' of their perceptions of the process. Teaching activities included lectures, problem-based learning, small group teaching, case-based teaching and ward-based teaching sessions. Teachers were given detailed verbal and written feedback based on the observer's and students' observations. Teachers perceived PoT as useful and relevant to their teaching practice. They valued receiving feedback and viewed PoT as an opportunity for insight and reflection. The process was viewed as non-threatening, and teachers thought that PoT enhanced the quality of their teaching, promoted professional development and was critical for faculty development. This study demonstrated that PoT can be used in a constructive way to improve course content and delivery, to support and encourage medical teachers, and to reinforce good teaching.
Article
Peer observation of teaching (PoT) is most commonly done as a way of evaluating educators in lecture or small group teaching. Teaching in the clinical environment is a complex and hectic endeavor that requires nimble and innovative teaching on a daily basis. Most junior faculty start their careers with little formal training in education and with limited opportunity to be observed or to observe more experienced faculty. Formal PoT would potentially ameliorate these challenges. This article describes a collaborative peer observation process that a group of 11 clinician educators is using as a longitudinal faculty development program. Results: The process described in this article provides detailed and specific teaching feedback for the observed teaching attending while prompting the observing faculty to reflect on their own teaching style and to borrow effective teaching techniques from the observation. This article provides detailed examples from written feedback obtained during collaborative peer observation to emphasize the richness of this combined experience.
Article
Review period: 1969 to 2003 (34 years). Simulations are now in widespread use in medical education and medical personnel evaluation. Outcomes research on the use and effectiveness of simulation technology in medical education is scattered, inconsistent and varies widely in methodological rigor and substantive focus. The aim was to review and synthesize existing evidence in educational science that addresses the question, 'What are the features and uses of high-fidelity medical simulations that lead to most effective learning?'. The search covered five literature databases (ERIC, MEDLINE, PsycINFO, Web of Science and Timelit) and employed 91 single search terms and concepts and their Boolean combinations. Hand searching, Internet searches and attention to the 'grey literature' were also used, so as to perform the most thorough literature search possible of peer-reviewed publications and reports in the unpublished literature that have been judged for academic quality. Four screening criteria were used to reduce the initial pool of 670 journal articles to a focused set of 109 studies: (a) elimination of review articles in favor of empirical studies; (b) use of a simulator as an educational assessment or intervention with learner outcomes measured quantitatively; (c) comparative research, either experimental or quasi-experimental; and (d) research that involves simulation as an educational intervention. Data were extracted systematically from the 109 eligible journal articles by independent coders. Each coder used a standardized data extraction protocol. Qualitative data synthesis and tabular presentation of research methods and outcomes were used. Heterogeneity of research designs, educational interventions, outcome measures and timeframe precluded data synthesis using meta-analysis. HEADLINE RESULTS: Coding accuracy for features of the journal articles is high. The extant quality of the published research is generally weak.
The weight of the best available evidence suggests that high-fidelity medical simulations facilitate learning under the right conditions. These include the following: providing feedback--51 (47%) journal articles reported that educational feedback is the most important feature of simulation-based medical education; repetitive practice--43 (39%) journal articles identified repetitive practice as a key feature involving the use of high-fidelity simulations in medical education; curriculum integration--27 (25%) journal articles cited integration of simulation-based exercises into the standard medical school or postgraduate educational curriculum as an essential feature of their effective use; range of difficulty level--15 (14%) journal articles address the importance of the range of task difficulty level as an important variable in simulation-based medical education; multiple learning strategies--11 (10%) journal articles identified the adaptability of high-fidelity simulations to multiple learning strategies as an important factor in their educational effectiveness; capture clinical variation--11 (10%) journal articles cited simulators that capture a wide variety of clinical conditions as more useful than those with a narrow range; controlled environment--10 (9%) journal articles emphasized the importance of using high-fidelity simulations in a controlled environment where learners can make, detect and correct errors without adverse consequences; individualized learning--10 (9%) journal articles highlighted the importance of having reproducible, standardized educational experiences where learners are active participants, not passive bystanders; defined outcomes--seven (6%) journal articles cited the importance of having clearly stated goals with tangible outcome measures that will more likely lead to learners mastering skills; simulator validity--four (3%) journal articles provided evidence for the direct correlation of simulation validity with effective learning. 
While research in this field needs improvement in terms of rigor and quality, high-fidelity medical simulations are educationally effective and simulation-based education complements medical education in patient care settings.
Article
Recent literature describes "cognitive dispositions to respond" (CDRs) that may lead physicians to err in their clinical reasoning. To assess learner perception of high-fidelity mannequin-based simulation and debriefing to improve understanding of CDRs. Emergency medicine (EM) residents were exposed to two simulations designed to bring out the CDR concept known as "vertical line failure." Residents were then block-randomized to a technical/knowledge debriefing covering the medical subject matter or a CDR debriefing covering vertical line failure. They then completed a written survey and were interviewed by an ethnographer. Four investigators blinded to group assignment reviewed the interview transcripts and coded the comments. The comments were qualitatively analyzed and those upon which three out of four raters agreed were quantified. A random sample of 84 comments was assessed for interrater reliability using a kappa statistic. Sixty-two residents from two EM residencies participated. Survey results were compared by technical (group A, n = 32) or cognitive (group B, n = 30) debriefing. There were 255 group A and 176 group B comments quantified. The kappa statistic for coding the interview comments was 0.42. The CDR debriefing group made more, and qualitatively richer, comments regarding CDR concepts. The technical debriefing group made more comments on the medical subjects of cases. Both groups showed an appreciation for the risk of diagnostic error. Survey data indicate that technical debriefing was better received than cognitive debriefing. The authors theorize that an understanding of CDRs can be facilitated through simulation training based on the analysis of interview comments.
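The interrater reliability of the comment coding above is reported as a kappa statistic. A minimal sketch of Cohen's kappa for two raters (the standard chance-corrected agreement formula; the study's exact multi-rater procedure may have differed):

```python
def cohens_kappa(r1, r2):
    """Cohen's kappa: chance-corrected agreement between two raters.

    r1, r2: equal-length lists of categorical codes assigned by each rater.
    """
    n = len(r1)
    cats = set(r1) | set(r2)
    # Observed proportion of exact agreements
    p_obs = sum(a == b for a, b in zip(r1, r2)) / n
    # Agreement expected by chance from each rater's marginal code frequencies
    p_exp = sum((r1.count(c) / n) * (r2.count(c) / n) for c in cats)
    return (p_obs - p_exp) / (1 - p_exp)
```

A kappa of 0.42, as reported here, is conventionally read as moderate agreement: clearly better than chance, but with substantial coder disagreement remaining.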
Article
In medical education programmes which rely on clinical teachers spread across diverse sites, the application of peer observation of teaching offers the potential of both supporting teachers and maintaining quality. This paper reports on a questionnaire survey carried out with general practitioner (GP) teachers of medical undergraduate students from King's College London School of Medicine at Guy's, King's College and St Thomas' Hospitals. The aim of the study was to determine GP teachers' views on a proposed programme of peer observation of their teaching. The majority of GP teachers identified benefits of the proposed scheme with 69% saying it would help improve the education of future doctors. However, despite seeing the benefits, less than half wished to take part in the programme. Two thirds cited time and paperwork as major disincentives to taking part and 62% said that they felt it would make them feel under scrutiny. No associations were found between measures of workload and willingness to take part. This suggests that a fundamental fear of scrutiny and criticism may be the main hurdle to be overcome in implementing the scheme. Imposing peer observation on GP teachers in the form proposed could create suspicion and distance between the university department and practice-based GP teachers and may even result in a loss of teachers. The introduction of peer observation is more likely to be successful if GPs' apprehensions are addressed. Using peer observation to strengthen the process of quality assurance may undermine its role in the support and development of clinical teachers.
Article
Preparing healthcare professionals for teaching is regarded as essential to enhancing teaching effectiveness. Although many reports describe various faculty development interventions, there is a paucity of research demonstrating their effectiveness. To synthesize the existing evidence that addresses the question: "What are the effects of faculty development interventions on the knowledge, attitudes and skills of teachers in medical education, and on the institutions in which they work?" The search, covering the period 1980-2002, included three databases (Medline, ERIC and EMBASE) and used the keywords: staff development; in-service training; medical faculty; faculty training/development; continuing medical education. Manual searches were also conducted. Articles with a focus on faculty development to improve teaching effectiveness, targeting basic and clinical scientists, were reviewed. All study designs that included outcome data beyond participant satisfaction were accepted. From an initial 2777 abstracts, 53 papers met the review criteria. Data were extracted by six coders, using the standardized BEME coding sheet, adapted for our use. Two reviewers coded each study and coding differences were resolved through discussion. Data were synthesized using Kirkpatrick's four levels of educational outcomes. Findings were grouped by type of intervention and described according to levels of outcome. In addition, 8 high-quality studies were analysed in a 'focused picture'. The majority of the interventions targeted practicing clinicians. All of the reports focused on teaching improvement and the interventions included workshops, seminar series, short courses, longitudinal programs and 'other interventions'. The study designs included 6 randomized controlled trials and 47 quasi-experimental studies, of which 31 used a pre-test-post-test design. 
Despite methodological limitations, the faculty development literature tends to support the following outcomes: Overall satisfaction with faculty development programs was high. Participants consistently found programs acceptable, useful and relevant to their objectives. Participants reported positive changes in attitudes toward faculty development and teaching. Participants reported increased knowledge of educational principles and gains in teaching skills. Where formal tests of knowledge were used, significant gains were shown. Changes in teaching behavior were consistently reported by participants and were also detected by students. Changes in organizational practice and student learning were not frequently investigated. However, reported changes included greater educational involvement and establishment of collegiate networks. Key features of effective faculty development contributing to effectiveness included the use of experiential learning, provision of feedback, effective peer and colleague relationships, well-designed interventions following principles of teaching and learning, and the use of a diversity of educational methods within single interventions. Methodological issues: More rigorous designs and a greater use of qualitative and mixed methods are needed to capture the complexity of the interventions. Newer methods of performance-based assessment, utilizing diverse data sources, should be explored, and reliable and valid outcome measures should be developed. The maintenance of change over time should also be considered, as should process-oriented studies comparing different faculty development strategies. Faculty development activities appear highly valued by participants, who also report changes in learning and behavior. Notwithstanding the methodological limitations in the literature, certain program characteristics appear to be consistently associated with effectiveness. 
Further research to explore these associations and document outcomes, at the individual and organizational level, is required.
Article
Four groups of faculty development practices are identified based on their use among 756 colleges and universities: traditional practices, instructional assistance practices, those that emphasize the assessment of faculty, and those that involve many faculty in improvement activities. Practices judged effective, as well as the organization and funding of programs, are also discussed.
Statement: We describe an integrated conceptual framework for a blended approach to debriefing called PEARLS [Promoting Excellence And Reflective Learning in Simulation]. We provide a rationale for scripted debriefing and introduce a PEARLS debriefing tool designed to facilitate implementation of the new framework. The PEARLS framework integrates 3 common educational strategies used during debriefing, namely, (1) learner self-assessment, (2) facilitating focused discussion, and (3) providing information in the form of directive feedback and/or teaching. The PEARLS debriefing tool incorporates scripted language to guide the debriefing, depending on the strategy chosen. The PEARLS framework and debriefing script fill a need for many health care educators learning to facilitate debriefings in simulation-based education. PEARLS offers a structured framework adaptable for debriefing simulations with a variety of goals, including clinical decision making, improving technical skills, teamwork training, and interprofessional collaboration.
Book
The fully updated Crisis Management in Anesthesiology continues to provide insights on the latest theories, principles, and practices in anesthesiology. From anesthesiologists and nurse anesthetists to emergency physicians and residents, this medical reference book will effectively prepare you to handle any critical incident during anesthesia. Identify and respond to a broad range of life-threatening situations with the updated Catalog of Critical Incidents, which outlines what may happen during surgery and details the steps necessary to respond to and resolve the crisis. React quickly to a range of potential threats with an added emphasis on simulation of managing critical incidents. Focus on the core knowledge that serves as a useful review. Explore new topics in the ever-expanding anesthesia practice environment with a detailed chapter on debriefing.
Article
Objectives: Debriefing is a common feature of technology-enhanced simulation (TES) education. However, evidence for its effectiveness remains unclear. We sought to characterise how debriefing is reported in the TES literature, identify debriefing features that are associated with improved outcomes, and evaluate the effectiveness of debriefing when combined with TES. Methods: We systematically searched databases, including MEDLINE, EMBASE and Scopus, and reviewed previous bibliographies for original comparative studies investigating the use of TES with debriefing in training health care providers. Reviewers, in duplicate, evaluated study quality and abstracted information on instructional design, debriefing and outcomes. Effect sizes (ES) were pooled using random-effects meta-analysis. Results: From 10 903 potentially eligible studies, we identified 177 studies (11 511 learners) that employed debriefing as part of TES. Key characteristics of debriefing (e.g. duration, educator presence and characteristics, content, structure/method, timing, use of video) were usually incompletely reported. A meta-analysis of four studies demonstrated that video-assisted debriefing has negligible and non-significant effects for time skills (ES = 0.10) compared with non-video-assisted debriefing. Meta-analysis demonstrated non-significant effects in favour of expert modelling with short debriefing in comparison with long debriefing (ES range = 0.21-0.74). Among studies comparing terminal with concurrent debriefing, results were variable depending on outcome measures and the context of training (e.g. medical resuscitation versus technical skills). Eight additional studies revealed insight into the roles of other debriefing-related factors (e.g. multimedia debriefing, learner-led debriefing, debriefing duration, content of debriefing). Among studies that compared simulation plus debriefing with no intervention, pooled ESs were favourable for all outcomes (ES range = 0.28-2.16). 
Conclusions: Limited evidence suggests that video-assisted debriefing yields outcomes similar to those of non-video-assisted debriefing. Other debriefing design features show mixed or non-significant results. As debriefing characteristics are usually incompletely reported, future debriefing research should describe all the key debriefing characteristics along with their associated descriptors.
Article
Importance Resuscitation training programs use simulation and debriefing as an educational modality with limited standardization of debriefing format and content. Our study attempted to address this issue by using a debriefing script to standardize debriefings. Objective To determine whether use of a scripted debriefing by novice instructors and/or simulator physical realism affects knowledge and performance in simulated cardiopulmonary arrests. Design Prospective, randomized, factorial study design. Setting The study was conducted from 2008 to 2011 at 14 Examining Pediatric Resuscitation Education Using Simulation and Scripted Debriefing (EXPRESS) network simulation programs. Interprofessional health care teams participated in 2 simulated cardiopulmonary arrests, before and after debriefing. Participants We randomized 97 participants (23 teams) to nonscripted low-realism; 93 participants (22 teams) to scripted low-realism; 103 participants (23 teams) to nonscripted high-realism; and 94 participants (22 teams) to scripted high-realism groups. Intervention Participants were randomized to 1 of 4 arms: permutations of scripted vs nonscripted debriefing and high-realism vs low-realism simulators. Main Outcomes and Measures Percentage difference (0%-100%) in multiple choice question (MCQ) test (individual scores), Behavioral Assessment Tool (BAT) (team leader performance), and the Clinical Performance Tool (CPT) (team performance) scores postintervention vs preintervention comparison (PPC). Results There was no significant difference at baseline in nonscripted vs scripted groups for MCQ (P = .87), BAT (P = .99), and CPT (P = .95) scores. Scripted debriefing showed greater improvement in knowledge (mean [95% CI] MCQ-PPC, 5.3% [4.1%-6.5%] vs 3.6% [2.3%-4.7%]; P = .04) and team leader behavioral performance (median [interquartile range (IQR)] BAT-PPC, 16% [7.4%-28.5%] vs 8% [0.2%-31.6%]; P = .03). 
Their improvement in clinical performance during simulated cardiopulmonary arrests was not significantly different (median [IQR] CPT-PPC, 7.9% [4.8%-15.1%] vs 6.7% [2.8%-12.7%], P = .18). Level of physical realism of the simulator had no independent effect on these outcomes. Conclusions and Relevance The use of a standardized script by novice instructors to facilitate team debriefings improves acquisition of knowledge and team leader behavioral performance during subsequent simulated cardiopulmonary arrests. Implementation of debriefing scripts in resuscitation courses may help to improve learning outcomes and standardize delivery of debriefing, particularly for novice instructors.
Article
Summary statement: Postsimulation debriefing is a critical component of effective learning in simulation-based health care education. Numerous formats in which to conduct the debriefing have been proposed. In this report, we describe the adaptation of the US Army's After-Action Review (AAR) debriefing format for postsimulation debriefing in health care. The Army's AAR format is based on sound educational theory and has been used with great success in the US Army and civilian organizations for decades. Debriefing using the health care simulation AAR process requires planning, preparation, and follow-up. Conducting a postsimulation debriefing using the health care simulation AAR debriefing format includes 7 sequential steps as follows: (1) define the rules of the debriefing, (2) explain the learning objectives of the simulation, (3) benchmark performance, (4) review what was supposed to happen during the simulation, (5) identify what actually happened, (6) examine why events occurred the way they did, and (7) formalize learning by reviewing with the group what went well, what did not go well, and what they would do differently if faced with a similar situation in real life. We feel that the use of the health care simulation AAR debriefing format provides a structured and supported method to conduct an effective postsimulation debriefing, with a focus on the learning objectives and reliance on preidentified performance standards.
Article
Simulation instructors often feel caught in a task-versus-relationship dilemma. They must offer clear feedback on learners' task performance without damaging their relationship with those learners, especially in formative simulation settings. Mastering the skills to resolve this dilemma is crucial for simulation faculty development. We conducted a case study of a debriefer stuck in this task-versus-relationship dilemma. Data: The "2-column case" captures debriefing dialogue and instructor's thoughts and feelings or the "subjective experience." Analysis: The "learning pathways grid" guides a peer group of faculty in a step-by-step, retrospective analysis of the debriefing. The method uses vivid language to highlight the debriefer's dilemmas and how to surmount them. The instructor's initial approach to managing the task-versus-relationship dilemma included (1) assuming that honest critiques will damage learners, (2) using vague descriptions of learner actions paired with guess-what-I-am-thinking questions, and (3) creating a context she worried would leave learners feeling neither safe nor clear how they could improve. This case study analysis identified things the instructor could do to be more effective including (1) making generous inferences about the learners' qualities, (2) normalizing the challenges posed by the simulation, (3) assuming there are different understandings of what it means to be a team. There are key assumptions and ways of interacting that help instructors resolve the task-versus-relationship dilemma. The instructor can then provide honest feedback in a rigorous yet empathic way to help sustain good or improve suboptimal performance in the future.
Article
The literature on faculty development activities for family medicine educators is reviewed and presented in three sections: (a) overview of faculty development in family medicine, (b) fellowships, and (c) workshops and other short-term programs. Although a wide variety of formats have been utilized for faculty development purposes in family medicine, most faculty development programs have been evaluated primarily on participant satisfaction rather than on observed changes in participant behavior when in the faculty role. As funds for family medicine faculty development programs disappear or decline, new strategies for maintaining the quantity and quality of family medicine educators must be developed. We suggest an agenda for future faculty development activities in family medicine. These activities include (a) initiation of faculty development training, including teaching and research skills, during residency training programs, (b) development of fourth-year clinical fellow positions supported by clinical revenue, (c) development of 2-year research fellowships supported by clinical revenue and research grants, and (d) more rigorous evaluation of faculty development fellowships and workshops based on outcome measures rather than strictly on participant satisfaction.
Article
Peer observation of teaching is currently receiving widespread interest, and is being implemented in numerous institutions in all sectors of education. There appears to be, however, some uncertainty or controversy over the rationale for this and the role of the observer. In the light of this, we should be very clear about exactly what our objectives are for the implementation of peer observation, and the best way to achieve these, before espousing a potentially divisive and detrimental procedure. The purpose of this paper, therefore, is to examine some popular models of peer observation and their potential drawbacks, and to put the case for the implementation of a more active and reflective model, where the focus is less on the observed and more on the active self‐development of the observer.
Article
Purpose: To (1) provide a detailed account of the nature and scope of faculty development (FD) programs in medical education, (2) assess the quality of FD studies, and (3) identify in what areas and through what means future research can purposefully build on existing knowledge. Method: The authors searched MEDLINE, CINAHL, and ERIC for articles reporting evaluations of FD initiatives published between 1989 and 2010. They applied standard systematic review procedures for sifting abstracts, scrutinizing full texts, and abstracting data, including program characteristics, evaluation methods, and outcomes. They used a modified Kirkpatrick model to guide their data abstraction. Results: The authors included 22 articles reporting on 21 studies in their review. The most common program characteristics included a series/longitudinal format, intended for individuals, and offered to physicians only. Although the most common aim was to improve teaching effectiveness, several programs had multiple aims, including scholarship and leadership. Program evaluation focused on quantitative approaches. A number of studies employed longitudinal designs and included some follow-up component. Surveys were the most popular data collection method, participants the most common data source, and self-reported behavior changes the most commonly reported outcome. Conclusions: Although the authors' findings showed some recent expansion in the scope of the FD literature, they also highlighted areas that require further focus and growth. Future research should employ more rigorous evaluation methods, explore the role of interprofessional teams and communities of practice in the workplace, and address how different organizational and contextual factors shape the success of FD programs.
Article
BACKGROUND: Improving patient safety by training teams to successfully manage emergencies is a major concern in healthcare. Most current trainings use simulation of emergency situations to practice and reflect on relevant clinical and behavioural skills. We developed TeamGAINS, a hybrid, structured debriefing tool for simulation-based team trainings in healthcare that integrates three different debriefing approaches: guided team self-correction, advocacy-inquiry and systemic-constructivist techniques. METHODS: TeamGAINS was administered during simulation-based trainings for clinical and behavioural skills for anaesthesia staff. One of the four daily scenarios involved all trainees, whereas the remaining three scenarios each involved only two trainees with the others observing them. Training instructors were senior anaesthesiologists and psychologists. To determine debriefing quality, we used a post-test-only (debriefing quality) and a pre-post-test (psychological safety, leader inclusiveness), no-control-group design. After each debriefing all trainees completed a self-report debriefing quality scale which we developed based on the Debriefing Assessment for Simulation in Healthcare and the Observational Structured Assessment of Debriefing. Perceived psychological safety and leader inclusiveness were measured before trainees' first (premeasure) and after their last debriefing (postmeasure) at which time trainees' reactions to the overall training were measured as well. RESULTS: Four senior anaesthetists, 29 residents and 28 nurses participated in a total of 40 debriefings resulting in 235 evaluations. Utility of debriefings was evaluated as highly positive. Pre-post comparisons revealed that psychological safety and leader inclusiveness significantly increased after the debriefings. CONCLUSIONS: The results indicate that TeamGAINS could provide a useful debriefing tool for training anaesthesia staff on all levels of work experience. 
By combining state-of-the-art debriefing methods and integrating systemic-constructivist techniques, TeamGAINS has the potential to allow for a surfacing, reflecting on and changing of the dynamics of team interactions. Further research is necessary to systematically compare the effects of TeamGAINS' components on the debriefing itself and on trainees' changes in attitudes and behaviours.
Article
Objectives: To explore the current status of performance feedback (debriefing) in the operating room and to develop and evaluate an evidence-based, user-informed intervention termed "SHARP" to improve debriefing in surgery. Background: Effective debriefing is a key educational technique for optimizing learning in surgical settings. However, there is a lack of a debriefing culture within surgery. Few studies have prospectively evaluated educational interventions to improve the quality and quantity of performance feedback in surgery. Methods: This was a prospective pre- and post-study of 100 cases involving 22 trainers (attendings) and 30 surgical residents (postgraduate years 3-8). A trained researcher assessed the quality of debriefings provided to the trainee using the validated Objective Structured Assessment of Debriefing (OSAD) tool alongside ethnographic observation. Following the first 50 cases, an educational intervention termed "SHARP" was introduced and measures repeated for a further 50 cases. User satisfaction with SHARP was assessed via questionnaire. Twenty percent of the cases were observed independently by a second researcher to test interrater reliability. Results: Interrater reliability for OSAD was excellent (ICC = 0.994). Objective scores of debriefing (OSAD) improved significantly after the SHARP intervention: median pre = 19 (range, 8-31); median post = 33 (range, 26-40), P < 0.001. Strong correlations between observer (OSAD) and trainee rating of debriefing were obtained (median ρ = 0.566, P < 0.01). Ethnographic observations also supported a significant improvement in both quality and style of debriefings. Users reported high levels of satisfaction in terms of usefulness, feasibility, and comprehensiveness of the SHARP tool. Conclusions: SHARP is an effective and efficient means of improving performance feedback in the operating room. 
Its routine use should be promoted to optimize workplace-based learning and foster a positive culture of debriefing and performance improvement within surgery.
Article
Objective: To identify the features of effective debriefing and to use this to develop and validate a tool for assessing such debriefings. Introduction: Simulation-based training has become an accepted means of surgical skill acquisition. A key component of this is debriefing, yet there is a paucity of research to guide best practice. Methods: Phase 1: Identification of best practice and tool development. A search of the Medline, Embase, PsycINFO, and ERIC databases identified current evidence on debriefing. End-user input was obtained through 33 semistructured interviews conducted with surgeons (n = 18) and other operating room personnel (n = 15) from 3 continents (UK, USA, Australia) using standardized qualitative methodology. An expert panel (n = 7) combined the data to create the Objective Structured Assessment of Debriefing (OSAD) tool. Phase 2: Psychometric testing. OSAD was tested for feasibility, reliability, and validity by 2 independent assessors who rated 20 debriefings following high-fidelity simulations. Results: Phase 1: 28 reports on debriefing were retrieved from the literature. Key components of an effective debriefing identified from these reports and the 33 interviews included: approach to debriefing, learning environment, learner engagement, reaction, reflection, analysis, diagnosis of strengths and areas for improvement, and application to clinical practice. Phase 2: OSAD was feasible, reliable [inter-rater ICC (intraclass correlation coefficient) = 0.88, test-retest ICC = 0.90], and face and content valid (content validity index = 0.94). Conclusions: OSAD provides an evidence-based, end-user informed approach to debriefing in surgery. By quantifying the quality of a debriefing, OSAD has the potential to identify areas for improving practice and to optimize learning during simulation-based training.
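The interrater reliability figures above (ICC = 0.88 with 2 raters scoring 20 debriefings) are intraclass correlation coefficients. As a minimal sketch, the one-way random-effects form ICC(1,1) can be computed from an ANOVA decomposition; note this is illustrative only, since the abstract does not specify which ICC model the authors used, and the `icc_oneway` helper and sample ratings below are assumptions for demonstration.

```python
def icc_oneway(ratings):
    """One-way random-effects ICC(1,1) for k raters scoring n subjects.

    ratings: list of per-subject lists, one score per rater.
    """
    n, k = len(ratings), len(ratings[0])
    grand = sum(sum(r) for r in ratings) / (n * k)
    subj_means = [sum(r) / k for r in ratings]
    # Between-subject and within-subject mean squares (one-way ANOVA).
    ms_between = k * sum((m - grand) ** 2 for m in subj_means) / (n - 1)
    ms_within = sum((x - m) ** 2
                    for r, m in zip(ratings, subj_means)
                    for x in r) / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

# Two raters in perfect agreement across three debriefings give ICC = 1.0;
# systematic disagreement pulls the coefficient down.
print(icc_oneway([[1, 1], [2, 2], [3, 3]]))
```

Other ICC forms (two-way models, absolute agreement vs. consistency) yield different values on the same data, which is why reporting the model alongside the coefficient matters.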
Article
Debriefing is a critical component of effective simulation-based medical education. The optimal format in which to conduct debriefing is unknown. The use of video review has been promoted as a means of enhancing debriefing, and video-assisted debriefing is widely used in simulation training. Few empirical studies have evaluated the impact of video-assisted debriefing, and the results of those studies have been mixed. The objective of this study was to compare the effectiveness of video-assisted debriefing to oral debriefing alone at improving performance in neonatal resuscitation. Thirty residents, divided into 15 teams of 2 members each, participated in the study. Each team completed a series of 3 neonatal resuscitation simulations. Each simulation was followed by a facilitated debriefing. Teams were randomly assigned to receive either oral debriefing alone or video-assisted debriefing after each simulation. Objective measures of performance and times to complete critical tasks in resuscitation were evaluated by blinded video review on the first (pretest) and the third (posttest) simulations using a previously validated tool. Overall neonatal resuscitation performance scores improved in both groups [mean (SD), 83% (14%) for oral pretest vs. 91% (7%) for oral posttest (P = 0.005); 81% (16%) for video pretest vs. 93% (10%) for video posttest (P < 0.001)]. There was no difference in performance scores between the 2 groups on either the pretest or posttest [overall posttest scores, 91.3% for oral vs. 93.4% for video (P = 0.59)]. Times to complete the critical tasks of resuscitation also did not differ significantly between the 2 study groups. The educational effect of the video-assisted debriefing versus oral debriefing alone was small (d = 0.08). Using this study design, we failed to show a significant educational benefit of video-assisted debriefing. 
Although our results suggest that the use of video-assisted debriefing may not offer a significant advantage over oral debriefing alone, the reasons for this finding remain unclear. Further research is needed to define the optimal role of video review during simulation debriefing in neonatal resuscitation.
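The small educational effect reported above (d = 0.08) is a standardized mean difference. As a sketch, Cohen's d with a pooled standard deviation can be computed as follows; the `cohens_d` helper and the sample inputs are illustrative assumptions, since the abstract does not state exactly how the authors derived their value (e.g., whether gain scores were used).

```python
import math

def cohens_d(m1, s1, n1, m2, s2, n2):
    """Cohen's d for two independent groups, using the pooled SD."""
    pooled = math.sqrt(((n1 - 1) * s1 ** 2 + (n2 - 1) * s2 ** 2)
                       / (n1 + n2 - 2))
    return (m1 - m2) / pooled

# Illustrative inputs only (not the study's actual calculation):
# two groups with nearly identical means yield a small effect size.
print(cohens_d(93.4, 10.0, 15, 91.3, 7.0, 15))
```

By convention, d around 0.2 is considered small, 0.5 medium, and 0.8 large, which is why an effect of 0.08 is interpreted as negligible.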
Article
Objective: To describe the history of the Pediatric Advanced Life Support course and outline the new developments in instructor training that will impact the way debriefing is conducted during Pediatric Advanced Life Support courses. Outline: The Pediatric Advanced Life Support course, first released by the American Heart Association in 1988, has seen substantial growth and change over the past few decades. Over that time, Pediatric Advanced Life Support has become the standard for resuscitation training for pediatric healthcare providers in North America. The incorporation of high-fidelity simulation-based learning into the most recent version of Pediatric Advanced Life Support has helped to enhance the realism of scenarios and cases, but has also placed more emphasis on the importance of post-scenario debriefing. We developed two new resources: an online debriefing module designed to introduce a new model of debriefing and a debriefing tool for real-time use during Pediatric Advanced Life Support courses, to enhance and standardize the quality of debriefing by Pediatric Advanced Life Support instructors. In this article, we review the history of Pediatric Advanced Life Support and Pediatric Advanced Life Support instructor training and discuss the development and implementation of the new debriefing module and debriefing tool for Pediatric Advanced Life Support instructors. Conclusion: The incorporation of the debriefing module and debriefing tool into the 2011 Pediatric Advanced Life Support instructor materials will help both new and existing Pediatric Advanced Life Support instructors develop and enhance their debriefing skills with the intention of improving the acquisition of knowledge and skills for Pediatric Advanced Life Support students.
Article
Debriefing is critical to learning from simulation experiences, yet the literature reports little research describing best practices within nursing. Debriefing for Meaningful Learning (DML) is a systematic process for debriefing in which teachers and students explicate different aspects of reflection and generate new meanings from simulation experiences. The purpose of this exploratory, quasi-experimental, pretest-posttest study was to test the effect of DML on the development of clinical reasoning skills in prelicensure nursing students when compared with customary debriefing strategies, and on students' perception of the quality of the debriefing experience. Analysis of the data demonstrated a greater change in clinical reasoning skills, identification of higher-quality debriefing, and a positive correlation between clinical reasoning and perception of quality. Findings demonstrate that DML is an effective debriefing method. It contributes to the body of knowledge supporting the use of debriefing in simulation learning and supports the development of best teaching practices.
Article
Debriefing is a process involving the active participation of learners, guided by a facilitator or instructor whose primary goal is to identify and close gaps in knowledge and skills. A review of existing research and a process for identifying future opportunities was undertaken. A selective critical review of the literature on debriefing in simulation-based education was done. An iterative process of analysis, gathering input from audience participants, and consensus-based synthesis was conducted. Research is sparse and limited in presentation for all important topic areas where debriefing is a primary variable. The importance of a format for reporting data on debriefing in a research context was realized, and a "who, when, where, what, why" approach was proposed. Also, a graphical representation of the characteristics of debriefing studies was developed (Sim-PICO) to help guide simulation researchers in appropriate experimental design and reporting. A few areas of debriefing practice with obvious gaps deserving study were identified, such as comparing debriefing techniques, comparing trained versus untrained debriefers, and comparing the effect of different debriefing venues and times. A model for publication of research data was developed and presented, which should help researchers clarify methodology in future work.
Article
Feedback is an important feature of simulation-based education. This study investigated the optimal timing of feedback for technical skills learning in novices. Thirty novice endoscopists were pretested on a colonoscopy simulator task. Participants then received feedback either during (concurrent) or after (terminal) each of their 12 practice trials. Effectiveness of training was assessed using an immediate posttest and one week later on retention and transfer tests. Measures included execution time and blinded expert assessments. Both groups performed similarly on the pre-, post-, and retention tests. At transfer, the terminal feedback group performed significantly better as measured by execution time, checklist, and global rating scores. The concurrent feedback group's performance decreased significantly on the transfer test as compared with the posttest and retention test. Not all feedback conditions seem equally effective. The use of terminal feedback resulted in better learning as demonstrated by superior performance during transfer.
Article
A key portion of medical simulation is self-reflection and instruction during a debriefing session; however, there have been surprisingly few direct comparisons of various approaches. The objective of this study was to compare two styles of managing a simulation session: postsimulation debriefing versus in-simulation debriefing. One hundred sixty-one students were randomly assigned to receive either postsimulation debriefing or in-simulation debriefing. Retrospective pre-post assessment was made through a survey using Likert-scale questions assessing students' self-reported confidence and knowledge level as they relate to medical resuscitation, along with statements related to the simulation itself. There were statistically significant differences in the reliable self-reported results between the two groups for effectiveness of the debriefing style, debriefing leading to effective learning, and the debriefing helping them to understand the correct and incorrect actions, with the group that received postsimulation debriefing ranking all these measures higher. Both groups showed significantly higher posttest scores compared with their pretest scores for individual and overall measures. Students felt that a simulation experience followed by a debriefing session helped them learn more effectively, better understand the correct and incorrect actions, and was overall more effective compared with in-simulation debriefing.