Comparison of simulation debriefs with traditional needs assessment methods: a qualitative exploratory study in a critical care community setting
Aimee J Sarti,1,2 Rola Ajjawi,3 Stephanie Sutherland,1 Angele Landriault,2
John Kim,1 Pierre Cardinal1,2
To cite: SartiAJ, AjjawiR,
SutherlandS, etal.
Comparison of simulation
debriefs with traditional
needs assessment methods:
a qualitative exploratory
study in a critical care
community setting. BMJ Open
2018;8:e020570. doi:10.1136/
bmjopen-2017-020570
Prepublication history and additional material for this paper are available online. To view these files, please visit the journal online (http://dx.doi.org/10.1136/bmjopen-2017-020570).
Received 10 November 2017
Revised 5 September 2018
Accepted 10 September 2018
1Department of Critical Care,
The Ottawa Hospital, General
Campus, Ottawa, Ontario,
Canada
2Practice, Performance and
Innovation (PPI) Unit, Royal
College of Physicians and
Surgeons of Canada (RCPSC),
Ottawa, Ontario, Canada
3Centre for Research in
Assessment and Digital
Learning, Deakin University,
Geelong, Victoria, Australia
Correspondence to Dr Aimee J Sarti; asarti@toh.on.ca
Research
© Author(s) (or their
employer(s)) 2018. Re-use
permitted under CC BY-NC. No
commercial re-use. See rights
and permissions. Published by
BMJ.
ABSTRACT
Objective To better understand the potential of a needs assessment approach using qualitative data from manikin-based and virtual patient simulation debriefing sessions compared with traditional data collection methods (ie, focus groups and interviews).
Design Original data from simulation debrief sessions were compared and contrasted with data from an earlier assessment of critical care needs in a community setting (using focus groups and interviews), thus undertaking secondary analysis of data. Time and cost data were also examined. Debrief sessions were coded using deductive and inductive techniques. Matrices were used to explore the commonalities, differences and emergent findings across the methods.
Setting Critical care unit in a community hospital setting.
Results Interviews and focus groups yielded 684 and 647 min of audio-recordings, respectively. The manikin-based debrief recordings averaged 22 min (total=130 min) and virtual patient debrief recordings averaged 31 min (total=186 min). The approximate cost for the interviews and focus groups was $13 560, for manikin-based simulation debriefs was $4030 and for the virtual patient debriefs was $3475. Fifteen of 20 total themes were common across the simulation debriefs and interview/focus group data. Simulation-specific themes were identified, including fidelity (environment, equipment and psychological) and the multiple roles of the simulation instructor (educative, promoting reflection and assessing needs).
Conclusions Given current fiscal realities, the dual benefit of being educative and identifying needs is appealing. While simulation is an innovative method to conduct needs assessments, it is important to recognise that there are trade-offs with the selection of methods.
INTRODUCTION
Calls for innovative strategies in conducting
needs assessments (NAs) have been made
in the medical literature over an extended
period of time.1–5 A NA is a systematic process
to collect and analyse information on a target
group’s needs (ie, examine gaps between
current and desired situations).6 Simulation
holds potential as a NA method to promote
a better understanding of these gaps given
that it aims ‘to develop an environment that
enables the learner to perform naturally
to gain insight into the complexity of the
actual workplace’7 (p. 59). Prior research
has demonstrated that simulation permits
trainees to live through a realistic experi-
ence, make mistakes in a safe environment
and practice before they actually perform on
real people.8 9 Similarly, medical educators also find simulation experiences to be stimulating and realistic, providing opportunities for the integration of basic clinical teaching with advanced problem solving, especially given the opportunities to reflect on the case after the simulation scenario.8 Through a
process of experiential learning and delib-
erate practice, the use of simulation in health
professionals’ education has been shown to
consistently improve the acquisition of knowl-
edge, skills and behaviours.10 11 However,
there is a paucity of literature on the role of
simulation in performing NAs, including the
use of simulation to determine system and/
or institutional level gaps for change manage-
ment. In addition, there is a general lack of qualitative simulation studies in medical education that compare simulation to more traditional qualitative methods.12–14

Strengths and limitations of this study
- Simulation is an innovative methodology to undertake needs assessments.
- Using simulation permits the development of an environment that enables the learner to perform naturally and gain insight into the complexity of the actual workplace.
- Study adds to the relative dearth of qualitative work in simulation and medical education.
- Study sample is relatively small and is performed at a single centre.
- Cross-sectional nature of the study does not permit generalisations.
Recognition and care of critically ill patients in community settings are complex, requiring skilled staff and
optimal use of resources at the site, plus a coordinated
system for interaction with and transfer to the referral
centre when needed. In 2006, the Critical Care Strategy
was announced by the Ministry of Health and Long-Term
Care of Ontario, Canada. The purpose of this ongoing
initiative is to improve access, quality and system integra-
tion to ensure all citizens of Ontario have equal access to
high-quality critical care. In keeping with this mandate,
a comprehensive NA was completed by members of the
current research team, which identified gaps in caring
for critically ill patients at a single community hospital.15
These results provided insights into the needs of a
community to optimise care of its critically ill patients, as
well as suggestions for how a referral hospital may best
support its community site. However, the cost and time required to complete this study were substantial, and the
process requires streamlining in order to be feasible to
implement across numerous sites.
This earlier study included interviews, focus groups,
manikin-based simulation (MBS) and virtual patient
simulation (VPS), questionnaires and a family survey.
Following each of the MBS and VPS, 20 min debrief
sessions were held and were video-recorded. These debrief sessions were not included in the comprehensive NA; rather, they were included as normal pedagogical practice, providing feedback to simulation participants and facilitating the development of their reflective skills.16 However, on reviewing the
recordings, it was notable that many of the same themes
that were discussed in the larger NA were also identi-
fied by participants in these debriefs. This serendipitous
finding suggested that simulation debriefs could be of
value as data for NA either alongside or instead of tradi-
tional approaches. The overarching guiding research
questions included: (1) how do the needs identified
through simulation compare with those identified using
traditional methods of NA data collection?, (2) can
similar data be captured more efficiently in the simula-
tion debrief session compared with lengthier traditional
methods? and (3) what are the strengths and limitations
of using simulation in NA?
Specifically, this study aims to better understand the
potential of a NA approach using qualitative data from
MBS and VPS debriefing sessions to explore the system,
team and individual level needs in caring for critically ill
patients in a community context, compared with tradi-
tional methods (ie, focus groups and interviews). We also
aimed to compare feasibility in terms of time and cost.
METHODS
Secondary analysis has been recognised as an important
yet underused research approach.16 It has been defined
as the reanalysis of an existing data set, which may be used
to investigate new research questions or verify previous
research findings.17 18 Using an exploratory qualitative
design, this current research used original data that were
compared and contrasted from simulation debriefs with
data from the earlier assessment of critical care needs in
a community setting, enabling exploration of the current
research question from our existing data.
Patient and public involvement
By employing a patient-centred approach to research, the research questions and outcome measures were framed with patient outcomes in mind. That is, by understanding feasible and perhaps more timely approaches to conducting NAs, earlier interventions can be implemented to facilitate patient care. It should be noted that neither patients nor patient advisors were involved in the recruitment or conduct of this study. Presentations at hospital
medical rounds and continuing professional develop-
ment sessions are the primary mechanisms to disseminate
results to study participants.
Design and analysis
Original study data collection and analysis
The original mixed-method study was conducted between
June 2011 and February 2012. A conceptual framework,
centred on the critically ill patient, guided the design and
selection of the data collection instruments. Different
perspectives sampled included regional leaders, health-
care professionals at the community and its referral
hospital, as well as family members of patients who had
received care at the community intensive care unit. Inter-
views and focus groups were designed to follow a semi-
structured, broad, predetermined line of inquiry that was flexible, permitting exploration of themes. Data from each
interview and focus group were transcribed and entered
into NVivo software, and inductive coding techniques
were applied as informed by Creswell’s thematic analysis
approach.19 The constant comparative method was used
as data were analysed.18 Full information regarding the
original study can be found in Sarti et al.15
Simulation
Simulations were conducted at the community hospital to obtain data on human and social capital at the site, including interdisciplinary team functioning, crisis resource management and critical care knowledge and skills.10 20 21 The simulation component of the NA
consisted of two forms of simulation: MBS (eg, SimMan)
and VPS (eg, interactive video with patient actors), each
followed by debriefing sessions using an expert facilitator
engaging participants in reflective and focused discussion
on a particular scenario while simultaneously providing
teaching.10 16 21–23 To maximise participants’ exposure
to the various cases, each team completed two MBS
and two VPS sessions. Canadian experts in critical care
designed the scenarios to represent prototypical clinical
encounters. These scenarios were originally developed
for residents in Canada, with the Acute Critical Events
Simulation course. The scenarios, which included cases
of impending respiratory failure, shock, sepsis and
arrhythmias, were reviewed by an interdisciplinary panel,
modified to reflect the realities of practice in the commu-
nity hospital and were video-recorded. To assess perfor-
mance during simulation, custom task checklists and two
validated global rating scales were completed.22 23 Only
quantitative data from the simulations were included
in the original NA,15 given that debriefs have not been
described as NA tools.
The MBS and VPS scenarios were each followed by a
20 min debrief session, which were video-recorded. The
debriefs were designed to establish an engaging and
supportive learning environment, promote facilitated
reflection and discussion, explore performance gaps
and provide feedback to the participants with respect
to the scenarios.24 Facilitators used a blended approach,
including focused facilitation to encourage critical reflec-
tion and deeper understanding of events and also to
provide information through directed performance feed-
back and teaching.25 In addition to the standard learn-
er-centred debriefing, participants were encouraged to
discuss their practice context and reality.
Time and cost analysis
Time for each of the data collection methods (interviews, focus groups and debriefs) was captured from audio files.
Data on the financial costs were captured in budgets and
expenditure tracking documents, including equipment,
travel expenses and hourly salary rates. MBS-specific costs included manikin rental and a rental van for transportation.
Both MBS and VPS required use of computer programs,
a simulation instructor and technologist. Travel was
required for both forms of simulation and focus groups.
The interviews from the earlier study were held via tele-
phone. The debriefs, interviews and focus groups all
required a facilitator, audio recorder, transcriptionist,
researcher and research assistant to perform coding and
thematic analysis. Investment costs for initial implemen-
tation of a simulation programme, annual operational
maintenance and replacement expenses were not consid-
ered. Time and cost to prepare the interview/focus group
guides and simulation cases were not included in the
analysis, as there were not enough data available to estimate these accurately.
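To make the costing rules concrete, the per-hour rates used in this analysis (reported in the footnotes of table 2) can be expressed as a simple cost model. The following Python sketch is illustrative only; the function name and structure are ours and were not part of the study's analysis:

```python
# Illustrative model of the NA-specific (debrief) costs, using the rates
# reported in the footnotes of table 2 (all amounts in Canadian dollars).

def na_specific_cost(data_hours: float, facilitator_rate: float,
                     transcription_hours_per_hour: float, travel: float) -> float:
    """Return the NA-specific cost for `data_hours` of audio recordings."""
    facilitator = data_hours * facilitator_rate
    transcription = data_hours * transcription_hours_per_hour * 20  # $20/hour
    nvivo_entry = data_hours * 2 * 35     # 2 hours of entry per data hour at $35/hour
    analysis = data_hours * 2.5 * 2 * 80  # 2 researchers, 2.5 hours each per data hour, at $80/hour
    return facilitator + transcription + nvivo_entry + analysis + travel

# MBS debriefs: 2.2 data hours, simulation facilitator at $80/hour,
# 4 transcription hours per data hour, one site visit at $120.
print(na_specific_cost(2.2, 80, 4, 120))  # 1506.0, matching the MBS subtotal in table 2
```

Applying the same model to the 3.1 data hours of VPS debriefs reproduces the $2073 subtotal in table 2.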
Secondary data analysis
Data analysis comprised secondary thematic analysis
and comparative analysis.17 18 Comparative analysis was
required to compare and contrast the data from the
earlier study with the MBS and VPS debriefs.
Thematic analysis of the debriefs was performed.26–28
Transcripts were entered into NVivo software. Codes
identified in previous work/inquiry were applied to the
data.19 To enhance study rigour, multiple coders coded the transcripts, including two researchers who were involved with coding in the original NA (AJS and SS) and one researcher who was not involved with coding in the original study (RA). Researchers actively searched for disconfirming data and for additional codes; inductive and deductive approaches were used. Themes and their definitions were decided through researcher discussion and negotiation. Qualitative data from the simulation debriefs were contrasted with the qualitative data obtained in the earlier NA (focus groups
and interviews). The final analytic component included
reading through all the transcripts in each data collection
modality (traditional, VPS and MBS) so as to selectively
identify areas of convergence and divergence in both the
content and structure of the transcript per data collection
method.29
Study rigour
Multiple strategies were employed to minimise threats
to the validity/credibility of the study. Efforts were made
to search for disconfirming evidence through the use of
purposive sampling, with the selection of participants to
provide a balanced representation of the collective group,
including potential differences of opinion. Two forms
of triangulation were employed to achieve a balanced
perspective and enhance the reliability of the conclu-
sions: (1) data source triangulation (using multiple data
sources and informants) and (2) investigator triangula-
tion (using more than one person to collect, analyse and
interpret data).
RESULTS
Participants
There were 31 participants in the focus groups (13 from the community hospital, 11 from the referral hospital and 7 in an interhospital focus group; this included 12 physicians, 14 nurses and 5 respiratory therapists (RTs)) and 22 participants in the interviews (2 regional leaders, 7 community hospital leaders and 13 referral hospital leaders). In the simulations, there were 13 participants from the community hospital (six physicians, six nurses and one RT) who formed six teams (see table 1).
Time and cost analysis
The 22 interviews (average 31 min; range 15–48 min)
and 6 focus groups (average 108 min; range 57–154 min)
yielded 684 min and 647 min of audio recordings, for
a total of 1331 min. The MBS debriefs averaged 22 min
(range 17–30 min; total=130 min), and VPS debriefs
averaged 31 min (range 25–48 min; total=186 min). The
results of the cost analysis are displayed in table 2. The total cost for interviews and focus groups was approximately $13 560, for MBS debriefs was $4030 and for VPS debriefs was $3475.
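As a consistency check, the relative time and cost differences discussed later can be recomputed directly from these figures (a minimal sketch using only the numbers reported above and in table 2):

```python
# Consistency check of the reported time and cost figures.

# Audio-recording time (minutes)
traditional_min = 684 + 647   # interviews + focus groups = 1331 min
debrief_min = 130 + 186       # MBS + VPS debriefs = 316 min
time_saving = 1 - debrief_min / traditional_min

# NA-specific costs (Canadian dollars, table 2 subtotals)
traditional_cost = 13_560     # interviews/focus groups
debrief_cost = 2_073 + 1_506  # VPS + MBS debrief-specific subtotals
cost_saving = 1 - debrief_cost / traditional_cost

print(f"Debrief recording time: {debrief_min} min "
      f"({time_saving:.1%} less than traditional methods)")    # 76.3%
print(f"NA-specific debrief cost: ${debrief_cost:,} "
      f"({cost_saving:.1%} less than traditional methods)")    # 73.6%
```

These values correspond to the approximately 76% time and 73% cost reductions cited in the Discussion.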
Comparative analysis
Data from VPS and MBS debriefs contributed to 15 of 20
total themes compared with the earlier study (see online
supplement A). When comparing the top five themes
in terms of highest frequency, two themes consistently
appear across all three data collection modalities: knowl-
edge, skills and abilities (KSAs) (NA interviews and focus
groups: n=104, MBS: n=53, VPS: n=127), and solutions
(NA interviews and focus groups: n=193, MBS: n=28,
VPS: n=57). Similarly, when comparing the five themes
with the lowest frequency counts, two themes appear
across all data collection modalities: leadership (NA
interviews and focus groups: n=23, MBS: n=6, VPS: n=10)
and night/weekend (NA interviews and focus groups:
n=48, MBS: n=5, VPS: n=27). Themes not identified in either form of simulation debrief included: palliative/end-of-life care; patients postreferral hospital; lack of understanding; vision; and family and patient thoughts.
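For illustration, these counts can be arranged in the kind of theme-by-method matrix used in the analysis. The sketch below is restricted to the four themes whose counts are reported in this paragraph; the full matrix across all 20 themes appears in online supplement A:

```python
# Theme-by-method frequency counts, as reported above.
# Columns: NA interviews/focus groups, MBS debriefs, VPS debriefs.
theme_counts = {
    "Knowledge, skills and abilities": (104, 53, 127),
    "Solutions": (193, 28, 57),
    "Leadership": (23, 6, 10),
    "Night/weekend": (48, 5, 27),
}

print(f"{'Theme':<34}{'Interviews/FGs':>15}{'MBS':>6}{'VPS':>6}")
for theme, (na, mbs, vps) in theme_counts.items():
    print(f"{theme:<34}{na:>15}{mbs:>6}{vps:>6}")
```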
A descriptive matrix with the themes and representa-
tive quotes from the various data collection methods is
presented in online supplement B. In general, for the
themes common to both interviews/focus groups and
simulation debriefs, similar high-level needs were identi-
fied, and similar overarching conclusions could be drawn
from the simulation debriefs compared with the earlier
NA. However, more descriptive data were obtained with the earlier NA, whereas the data from the simulation debriefs were more direct and to the point.
As an exemplar, KSA was identified across all methods.
A key gap identified within this theme was the manage-
ment of respiratory failure and ventilation. This gap
was identified in the interviews, focus groups and simu-
lation debriefs. Key issues identified within this topic, in both the earlier study and the simulation debriefs, included basic and difficult ventilation strategies, troubleshooting and managing status asthmaticus. Weaning and lung
protective strategies specifically were only identified in
the interviews and focus groups. Both the earlier study
and simulation debriefs identified system-level gaps that
contributed to this need, including the need for 24-hour
RT coverage. While this need was identified in the simulation debriefs, a greater depth of data surrounding the nature and impact of the gap/lack of 24-hour coverage emerged during the focus groups. In the following focus
group, participants discussed challenges of weaning
patients:
We’ve been wanting to put patients on APRV at night
and it makes it difficult because as they improve their
volumes are going to get larger and it’s something
that you really have to watch on the vent, and the
nurses don’t. They’ll watch but they don’t really un-
derstand as much as what we do, the doctors have no
idea, it’s just really us. We’re leery sometimes to put
somebody on bi-level APRV, whatever you want to
call it, because we’re not here 24 hours to watch the
whole process happen.
The main themes identified from the simulation (not
found in the interview/focus group data) were related
to the fidelity of the simulation (environmental, equip-
ment and psychological) and the role of the simulation
instructor in teaching and promotion of reflection (see
online supplement C). In addition, the theme of inter-
ruption was identified only in the MBS debriefs, which
occurred when the facilitator interrupted a participant to
provide teaching/impart knowledge.
In some instances, lower fidelity led to the discovery of
gaps in practice. In the following example, the creation of
an ‘unreal’ environment led to the discovery of a system-
level gap. In this situation, the participant highlighted that receiving blood work quickly in the MBS did not match their reality, which may impact patient care:
The blood work is too long in [the community hos-
pital]. It’s horrible. Like you can do a code for an
hour and you won’t even know your potassium, your
calcium, or your CBC; it’s just a disaster.
The role of the facilitator was coded as producing several themes that emerged only within the MBS and VPS
lation facilitators carried out multiple roles. Two codes
(promoting reflection and teaching) were evident in the
educative roles the facilitator played. That is, the facili-
tator served to further engage the learners in the simu-
lated scenario by promoting reflection through reflective
cues. We defined reflection as the ‘process of learning
through and from experience towards gaining new
insights of self and/or practice’30 (p. 1). The following
is an example of the facilitator providing reflective cues
linking learning to the experience:
Table 1 Participant demographics

Earlier comprehensive NA
Interviews (total=22):
  Regional leaders: 2
  Community hospital leaders: 7
  Referral hospital leaders: 13
Focus groups (total=31):
  Community hospital: 6 MDs, 6 RNs and 1 RT
  Referral hospital: 4 MDs, 5 RNs and 2 RTs
  Interhospital: 2 MDs, 3 RNs and 2 RTs

Simulation debriefs
Manikin-based simulations (MBS) (total=13: 6 MDs, 6 RNs and 1 RT):
  Community hospital: 6 teams (1 MD, 1 RN ± RT per team); each team performed two MBS cases
Virtual patient simulations (VPS) (total=13: 6 MDs, 6 RNs and 1 RT):
  Community hospital: 6 teams (1 MD, 1 RN ± RT per team*); each team performed two VPS cases**

*One VPS was completed by a physician alone (no other team member).
**One team completed only one of the two VPS cases.
MD, physician; RN, nurse; RT, respiratory therapist.
Facilitator: So that was an issue that was brought up
by a couple of other nurses, not having an RT and
not having ventilation. Having regular ventilation
control, do you agree with that or do you have a dif-
ferent opinion?
Participant: I think there should be an RT 24/24 in
this hospital.
Also, the teaching code was evident throughout both
MBS and VPS. These educative remarks/exchanges were
designed by the facilitator to provide information to the
participants to impart knowledge rather than cuing the
participants to reflect specifically on their experience.
Facilitator: The only thing I point out to you is that
sometimes we like to choose the gentler sedatives,
but they’re going to need sedation then they just may
need more adequate haemodynamic support as well.
Finally, a code that only appeared in MBS data was
one called ‘interruption’. This code highlighted the
conflicting roles of ‘educator’ and ‘researcher’. During
the simulation debriefs, at times the facilitator would
interrupt the participants to provide education. In the
following example, the participant starts to discuss a
potential need to have an oscillator (ie, a specialised ventilator).
Table 2 Cost comparison across the data collection tools

Item | Interviews/focus groups (FGs) | Virtual patient simulations (VPS) | High-fidelity simulations (MBS)

Costs of running the simulations*
Rental van (bringing equipment to site) | N/A | N/A | $550
Facility rates† | N/A | No charge | No charge
Manikin daily rental fee | N/A | N/A | $500
Computer software program | N/A | $0 (newly developed software program licencing fee) | $0 (software program owned)
Needles/gauze/syringes and so on for MBS | N/A | N/A | No additional charge; reusable materials
Simulation instructor‡ | N/A | $1002 ($1250−$248, total daily cost minus debrief) | $1074 ($1250−$176, total daily cost minus debrief)
Technologist§ | N/A | $400 | $400
Subtotal | N/A | $1402 | $2524

Costs specifically required for the NA/debrief*
Facilitator | $1332 (22.2 hours × $60/hour) | $248 (3.1 hours × $80/hour) | $176 (2.2 hours × $80/hour)
Travel to the site¶ | $360 ($120 × 3 visits to the site for focus groups) | $120 | $120
Audio recorder | No additional expense (about $250 if one must be purchased) | No additional expense | No additional expense
Transcription** | $1434 (interviews: 11.4 data hours × 2.5 transcription hours per data hour × $20/hour = $570; FGs: 10.8 data hours × 4 transcription hours per data hour × $20/hour = $864) | $248 (3.1 hours × 4 × $20) | $176 (2.2 hours × 4 × $20)
NVivo data entry†† | $1554 (22.2 hours × $35 × 2) | $217 (3.1 hours × $35 × 2) | $154 (2.2 hours × $35 × 2)
Data analysis (coding and thematic analysis)‡‡ | $8880 (22.2 data hours × 2 researchers at $80/hour for 2.5 hours per hour of collected data) | $1240 (3.1 data hours × 2 researchers at $80/hour for 2.5 hours per hour of collected data) | $880 (2.2 data hours × 2 researchers at $80/hour for 2.5 hours per hour of collected data)
Subtotal | $13 560 | $2073 | $1506
Total | $13 560 | $3475 | $4030

*All funds are reported in Canadian dollars.
†Facility rates at this site were not charged; typical rental costs are between $200 and $300 per hour.
‡Cost assumes access to a trained instructor; instructor training would be an additional cost. The daily cost for a simulation instructor is $1250; the cost of the debrief sessions has been separated out in this table.
§Cost assumes access to a trained technologist; training would be an additional cost.
¶Land travel at $0.54/km. Travel was required for the simulations and FGs (interviews were via telephone).
**Transcription costs: one-to-one interviews assume 2.5 hours of transcription per 1 hour of recording; focus groups and simulation debriefs assume 4 hours per 1 hour of recording. The transcriptionist rate is $20 per hour.
††NVivo data entry: research assistant salary of $35 per hour; assumes 2 hours required per hour of data.
‡‡Data analysis assumes a researcher salary of $80 per hour and two researchers coding, each for approximately 2.5 hours per hour of data collected.
The instructor interrupts the flow of the simulation debrief with directed questioning to provide education that this would not be required in their setting:
Participant: And we don’t have an oscillator if we tru-
ly needed one and we don’t…
Facilitator: Do you think you need an oscillator?
Participant: No, absolutely not.
In contrast, in the following quote, a focus group partic-
ipant describes wanting to have the resource and skills to
place Swan-Ganz catheters (a procedure not widely used
in tertiary critical care). In this instance, the moderator does not provide education, as is typical in interviews/focus groups, but rather summarises and continues to probe to ensure understanding of the needs. In this situation, the participants leave with the same perspective, that is, that this is perceived as being a priority.
Participant: We are not utilizing for example using
Swan-Ganz… I tried to put Swan-Ganz for some of my
patients that I thought they need it but then most of
the nurses said, well last time we had it was 10 years
ago, lost experience with that and we don’t have the
modalities… Maybe that will give the nurses more
confidence when they do it more frequent.
Facilitator: So is that ongoing education of the nurs-
ing staff…
Participant: Absolutely, because that’s what the ICU
needs.
A comparison of the three different data collection methods (traditional, VPS and MBS) is displayed in table 3.
Table 3 Multimodal comparative data display

Observation/notation | Traditional interviews/focus groups (FGs) | VPS | MBS
Skill level of facilitator | Moderate | High | Extremely high
Time (average duration) | Interviews: 31 min; FGs: 108 min | 31 min | 22 min
Structure | Inquiry involves continuous questioning and answers. | Multiple cases, with a structure of playing part of a case, stopping to debrief/discuss, playing more of the case, stopping to debrief/discuss, and so on. | Two cases in 15 min with a 5/10 min structure, that is, 5 min devoted to what the participants thought about the scenario (did they like it, was it realistic and so on), then 10 min to reflect on the case regarding their own practice realities.
Variation in reflection | Reflect on past experience. | Serves as a prompt to reflection on reality (not focused on the VPS case). | Immediacy of reflection tightly coupled to the simulation scenario, thus creating a platform for: (1) reflection in/on simulation and (2) reflection on reality.
Educative purpose | Low | High | High
Roles of the facilitator | Single role: researcher/needs assessor. | Triple role: (1) teaching (education), (2) reflection and (3) researcher/needs assessor. | Triple role: (1) teaching (education), (2) reflection and (3) researcher/needs assessor.
Trade-offs with various roles of the moderator/facilitator | Not applicable. | Triple role = more potential for impact. | Triple role = more potential for impact; that is, teaching and interrupting may lead to less data collected for the research purpose (ie, identifying needs).
Uncovering system level barriers | Requires a lot of time and perhaps multiple lines of questioning and/or interviews. | Moderate ability to probe system level barriers (as people tend to waver and chat around many issues; not as streamlined and direct as simulation scenarios). | Streamlined to uncover system level barriers.
Technical difficulties | No occurrence in this dataset; would be a limited possibility (eg, audio recorder failure). | 'Technical glitch' for RC Sim Team B (eg, blood gas results did not come up) and, as a result, the team had to move on. | No occurrence in this dataset, but could happen; more technical aspects, hence likely greater risk than with traditional methods.
Multiple cases at once | Not applicable. | Multiple cases. | One case per scenario.

MBS, manikin-based simulation; VPS, virtual patient simulation.
The areas of convergence, where all three data collection modalities revealed the same element (to varying degrees), included: variation in reflection and uncovering
system level barriers. Areas of divergence included: time,
structure, facilitator skill level and education (the degree to which education was 'built in' to the method). The two elements present only in the simulation data collection methods were the ability to conduct multiple cases in one session, as well as the simultaneous multiple roles played by the facilitator.
DISCUSSION
This study explored the potential use of MBS and VPS debriefs as NA tools and revealed that, under certain circumstances, debriefs may be more efficient in terms of time and cost at capturing similar needs than traditional methods of data collection (interviews/focus groups). Our investigation has also highlighted various trade-offs that exist with selecting simulation as a NA method.
Time and cost
With respect to time, the simulation debriefs yielded
a considerably shorter total length of audio recording
(76% less time than interviews/focus groups). As such,
the costs specifically required for the NA were signifi-
cantly lower for the simulations compared with the inter-
views and focus groups (73% less cost incurred). Even
when taking into consideration the total costs of running
the simulation cases before the debriefs and the debriefs
themselves, the cost remained lower due to the high cost
of transcription, NVivo data entry and data analysis with
the larger volume of data collected. It is notable that, for the cost of the simulations, multiple goals may be achieved: the observed simulation scenario performance allows for quantitative measurement of performance gaps, may serve as preintervention baseline performance data and may reveal additional unperceived performance gaps not otherwise captured in interviews, focus groups or debriefs, as demonstrated in our earlier study.15 It is
important to note that cost analysis did not include the
initial investment costs or maintenance of a simulation
programme. Hence, if there were not a programme in
place, the cost of simulation would be increased.31 The
cost of a manikin-based simulator is substantially higher than that of a virtual patient simulator,32 which is an important consideration for those considering using simulation debriefs in NAs.
Comparative analysis
Even with substantially less time spent in the simulation debriefs, the majority of the themes identified in the interviews and focus groups were also identified in the debriefs. Perhaps capturing needs is better
accomplished when participants have an experiential
and emotional encounter (possibly feeling more vulner-
able), with the discussion occurring close to the event
and promoting active participation. Theory underpin-
ning the debriefs includes facilitating the transformation
of experience into learning through reflection where
‘the ultimate goal of debriefing is for learners to reflect
on and make sense of their simulation experience and
generate meaningful learning that translates to clin-
ical practice’.25 Links between emotion and cognition
have been suggested and hence, actively experiencing
an event accompanied by intense emotions, may result
in long-lasting learning.30 33 Broadening the concept of
participation, increasingly the importance of materiality
(ie, objects and technologies) and relations (with social
and material ‘forces’) are being recognised in the liter-
ature through a sociomaterial approach to practice and
learning.34 Fenwick argues that materials, often missing
from learning accounts, cannot be ignored as they funda-
mentally impact human activity (medical practice and
knowledge), further stating that ‘any medical practice is a
collective sociomaterial enactment, not a question solely
of an individual’s skill’34 (p. 48). With this approach,
simulation provides a model setting to better understand
complex medical practice, hence allowing the opportu-
nity to identify needs at various levels (system/team/indi-
vidual) and across various complex intertwined elements
(material/social/cultural) within unique systems. As the
learners work to make sense of the simulation experience
in reference to their own world, there is the opportunity
to both identify needs and provide education. By iden-
tifying and interrupting matters that had previously felt
settled, the so-called ‘black boxes that masquerade as
matters of fact’ may be opened (p. 50).34
Although the majority of themes were identified in
the simulation debriefs (15 of 20), as compared with the
interview and focus groups, a greater depth of data was
captured through the more traditional methods. With
NAs, initial data collection may inform subsequent data
collection decisions.35 In addition, priorities must be
set, which includes identifying needs of greatest impor-
tance and most amenable to change.35 Depending on
the purpose and scope of a given NA, simulation debriefs
may stand alone or may be used to make decisions
surrounding whether more extensive data are required.
Performing simulation debriefs may also help identify
the highest priority needs and determine the initial set
of needs to be targeted, in that the needs that are most readily uncovered may be the highest priority compared with those that require more probing and questioning.
The findings highlight that not all themes identified
in the interviews and focus groups were captured in the
debriefs. More specifically, palliative and end-of-life care
was not identified in the debriefs, nor was the vision of
participants or two themes relating to the interhospital
interaction (patients’ postreferral hospital and lack of
understanding). In addition, although the theme of
patient transfers was identified across all methods, the
relative frequency and depth of data were much lower
in the debriefs compared with the interviews and focus
groups. This is an important yet not unexpected finding,
given the simulation cases were not specifically designed
to explore the areas of end-of-life care or the interaction
between the community and referral hospital, contrasted
to the traditional NA which undertook a broad line of
inquiry along with probing into various aspects of critical
care, including both end-of-life care and interhospital
interactions. The debriefs also did not include asking
participants their vision, and these data would be unlikely
to emerge independent of directed inquiry. This finding
highlights the risk of missing needs with the simulation
debriefs and demonstrates the importance of scenario
selection and development.
Trade-offs
In this investigation, multiple inter-related roles of the
simulation facilitator during the debriefs were identified,
including promoting reflection, teaching participants
and exploring gaps in practice. Despite using different
cases, online supplement C reveals that the two simu-
lation methods produced similar patterns in terms of
thematic frequency scores. That is, the highest rated simulation-specific themes were reflection and teaching. Perhaps this finding is indicative of the method, whereby education is infused upfront in simulation. In this way, a strength of simulation debriefs may be that they can act simultaneously as an education tool and a data collection modality.
Simulation debriefs focus on transformative learning
through self-reflection and may include individual and/
or social engagement.30 The simulation debriefs capitalised on the social spectrum of reflection; through critical discourse between the facilitator and participants, needs/gaps were uncovered beyond individual and team performance, extending to system level gaps. Thus, a
strength of using simulation debriefs may also include
providing a tool for assessing needs across individual, team
and system levels. Furthermore, this finding highlights the importance of working to structure the debriefs to promote deeper reflection,36 hence potentially surfacing unknown unknowns that, combined with the quantitative data (normative needs) from the simulation, offer more depth than eliciting only felt needs (known unknowns).
It is important to note that having the simulation facil-
itator act in multiple roles inevitably presents challenges
and trade-offs among these roles, which is a potential
limitation of using debrief sessions in NAs. For example, in
the traditional interviews and focus groups, the facilitator
remains ‘neutral’ and does not provide education while
they pursue questioning to better understand the needs.37
In contrast, in the simulation debriefs, the facilitator does
not remain neutral, at times interrupting the participants
to redirect and provide education, as evidenced by the
emergence of the interruption code within the MBS data.
Interruption was coded as instances whereby the facili-
tator would intentionally stop the conversation to correct
participants when they were clearly discussing inaccurate
content. When priority is given to the educative role, the
actions of the facilitator risks not allowing the partici-
pants to explore and express details surrounding their
needs. However, the educative element also promotes
engagement through a collaborative approach and partic-
ipants may leave with a better understanding and having
learnt something. Making transparent, thoughtful deci-
sions surrounding which methods to select and recog-
nising there are advantages and disadvantages to each are
fundamental to performing NAs.37–40 If debriefs are to be
more widely used in NAs, we need to better understand
the trade-offs and their impact on the NA.
In this study, very experienced master instructors facil-
itated the debriefs. The quality of the debriefs may be linked to this, in that someone of lesser experience may not have been able to uncover these gaps while providing skilled education, which potentially limits the general use of debriefs in NAs. How educators facilitate debriefings has
been shown to be highly variable.41 Debrief facilitation
also appears to be influenced by the professional back-
ground and style of the facilitators. In their exploratory
investigation, van Soeren et al12 described how some facil-
itators assumed the role of an interprofessional guide,
whereas others assumed the role of teacher, tending to
impart their knowledge. This variability in facilitation
is an important consideration for assessing needs, in
that if the facilitator were to have a style more strongly
connected with teaching, then needs may not be readily
uncovered. As simulation instructors interact with partic-
ipants in collecting data for the NA, their role must be
considered as meaning is actively coconstructed.42 In
addition, the skill level required of MBS and VPS facilitators may be different (ie, higher level/more experienced) than that of a facilitator collecting data in a more traditional qualitative manner.
Strengths of our study include highlighting the efficiency of using MBS and VPS simulation as a timely and potentially cost-efficient alternative to employing traditional methods (interviews and focus groups), although under certain assumptions (ie, the research team had access to a simulation centre with predeveloped simulation scenarios for both the MBS and VPS sessions). This
finding is interconnected to the issue of the breadth and
depth of data coverage. That is, the results of this study demonstrate similarities in the breadth of themes between traditional methods and simulation debriefs, with a notable difference in terms of depth. Undeniably, the qualitative
interviews and focus groups were able to provide more
depth and richness in the data as opposed to the simula-
tion techniques that were considerably shorter in terms of
transcript coverage. However, simulation offers the added
benefit of providing quantitative performance data that
can serve as a baseline and to triangulate with the debrief
data.
This was an exploratory study, which included secondary
analysis of an existing dataset. While secondary analysis has been recognised as an important, underused research approach, there are limitations to this method.
The quality of the secondary data analysis rests on the
quality of the existing dataset.17 It is important to high-
light that, as described, our earlier study was performed
with a rigorous methodology with numerous methods in
place to ensure high quality and credibility of our find-
ings. One concern noted in the literature is the potential problem of 'data fit'.18 In the current study, the data were not originally collected for the current research objective; however, the available data were well positioned to answer the current research questions in an exploratory manner. In addition, 'the problem of not having been there' has been cited as a concern, in that challenges exist when the secondary researcher was not involved in the original data collection.18 Limitations of this study include
the relatively small sample size and the focus on a single
centre. Furthermore, while the results are comparable
in terms of frequency of mention, they cannot be taken
as absolutely equivalent, given the qualitative approach
employed in this study. Further research is required to
better understand the utility of simulation as a NA tool,
the design features for NA and type of needs best identi-
fied using this approach. Moreover, it will be imperative that various stakeholder groups participate in each type of data collection method so as to draw more definitive conclusions.
In conclusion, this investigation provides support for
the use of simulation debriefs as a NA method to explore
needs at the system, team and individual levels. Qualitative data collected during debriefs may be a suitable substitute for the typical interviews and/or focus groups. Simulation debriefs promote a participatory, collaborative approach with the educative function built in. Given current fiscal realities, the dual benefit of being educative while identifying needs is appealing, although under certain conditions. While simulation is an innovative and effective method to conduct NAs, it is important to recognise that there are trade-offs with the selection of methods, requiring careful scenario design and debriefing.
Acknowledgements The authors are grateful to all the participants who gave their
time to assist us with this study.
Contributors AJS contributed to the study planning and conceptualisation and
led data collection, data interpretation/analysis, manuscript development and
review. RA contributed to the study planning and conceptualisation, interpretation/
data analysis, manuscript development and review. SS contributed to the study
conceptualisation, interpretation/data analysis, manuscript development and
review. AL contributed to data collection, manuscript development and review. JK
contributed to the study planning and conceptualisation, data collection, manuscript
preparation and review. PC contributed to the study planning and conceptualisation,
data collection, manuscript development and review.
Funding This study was funded by a grant from The Ottawa Hospital Academic
Medical Organization (TOHAMO).
Competing interests None declared.
Patient consent Not required.
Ethics approval Ottawa Hospital Research Ethics Board.
Provenance and peer review Not commissioned; externally peer reviewed.
Data sharing statement No additional data are available.
Open access This is an open access article distributed in accordance with
the Creative Commons Attribution Non Commercial (CC BY-NC 4.0) license,
which permits others to distribute, remix, adapt, build upon this work
non-commercially, and license their derivative works on different terms,
provided the original work is properly cited, appropriate credit is given,
any changes made indicated, and the use is non-commercial. See: http://creativecommons.org/licenses/by-nc/4.0/.
REFERENCES
1. Laxdal OE. Needs assessment in continuing medical education: a
practical guide. J Med Educ 1982;57:827–34.
2. Norman GR, Shannon SI, Marrin ML. The need for needs assessment
in continuing medical education. BMJ 2004;328:999–1001.
3. Mazmanian PE. Resources and studies are required to build
knowledge on assessment, service, and health care. J Contin Educ
Health Prof 2010;30:75–6.
4. Palinkas LA, Horwitz SM, Chamberlain P, et al. Mixed-methods
designs in mental health services research: a review. Psychiatr Serv
2011;62:255–63.
5. Gonsalves CL, Ajjawi R, Rodger M, et al. A novel approach to needs
assessment in curriculum development: going beyond consensus
methods. Med Teach 2014;36:422–9.
6. Watkins R, Meiers MW, Visser Y. A Guide to Assessing Needs.
Washington: World Bank Publications, 2012.
7. Flanagan B, Nestel D, Joseph M. Making patient safety the focus:
crisis resource management in the undergraduate curriculum. Med
Educ 2004;38:56–66.
8. Gordon JA, Wilkerson WM, Shaffer DW, et al. "Practicing" medicine
without risk: students' and educators' responses to high-fidelity
patient simulation. Acad Med 2001;76:469–72.
9. Larue C, Pepin J, Allard É. Simulation in preparation or substitution for
clinical placement: A systematic review of the literature. J Nurs Educ
Pract 2015;5:132–40.
10. Cook DA, Hatala R, Brydges R, et al. Technology-enhanced
simulation for health professions education: a systematic review and
meta-analysis. JAMA 2011;306:978–88.
11. Maran NJ, Glavin RJ. Low- to high-fidelity simulation - a continuum of medical education? Med Educ 2003;37(Suppl 1):22–8.
12. van Soeren M, Devlin-Cop S, Macmillan K, et al. Simulated
interprofessional education: an analysis of teaching and learning
processes. J Interprof Care 2011;25:434–40.
13. Barry Issenberg S, McGaghie WC, Petrusa ER, et al. Features and uses of high-fidelity medical simulations that lead to effective
learning: a BEME systematic review. Med Teach 2005;27:10–28.
14. Sanford PG. Simulation in nursing education: A review of the
research. The Qualitative Report 2010;15:1006–11.
15. Sarti AJ, Sutherland S, Landriault A, et al. Comprehensive
assessment of critical care needs in a community hospital*. Crit Care
Med 2014;42:831–40.
16. Issenberg SB, McGaghie WC, Hart IR, et al. Simulation technology
for health care professional skills training and assessment. JAMA
1999;282:861–6.
17. Sales E, Lichtenwalter S, Fevola A. Secondary analysis in social work
research education: past, present, and future promise. J Soc Work
Educ 2006;42:543–60.
18. Heaton J. Secondary analysis of qualitative data: An overview.
Historical Social Research 2008;33:33–45.
19. Creswell JW. Educational Research: Planning, Conducting, and
Evaluating Quantitative and Qualitative Research. Boston, MA:
Pearson, 2012.
20. Issenberg SB, McGaghie WC, Petrusa ER, et al. Features and uses
of high-fidelity medical simulations that lead to effective learning: a
BEME systematic review. Med Teach 2005;27:10–28.
21. Cook DA, Triola MM. Virtual patients: a critical literature review and
proposed next steps. Med Educ 2009;43:303–11.
22. Kim J, Neilipovitz D, Cardinal P, et al. A pilot study using high-fidelity
simulation to formally evaluate performance in the resuscitation of
critically ill patients: The University of Ottawa Critical Care Medicine,
High-Fidelity Simulation, and Crisis Resource Management I Study.
Crit Care Med 2006;34:2167–74.
23. Cooper S, Cant R, Porter J, et al. Rating medical emergency
teamwork performance: development of the Team Emergency
Assessment Measure (TEAM). Resuscitation 2010;81:446–52.
24. Cheng A, Donoghue A, Gilfoyle E, et al. Simulation-based crisis
resource management training for pediatric critical care medicine: a
review for instructors. Pediatr Crit Care Med 2012;13:197–203.
25. Eppich W, Cheng A. Promoting Excellence and Reflective Learning in Simulation (PEARLS): development and rationale for a blended approach to health care simulation debriefing. Simul Healthc 2015;10:106–15.
26. Pope C, Ziebland S, Mays N. Qualitative research in health care: analysing qualitative data. BMJ 2000;320:114–6.
27. Ritchie J, Spencer L. Qualitative data analysis for applied policy research. In: Bryman A, Burgess RG, eds. Analyzing Qualitative Data. London: Routledge, 1994.
28. Huberman M, Miles MB. The qualitative researcher's companion. SAGE, 2002.
29. Miles MB, Huberman AM. Qualitative data analysis. SAGE, 1994.
30. Finlay L. Reflecting on 'reflective practice'. Practice-based Professional Learning Centre, paper 52, 2008. http://www.open.ac.uk/opencetl/files/opencetl/file/ecms/web-content/Finlay-(2008)-Reflecting-on-reflective-practice-PBPL-paper-52.pdf
31. Danzer E, Dumon K, Kolb G, et al. What is the cost associated
with the implementation and maintenance of an ACS/APDS-based
surgical skills curriculum? J Surg Educ 2011;68:519–25.
32. Petscavage JM, Wang CL, Schopp JG, et al. Cost analysis and
feasibility of high-fidelity simulation based radiology contrast reaction
curriculum. Acad Radiol 2011;18:107–12.
33. Fanning RM, Gaba DM. The role of debriefing in simulation-based
learning. Simul Healthc 2007;2:115–25.
34. Fenwick T. Sociomateriality in medical practice and learning: attuning
to what matters. Med Educ 2014;48:44–52.
35. Altschuld JW, Watkins R. A primer on needs assessment: more
than 40 years of research and practice. New Dir Eval
2014;2014:5–18.
36. Husebø SE, Dieckmann P, Rystedt H, et al. The relationship between
facilitators' questions and the level of reflection in postsimulation debriefing. Simul Healthc 2013;8:135–42.
37. Tipping J. Focus groups: A method of needs assessment. J Contin
Educ Health Prof 1998;18:150–4.
38. Ratnapalan S, Hilliard RI. Needs assessment in postgraduate
medical education: a review. Med Educ Online 2002;7:4542–7.
39. Crandall SJS. Using interviews as a needs assessment tool. J Contin
Educ Health Prof 1998;18:155–62.
40. Mann KV. Not another survey! Using questionnaires effectively in
needs assessment. J Contin Educ Health Prof 1998;18:142–9.
41. Tannenbaum SI, Cerasoli CP. Do team and individual debriefs
enhance performance? A meta-analysis. Hum Factors
2013;55:231–45.
42. Ng S, Lingard L, Kennedy T. Qualitative research in medical education: methodologies and methods. In: Swanwick T, ed. Understanding Medical Education. Oxford, UK: John Wiley & Sons, 2013:371–84.
Chapter
Qualitative research methods can contribute to theory building and to the study of complex social issues in medical education. Qualitative research encompasses multiple research methodologies, including case study, grounded theory, phenomenology, hermeneutics, narrative inquiry and action research. This chapter focuses on data collection methods (including interviews, focus groups, observations and assembly of textual documents). Data analysis methods (including thematic analysis and discourse analysis) are considered separately. Qualitative research has made important contributions to medical education research in the past few decades. This form of inquiry is situated within a particular set of paradigms and draws on recognisable approaches and methodological tools to build knowledge regarding the experiences and activities of teachers, trainees, patients and team members in medical education settings. Particular ethical issues must be considered in a qualitative project, as well as appropriate criteria for determining the most rigorous path for each individual study.
Article
Background: Needs assessment should be the starting point for curriculum development. In medical education, expert opinion and consensus methods are commonly employed. Aim: This paper showcases a more practice-grounded needs assessment approach. Methods: A mixed-methods approach, incorporating a national survey, practice audit, and expert consensus, was developed and piloted in thrombosis medicine; Phase 1: National survey of practicing consultants, Phase 2: Practice audit of consult service at a large academic centre and Phase 3: Focus group and modified Delphi techniques vetting Phase 1 and 2 findings. Results: Phase 1 provided information on active curricula, training and practice patterns of consultants, and volume and variety of thrombosis consults. Phase 2's practice audit provided empirical data on the characteristics of thrombosis consults and their associated learning issues. Phase 3 generated consensus on a final curricular topic list and explored issues regarding curriculum delivery and accreditation. Conclusions: This approach offered a means of validating expert and consensus derived curricular content by incorporating a novel practice audit. By using this approach we were able to identify gaps in training programs and barriers to curriculum development. This approach to curriculum development can be applied to other postgraduate programs.
Article
In current debates about professional practice and education, increasing emphasis is placed on understanding learning as a process of ongoing participation rather than one of acquiring knowledge and skills. However, although this socio-cultural view is important and useful, issues have emerged in studies of practice-based learning that point to certain oversights. Three issues are described here: (i) the limited attention paid to the importance of materiality - objects, technologies, nature, etc. - in questions of learning; (ii) the human-centric view of practice that fails to note the relations among social and material forces, and (iii) the conflicts between ideals of evidence-based standardised models and the sociomaterial contingencies of clinical practice. It is argued here that a socio-material approach to practice and learning offers important insights for medical education. This view is in line with a growing field of research in the materiality of everyday life, which embraces wide-ranging families of theory that can be only briefly mentioned in this short paper. The main premise they share is that social and material forces, culture, nature and technology, are enmeshed in everyday practice. Objects and humans act upon one another in ways that mutually transform their characteristics and activity. Examples from research in medical practice show how materials actively influence clinical practice, how learning itself is a material matter, how protocols are in fact temporary sociomaterial achievements, and how practices form unique and sometimes conflicting sociomaterial worlds, with diverse diagnostic and treatment approaches for the same thing. This discussion concludes with implications for learning in practice. What is required is a shift from an emphasis on acquiring knowledge to participating more wisely in particular situations. This focus is on learning how to attune to minor material fluctuations and surprises, how to track one's own and others' effects on 'intra-actions' and emerging effects, and how to improvise solutions.
Article
To design and implement a needs assessment process that identifies gaps in caring for critically ill patients in a community hospital. This mixed-method study was conducted between June 2011 and February 2012. A conceptual framework, centered on the critically ill patient, guided the design and selection of the data collection instruments. Different perspectives sampled included regional leaders, healthcare professionals at the community hospital and its referral hospital, as well as family members of patients who had received care at the community ICU. Data sources included interviews (n = 22), walk-throughs (n = 5), focus groups (n = 31), database searches, context questionnaires (n = 8), family surveys (n = 16), and simulations (n = 13). None. Nine needs were identified. At the community hospital, needs identified included lack of access to human resources, gaps in expertise, poor patient flow and ICU bed use, communication, lack of educational opportunities, and gaps in end-of-life care and interprofessional teamwork. Needs were also identified in the interhospital interaction between the community and referral hospitals, which included an inadequate hospital network and gaps in transfer and repatriation of patients. The methodology uncovered the causes and widespread impact of each need and how they interacted with one another. Proposed solutions by the participants are presented including both organizational and educational/clinical solutions. This study captured needs in a complex, interprofessional, interhospital context, which can be targeted with tailored interventions to improve patient outcomes in a community hospital. Furthermore, this study provides a preliminary framework and rigorous methodology to performing a needs assessment in this setting.