A systematic review of leadership training for medical students
Oscar Lyons, Bruce Su’a, Michelle Locke, Andrew Hill
ABSTRACT
BACKGROUND: Leadership is increasingly being recognised as an essential requirement for doctors. Many medical schools are in the process of developing formal leadership training programmes, but it remains to be elucidated what characteristics make such programmes effective, and to what extent current programmes are effective beyond merely positive learner reactions. This review's objective was to investigate the effectiveness of undergraduate medical leadership curricula and to explore common features of effective curricula.
METHODS: A systematic literature search was conducted. Articles describing and evaluating undergraduate medical leadership curricula were included. Outcomes were stratified and analysed according to a modified Kirkpatrick's model for evaluating educational outcomes.
RESULTS: Eleven studies met the inclusion criteria. The leadership curricula evaluated were markedly heterogeneous in their duration and composition. The majority of studies utilised pre- and post-intervention questionnaires for evaluation. Two studies described randomised controlled trials with objective measures. Outcomes were broadly positive; only one study reported neutral outcomes.
CONCLUSIONS: A wide range of leadership curricula, including short interventions, have shown subjective effectiveness. There is limited objective evidence, however, and few studies have measured effectiveness at the system and patient levels. Further research is needed investigating objective and downstream outcomes, and use of standard frameworks for evaluation will facilitate effective comparison of initiatives.
Effective leadership is vital in implementing health improvements at both clinical and system levels. In health, effective leadership involves utilising social influence and advocacy to anticipate and act on health challenges for a positive outcome.1,2 Ineffective leadership has been shown to have an adverse effect on team performance and patient outcomes.3,4 Effective leadership, however, significantly improves these outcomes,5–7 and many major health institutions have therefore incorporated effective leadership as a core competency expected of health professionals.8–16
To address this demand, leadership training has been implemented within medical school curricula, from pre-clinical17–25 to clinical,17–19,22,23,26–28 and later through to residency and beyond.29,30 Although leadership programmes have been well received by both medical students and faculty, little objective data is available to analyse outcomes, and little is known of how such skills translate beyond medical school.17 Further, the optimum time to implement such courses remains unclear.
This systematic literature review therefore aims to collate studies that have incorporated leadership courses within medical school curricula and have evaluated their effectiveness in an objective manner.
Methods
Search strategy and information sources
This systematic review was performed in accordance with the PRISMA statement (Preferred Reporting Items for Systematic Reviews and Meta-Analyses).31 Five databases were systematically searched: Excerpta Medica database (EMBASE); Education Resources Information Centre (ERIC); Medline; PsycINFO; and PubMed (National Library of Medicine). Keywords were "leadership", "medical student" and "education", and were also mapped to medical subject headings (MeSH terms) and exploded. The initial search was completed on 20 May 2016 by OL and ML. Reference lists of articles selected for full text review were manually searched for additional studies.
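For illustration only, the keyword logic described above can be expressed as a Boolean search string before being adapted to each database interface. The Python sketch below is a hypothetical reconstruction: the free-text variants, subject-heading equivalents and PubMed-style field tags shown are assumptions, as the exact strings and explosion settings used for each database are not reported here.

```python
# Hypothetical reconstruction of the Boolean search described in the Methods.
# The exact per-database syntax and MeSH explosions used by the authors are
# not reported here; this sketch only illustrates the keyword logic.

FREE_TEXT = {
    "leadership": ["leadership"],
    "medical student": ["medical student*", "undergraduate medical education"],
    "education": ["education", "training", "curriculum"],
}

MESH = {  # assumed subject-heading equivalents (illustrative only)
    "leadership": ["Leadership"],
    "medical student": ["Students, Medical"],
    "education": ["Education, Medical"],
}


def build_query() -> str:
    """Combine each concept's free-text terms and subject headings with OR,
    then join the three concepts with AND (PubMed-style syntax)."""
    concept_blocks = []
    for concept in FREE_TEXT:
        terms = [f'"{t}"[Title/Abstract]' for t in FREE_TEXT[concept]]
        terms += [f'"{h}"[MeSH Terms]' for h in MESH[concept]]
        concept_blocks.append("(" + " OR ".join(terms) + ")")
    return " AND ".join(concept_blocks)


if __name__ == "__main__":
    print(build_query())
```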
Article selection
The title and abstract screen was performed independently by two authors (OL, ML). All articles concerning leadership training and medical students were selected for full text review. Full text reviews were performed by two authors (OL, BS). The Kirkpatrick model for assessment of training outcomes, with the BEME modification,32 was applied to studies measuring outcomes at level 2 or higher, as shown in Table 1. This selection criterion allows more objective outcomes to be analysed.
Inclusion and exclusion criteria
Studies in which a leadership training intervention was described and implemented in a medical student population, with outcomes reported at Kirkpatrick's level 2 or higher, were included in this review. Studies without a full text available or not published in English were excluded.
Data abstraction
Data from included studies were abstracted into a Microsoft® Excel® (2013) spreadsheet using a modified BEME coding sheet by two authors (OL, BS). Any uncertainties were resolved by consensus.
Data analysis
Study outcomes were categorised according to the BEME modification to Kirkpatrick's model for evaluation of effectiveness of teaching. This model has been used by several BEME collaborations and was recently adapted by Steinert et al for leadership initiatives in medicine.32
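As a concrete illustration of this stratification, the sketch below tallies reported outcome levels per study into per-level and combined counts, which is how the numbers in Table 1 can be derived. The study-to-level mapping shown is abbreviated and illustrative only; it is not the authors' actual coding.

```python
# Illustrative sketch of the outcome stratification used for Table 1.
# The study-to-level mapping below is abbreviated and hypothetical in detail;
# it only demonstrates how per-level and combined counts are tallied.
from collections import defaultdict

# Levels of the BEME-modified Kirkpatrick model used in this review.
LEVELS = ["2A", "2B", "3A", "3B", "4"]

# Example input: study -> set of levels at which it reported outcomes.
reported_levels = {
    "Goldstein": {"2A", "2B", "3A", "3B", "4"},
    "Hunziker": {"3B", "4"},
    "Wayne": {"2A", "3A", "3B"},
    # ... remaining included studies would be listed here
}


def tally(reported: dict[str, set[str]]) -> dict[str, int]:
    counts = defaultdict(int)
    for levels in reported.values():
        for level in LEVELS:
            if level in levels:
                counts[level] += 1
        # Combined counts, as reported alongside levels 2 and 3 in Table 1.
        if levels & {"2A", "2B"}:
            counts["2A/2B combined"] += 1
        if levels & {"3A", "3B"}:
            counts["3A/3B combined"] += 1
    return dict(counts)


if __name__ == "__main__":
    n = len(reported_levels)
    for level, count in tally(reported_levels).items():
        print(f"Level {level}: {count}/{n} studies ({count / n:.0%})")
```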
Risk of bias
Risk of bias was evaluated according to
the Cochrane Handbook for Systematic
Reviews of Interventions.33 This tool
assesses bias through seven areas:
random sequence generation, allocation
concealment, blinding of participants and
personnel, blinding of outcome assessment,
incomplete outcome data, selective
reporting and other sources of bias. Each
study was given an overall quality rating
(1=low; 5=high) and reviewers were asked
to comment on strengths and weaknesses.
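The sketch below illustrates how a domain-based assessment of this kind might be recorded and summarised. The seven domains follow the list above, but the judgement values and the rule for converting domain judgements into an overall 1 (low) to 5 (high) quality rating are assumptions for illustration, since the aggregation method is not reported here.

```python
# Hypothetical sketch of recording a Cochrane-style risk-of-bias assessment.
# Domain names follow the seven areas listed above; the aggregation into an
# overall 1-5 quality rating is an assumption for illustration only.

DOMAINS = [
    "random sequence generation",
    "allocation concealment",
    "blinding of participants and personnel",
    "blinding of outcome assessment",
    "incomplete outcome data",
    "selective reporting",
    "other sources of bias",
]

# Each domain is judged "low", "unclear", or "high" risk of bias.
example_assessment = {
    "random sequence generation": "high",
    "allocation concealment": "high",
    "blinding of participants and personnel": "high",
    "blinding of outcome assessment": "unclear",
    "incomplete outcome data": "low",
    "selective reporting": "low",
    "other sources of bias": "unclear",
}


def overall_quality(assessment: dict[str, str]) -> int:
    """Map the proportion of low-risk domains onto a 1 (low) to 5 (high)
    quality score -- an illustrative rule, not the authors' actual method."""
    low_risk = sum(1 for judgement in assessment.values() if judgement == "low")
    return 1 + round(4 * low_risk / len(DOMAINS))


if __name__ == "__main__":
    print(f"Overall quality rating: {overall_quality(example_assessment)}/5")
```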
Table 1: Description of Kirkpatrick's levels for evaluating educational outcomes, and the number of included studies demonstrating outcomes at each level.

Level 2A: Change in attitudes. Changes in the attitudes or perceptions among participant groups towards leadership, management and/or administration. Studies: 10 (91%); levels 2A/2B combined: 10 (91%).

Level 2B: Change in knowledge or skills. For knowledge, this relates to the acquisition of concepts, procedures and principles; for skills, this relates to the acquisition of thinking/problem-solving, psychomotor and social skills. Studies: 5 (45%).

Level 3A: Behavioural change (self-reported). Documents the transfer of learning to the workplace and changes to professional practice, as noted by participants. Studies: 9 (82%); levels 3A/3B combined: 10 (91%).

Level 3B: Behavioural change (observed). Documents the transfer of learning to the workplace and changes to professional practice, as noted by a third party. Studies: 7 (64%).

Level 4: Results. Change in the system/organisational practice refers to wider changes in the organisation, attributable to the educational programme. Studies: 4 (36%).

"Studies" refers to the number of included studies that demonstrated outcomes at each level, with percentages out of 11 studies. For levels 2A/2B and 3A/3B, an additional combined number is given. Level 1 was not included in this review, as discussed in the text.
Results
In total, 1,248 unique papers were identified and screened, of which 11 studies were included in the review (Figure 1). Ten of the studies reported positive outcomes while one reported a neutral outcome. A summary of the included studies is shown in Table 2. The majority of the included studies were quasi-experimental, with two randomised controlled trials21,25 and two observational studies.18,22

Figure 1: PRISMA flow diagram.
Table 2: Summary of included studies.

Bergman26 (2008). Study design: quasi-experimental, repeated measures. Intervention (follow-up): short course (no long-term follow-up). Learners (n): clinical (160).
Level 2a: Increased openness to learning about healthcare team members. Subgroup changes: increased “attitudes to openness and group dynamics” and “openness in the professional role”.
Level 2b: Increased awareness of the importance of variable leadership styles.

Carufel-Wert18 (2007). Study design: observational. Intervention (follow-up): longitudinal (no long-term follow-up). Learners (n): both (50).
Level 2a: Increased interest in taking leadership positions; increased desire to remain in medical school.
Level 2b: Increased perceived ability to be an effective leader.
Level 3a: Increased interactions with those in the leadership group.
Coleman19 (2012). Study design: quasi-experimental, repeated measures. Intervention (follow-up): short course (followed up at 8 months and 18 months). Learners (n): both (11).
Level 2a: Increased confidence to execute projects; increased leadership self-efficacy.
Level 3a: 7/11 students executed their projects.
Level 3b: Increased OSTE test scores.

Goldstein20 (2009). Study design: quasi-experimental, repeated measures. Intervention (follow-up): short course (no long-term follow-up). Learners (n): pre-clinical (>24).
Level 2a: Increased confidence levels in leadership activities.
Level 2b: Increase in leadership competencies and knowledge of leadership styles.
Level 3a: Reported utilisation of learned skills in student organisations.
Level 3b: Fifteen completed community service projects run by participants.
Level 4: Increased pedestrian safety measures on campus as a result of a student project.

Hunziker21 (2010). Study design: randomised controlled superiority trial. Intervention (follow-up): workshop (followed up at four months). Learners (n): pre-clinical (237).
Level 3b: Increased leadership utterances (from 5 (2–8) to 7 (4–10), p=.02).
Level 4: Increased hands-on time; decreased time to initiate CPR; lower technical effectiveness vs the technical instruction group.

Meier27 (2012). Study design: quasi-experimental, repeated measures. Intervention (follow-up): short course (no long-term follow-up). Learners (n): clinical (17).
Level 2a: Self-evaluation scores increased (p<.001).
Level 2b: Average MCQ test score increased from 84.9% to 94.1% (p<.01).
Level 3a: Self-evaluation scores increased for 16 participants and remained constant for 1.
Level 3b: Both TeamSTEPPS and NOTECHS scores increased. Three of five NOTECHS domains were individually significant; all TeamSTEPPS domains were significant.
Meurling28 (2013). Study design: quasi-experimental, repeated measures. Intervention (follow-up): workshop (no long-term follow-up). Learners (n): clinical (54).
Level 2a: No change in mental strain or concentration.
Level 3a: Increased self-efficacy scores.
Level 3b: No significant changes except increased frequency of sum-ups.
Level 4: No change (no groups achieved this level in any scenario).

Sherrill22 (2000). Study design: observational. Intervention (follow-up): longitudinal (no long-term follow-up). Learners (n): both (153).
Level 2a: More likely to desire business-type careers and administrative duties.
Level 3a: More confident in all aspects of clinical and administrative duties; more likely to seek administrative rather than clinical duties.

Smith23 (2007). Study design: quasi-experimental, repeated measures. Intervention (follow-up): short course (11 months). Learners (n): both (23).
Level 2a: Improved attitude towards leadership behaviours.
Level 2b: Increased skills (self-reported, not tested).
Level 3a: Higher exhibition of leadership behaviours.
Level 3b: Completion of projects by 13/23 students.
Level 4: Projects reached >600 students at 11 institutions (self-reported).

Warde24 (2014). Study design: quasi-experimental, repeated measures. Intervention (follow-up): short course (no long-term follow-up). Learners (n): pre-clinical (20).
Level 2a: No change in Relational Coordination Scale scores.
Level 3a: No change in Leadership Practices Inventory scores.

Wayne25 (2010). Study design: randomised controlled superiority trial. Intervention (follow-up): tutorial (no long-term follow-up). Learners (n): pre-clinical (158).
Level 2a: Interviewees indicated a more positive attitude towards leadership.
Level 3a: Interviewees indicated acting as leaders as a result of the instruction.
Level 3b: The percentage of female leaders increased from 27% to 47%.
The Intervention column refers to the length of the intervention rather than curriculum content, which is described in the text. Outcome summaries are stratified according to the modified Kirkpatrick framework described in Table 1.
Setting
Eight of the 11 studies were conducted in the US, two in Sweden and one in Switzerland. The majority of these were in single centres. Two studies19,23 selected participants from across the US and Canada, and one study included all eligible participants across six MD-MBA conjoint degree programmes in the US.22
Participant selection
Participant numbers ranged from 11 to 237, as shown in Table 2. Most studies included a subgroup of a medical school cohort,18,20,21,24,26–28 with one including an entire cohort25 and three including participants from multiple medical schools.19,22,23 One study did not report the number of participants.20
There was significant variation in selection criteria and the number of participants. Three of the 11 studies evaluated a compulsory component of a course: two of these allowed students to opt out of the evaluation, though not the training itself;26,28 one did not allow students to opt out of the evaluation.25 Four studies offered open, optional training and evaluation to an entire cohort20–22 or to the portion of a cohort enrolled in a particular elective.27 The remaining four studies required participants to submit a written application, and chose a small number of students judged to already have significant leadership potential.18,19,23,24
Intervention
Interventions varied in the setting, materials, length of course and stage of the programmes evaluated.
Course implementation
Four studies focused on pre-clinical
students, three on clinical students and four
on both (see Table 2).
Course intervention setting
Most studies incorporated some component of experiential and reflective learning,18–21,23,24,26,28 though the format of this was generally poorly reported. Three studies used a simulation centre for their study.21,27,28 Two studies employed a practical community component,18,20 and Wayne et al utilised a small-group tutorial for their study.25 Reported teaching methods included readings, discussions, simulation, community projects and video instruction, in various iterations.
Course duration
The durations of the interventions were markedly heterogeneous, ranging from the addition of two sentences to a standard instruction25 to the implementation of a longitudinal course over a degree.18,22 Seven studies comprised workshops conducted in a single day21,28 or short courses of one week19,23 to one semester in duration.20,24,27 Only one study delivered its initiative in more than one discrete course.26
Course programmes utilised
Carufel-Wert et al and Sherrill et al evaluated existing programmes,18,22 whereas the other nine studies evaluated new or significantly altered programmes. Eight studies assessed outcomes immediately post-intervention only.19,20,22,24–28 Longer-term outcomes were assessed in three studies only: at four months;21 11 months;23 and separately at eight and 18 months.19
Course outcomes
Outcomes were assessed at Kirkpatrick
level 2 in 10 studies,18–20,22–28 at level 3 in
10 studies,18–25,27,28 and at level 4 in four
studies20,21,23,28 (see Table 1). The majority of
these outcomes were self-reported.
Included study goal
Included studies had varied aims and objectives. Eight studies broadly evaluated a new or existing leadership intervention for its utility in medical students.18–21,23,24,26,27 The remaining three studies aimed, respectively, to outline student characteristics,22 to determine whether gender bias in leadership could be reduced25 and to explore the individual experiences and behaviours of leaders and followers.28
Study design
Each quasi-experimental study utilised repeated measures without a control group. Two studies elected to conduct both the pre- and post-intervention surveys concurrently at the end of the intervention.19,23
Carufel-Wert et al and Sherrill et al conducted cross-sectional studies of participants post-intervention, and relied on participants to attribute outcomes to the intervention subjectively.18,22
Data collection methods
The most commonly used data collection tool was a self-reported written questionnaire, utilised in nine of the 11 studies.18–20,22–24,26–28 Self-efficacy was used in six studies as a proxy for objective ability.18,19,22,23,27,28 Video analysis was employed in three studies,21,27,28 with the observers specifically blinded to participant status (pre- or post-intervention) in two studies.27,28 Interviews were conducted as part of the evaluation in four studies.18,20,22,25 Only one study used direct observation as the main evaluation tool.25
Study quality/overall risk of bias
The mean study quality score was 3.1 out of a possible five. Randomisation was used in two studies,21,25 with participant blinding conducted in only one study.25 In several trials, included participants were either an elite subgroup of medical students19,23,24 or a small subgroup.20,22,27,28 Results were self-reported in the majority of studies.18–20,22–24,26–28
Discussion
This systematic review identified 11 studies investigating the effectiveness of leadership training programmes in medical school at outcome levels beyond Kirkpatrick level 1. There was a diversity of methods employed in these studies, in terms of length, type, materials, setting and stage of training of the medical students. The reported results demonstrate that despite this diversity, programmes were broadly found to improve knowledge and skills of leadership, influence attitudes and promote leadership behaviour in medical students. There were indications that there may be positive downstream outcomes, though these were not well described.
Despite heterogeneity, studies produced broadly positive results. This leads to some tentative suggestions for the future design of leadership programmes. Interventions tended to utilise a combination of didactic learning, tutorials and reflective learning. The effectiveness of a programme within medical school did not seem to be influenced by preclinical or clinical implementation, suggesting that both may be effective. Furthermore, because all durations of intervention showed positive results, long and complex courses may not be required to achieve positive change; short, punchy courses with clear objectives may well be as effective. Given already packed curricula and the financial benefit of running courses of short duration, this would be a valuable area to explore further.
This review differed from those previously published by focusing on outcomes at Kirkpatrick level 2 or higher. While this limited the number of studies eligible for inclusion, it enabled the authors to highlight more objective outcomes. The increase in studies reporting these higher-level outcomes is in line with recommendations from previous reviews,17 and could indicate an increased awareness among researchers of the need to establish firm outcomes.
Over the last decade there has been a marked increase in the number of medical schools offering leadership curricula. A literature search by O'Connell and Pascoe in 2004 returned only 15 articles with any degree of relevance.34 Ten years later, despite using more specific search terms, Webb et al found 45 articles, each describing a curriculum to teach leadership to undergraduate medical students.17 While a significant proportion of medical knowledge is imparted didactically, role modelling and practical experience remain vital to medical education.35,36 Given the variation in clinical experiences and role models encountered by students,36,37 and the increased importance placed internationally on the development of clinical leadership abilities,8–12,14–16,38 it is logical that medical programmes should move towards formal leadership training.
One of the clear limitations of the studies reviewed was a lack of objective measures of the effectiveness of leadership training. There is an established connection between self-efficacy and leadership, but self-efficacy remains a subjective measure of leadership effectiveness. Whereas clinical ability has been reliably assessed via an Objective Structured Clinical Examination (OSCE),39 and teaching ability has been assessed via an Objective Structured Teaching Examination (OSTE),19,40 there is not yet an established means of objectively measuring leadership effectiveness. In order for the quality of different interventions to be compared, it is important for a reliable measurement tool to be developed and accepted within the literature. Furthermore, the use of a standardised framework for evaluation of training programmes (such as Kirkpatrick's model) and the reporting of results in a systematic manner based on such frameworks will enable future reviewers to more easily ascertain the components and characteristics of leadership training curricula that determine their success.
The lack of a widely accepted definition of clinical leadership and what it entails further complicates training, assessment and comparison of approaches. Definitions of leadership present a plethora of core attributes that may or may not have been covered by the curricula evaluated in the included studies. A consensus on the definition of clinical leadership may help streamline future courses and facilitate more robust and comparable evaluation based on an objective definition.
Despite a search strategy designed for high sensitivity, the lack of standardisation of medical education article databases necessitates parallel approaches to literature searching, as employed in this review, and increases the risk of missing relevant publications.43 The limited utilisation of established frameworks for evaluation of teaching required the researchers to categorise research outcomes manually and, in some cases, by consensus decision. Heterogeneity of interventions and evaluations precluded meta-analysis and reduced the external validity of the conclusions made.
Conclusion
In summary, the evidence evaluated by this review supports further development and evaluation of leadership training programmes in medical schools. There is broad agreement in the studies reviewed that the programmes evaluated resulted in positive outcomes for learners. Objective measures of leadership training effectiveness need to be developed, however, and an emphasis placed on the evaluation of system and patient outcomes. The reviewers recommend that further research focus on the use of recognised training evaluation frameworks for research and reporting, and on the evaluation of objective and downstream outcomes. Further standardisation will afford increased applicability and comparability to studies. This will be an important step towards elucidating the characteristics of programmes that are important for success.
Competing interests:
Nil.
Acknowledgements:
Bruce Su’a was supported by a Health Research Council Pacific Clinical Training Fellowship.
Author information:
Oscar Lyons, Research Fellow, South Auckland Clinical Campus, University of Auckland,
Auckland; Doctoral Candidate, The Nuffield Department of Surgery, University of Oxford,
United Kingdom; Bruce Su’a, Research Fellow and Doctoral Candidate, South Auckland
Clinical Campus, University of Auckland, Auckland; Michelle Locke, Senior Lecturer in
Surgery, South Auckland Clinical Campus, University of Auckland, Auckland; Plastic and
Reconstructive Surgeon, Counties Manukau District Health Board, Auckland; Andrew
Hill, Assistant Dean and Head, South Auckland Clinical Campus, University of Auckland,
Auckland; Consultant General Surgeon, Counties Manukau District Health Board.
Corresponding author:
Dr Oscar Lyons, South Auckland Clinical Campus, University of Auckland, c/- Middlemore
Hospital, Private Bag 93311, Otahuhu, Auckland 1640.
oscar.lyons@auckland.ac.nz
URL:
https://www.nzma.org.nz/journal/read-the-journal/all-issues/2010-2019/2018/vol-131-no-1468-19-january-2018/7467
REFERENCES:
1. Kotter J. Leading Change. Harvard Bus. Sch. Press (1996).
2. The King's Fund. The future of leadership and management in the NHS: No more heroes. London: The King's Fund (2011).
3. Marsch SCU, et al. Human factors affect the quality of cardiopulmonary resuscitation in simulated cardiac arrests. Resuscitation 60, 51–56 (2004).
4. West MA, et al. Leadership clarity and team innovation in health care. Leadersh. Q. 14, 393–410 (2003).
5. Mäkinen M, et al. Assessment of CPR-D skills of nurses in Göteborg, Sweden and Espoo, Finland: Teaching leadership makes a difference. Resuscitation 72, 264–269 (2007).
6. Andersen PO, Jensen MK, Lippert A, Østergaard D. Identifying non-technical skills and barriers for improvement of teamwork in cardiac arrest teams. Resuscitation 81, 695–702 (2010).
7. Cooper S. Developing leaders for advanced life support: evaluation of a training programme. Resuscitation 49, 33–38 (2001).
8. Bhanji F, et al. Part 14: Education. Circulation 132, S561–S573 (2015).
9. Darzi A. High Quality Care For All: NHS Next Stage Review Final Report. Department of Health, 1–92 (2008). ISBN 978-0-10-174322-8.
10. Mancini ME, et al. Part 12: Education, Implementation, and Teams. Circulation 122 (2010).
11. NHS. Medical Leadership Competency Framework: Enhancing Engagement in Medical Leadership, Third Edition. NHS Institute for Innovation and Improvement, Academy of Medical Royal Colleges (2010).
12. Spurgeon P, Clark J, Ham C. Medical leadership: from the dark side to centre stage. Radcliffe Publishing (2011).
13. Frank JR, Danoff D. The CanMEDS initiative: implementing an outcomes-based framework of physician competencies. Med. Teach. 29, 642–647 (2007).
14. Australian Medical Council. Standards for Assessment and Accreditation of Primary Medical Programs by the Australian Medical Council 2012 (2012).
15. New Zealand Curriculum Framework for Prevocational Medical Training.
16. Simpson JG, et al. The Scottish doctor--learning outcomes for the medical undergraduate in Scotland: a foundation for competent and reflective practitioners. Med. Teach. 24, 136–143 (2002).
17. Webb AMB, et al. A First Step Toward Understanding Best Practices in Leadership Training in Undergraduate Medical Education. Acad. Med. 89, 1563–1570 (2014).
18. Carufel-Wert DA, et al. LOCUS: immunizing medical students against the loss of professional values. Fam. Med. 39, 320–325 (2007).
19. Coleman MM, Blatt B, Greenberg L. Preparing students to be academicians: a national student-led summer program in teaching, leadership, scholarship, and academic medical career-building. Acad. Med. 87, 1734–1741 (2012).
20. Goldstein AO, et al. Teaching Advanced Leadership Skills in Community Service (ALSCS) to medical students. Acad. Med. 84, 754–764 (2009).
21. Hunziker S, et al. Brief leadership instructions improve cardiopulmonary resuscitation in a high-fidelity simulation: A randomized controlled trial. Crit. Care Med. 38, 1086–1091 (2010).
22. Sherrill WW. Dual-degree MD-MBA students: a look at the future of medical leadership. Acad. Med. 75, S37–S39 (2000).
23. Smith KL, Petersen DJ, Soriano R, Friedman E, Bensinger LD. Training Tomorrow's Teachers Today: a national medical student teaching and leadership retreat. Med. Teach. 29, 328–334 (2007).
24. Warde CM, Vermillion M, Uijtdehaage S. A medical student leadership course led to teamwork, advocacy, and mindfulness. Fam. Med. 46, 459–462 (2014).
25. Wayne NL, Vermillion M, Uijtdehaage S. Gender differences in leadership amongst first-year medical students in the small-group setting. Acad. Med. 85, 1276–1281 (2010).
26. Bergman D, Savage C, Wahlstrom R, Sandahl C. Teaching group dynamics--do we know what we are doing? An approach to evaluation. Med. Teach. 30, 55–61 (2008).
27. Meier AH, et al. A surgical simulation curriculum for senior medical students based on TeamSTEPPS. Arch. Surg. 147, 761–766 (2012).
28. Meurling L, Hedman L, Felländer-Tsai L, Wallin C-J. Leaders' and followers' individual experiences during the early phase of simulation-based team training: an exploratory study. BMJ Qual. Saf. 22, 459–467 (2013).
29. Steinert Y, Nasmith L, McLeod PJ, Conochie L. A teaching scholars program to develop leaders in medical education. Acad. Med. 78, 142–149 (2003).
30. Itani KMF, Liscum K, Brunicardi FC. Physician leadership is a new mandate in surgical training. Am. J. Surg. 187, 328–331 (2004).
31. Liberati A, et al. The PRISMA statement for reporting systematic reviews and meta-analyses of studies that evaluate healthcare interventions: explanation and elaboration. BMJ 339, b2700 (2009).
32. Steinert Y, Naismith L, Mann K. Faculty development initiatives designed to promote leadership in medical education. Med. Teach. 34, 483–503 (2012).
33. Higgins JPT, Green S. Cochrane Handbook for Systematic Reviews of Interventions Version 5.1.0 [updated March 2011]. The Cochrane Collaboration (2011).
34. O'Connell MT, Pascoe JM. Undergraduate medical education for the 21st century: leadership and teamwork. Fam. Med. 36, S51–S56 (2004).
35. Dornan T. Osler, Flexner, apprenticeship and ‘the new medical education’. J. R. Soc. Med. 98, 91–95 (2005).
36. Lempp H, et al. The hidden curriculum in undergraduate medical education: qualitative study of medical students' perceptions of teaching. BMJ 329, 770–773 (2004).
37. Markham FW, et al. Evaluations of medical students' clinical experiences in a family medicine clerkship: Differences in patient encounters by disease severity in different clerkship sites. Fam. Med. 34, 451–454 (2002).
38. Frank JR, Snell L, Sherbino JE. CanMEDS 2015 Physician Competency Framework. Ottawa: Royal College of Physicians and Surgeons of Canada, 1–30 (2015).
39. Harden R, Stevenson M, Downie W. Assessment of clinical competence using objective structured examination. Br. Med. J. (1975).
40. Sturpe DA, Schaivone KA. A primer for objective structured teaching exercises. Am. J. Pharm. Educ. 78 (2014).
41. King HB, et al. TeamSTEPPS(TM): Team Strategies and Tools to Enhance Performance and Patient Safety. Advances in Patient Safety: New Directions and Alternative Approaches (Vol. 3: Performance and Tools) (2008).
42. Robertson ER, et al. Oxford NOTECHS II: A modified theatre team non-technical skills scoring system. PLoS One 9 (2014).
43. Sharma R, Gordon M, Dharamsi S, Gibbs T. Systematic reviews in medical education: A practical approach: AMEE Guide 94. Med. Teach. 37, 108–124 (2015).