The role of patient involvement in the diagnostic process in
Internal Medicine: a cognitive approach
Claudio Lucchiari, Gabriella Pravettoni
Università degli Studi di Milano
This is a pre-print version. To cite this article:
Lucchiari C, Pravettoni G. The role of patient involvement in the diagnostic process in internal medicine: a cognitive approach. Eur J Intern Med 2013;24(5):411-415.
Abstract
Much cognitive and clinical research has addressed clinical
reasoning, pointing out that physicians often have difficulties in
following a linear course when making accurate diagnoses.
Some authors suspect that physicians make mistakes because
they unknowingly fail to observe the laws of formal logic and
that their reasoning becomes influenced by contextual factors.
In this paper, we introduce some basic principles of the cognitive approach to medical decision making and describe the cognitive balanced model. We then discuss the relationship between the construction of mental models, cognitive biases and patient involvement, using a clinical vignette.
Medical decisions may be considered fundamentally biased
since the use of judgment heuristics and a combination of
cognitive-related and system-related factors limit physicians'
rationality.
While the traditional understanding of clinical reasoning has failed to consider contextual factors, most techniques designed to avoid biases seem to fail to promote sound and safer medical
practice. In particular, we argue that an unbiased process
requires the use of a cognitive balanced model, in which
analytical and intuitive mind skills should be properly
integrated.
In order to improve medical decision making, and thereby lessen the incidence of adverse events, it is fundamental to include the patient perspective in a balanced model. Physicians and
patients should improve their collective intelligence by sharing
mental models within a framework of distributed intelligence.
Keywords: medical decision making; cognitive biases; overconfidence; diagnostic errors
Introduction
The diagnostic process is probably the most relevant component of medical decision making from a cognitive point of view. Physicians work like information processors: they collect data from the environment, infer judgments and produce clinical scenarios. Much research has been devoted to this topic, examining the diagnostic process both when it succeeds and when it fails, although much of this work remains superficial. Indeed, diagnostic error accounts for a substantial portion of all medical errors and has received increasing attention over the last 30 years [1]. It is nevertheless astonishing that the error rate seems to remain constant over time and space: two studies (one in the US and one in Germany) indicate that the error rate has not substantially changed since 1980, remaining firmly anchored in both countries at around 10%, although, alarmingly, a recent systematic review reported a rate as high as 24% [2].
Generally speaking, most errors are reported to occur within the information analysis stage. Physicians report failures or delays in identifying significant clues and in
prioritizing clinical information. In this stage physicians, like any other expert decision makers, need to gather data from the environment and to organize them into a so-called mental model. Indeed, the human mind works on well-structured data that can be represented and cognitively processed in order to define a problem, highlight solutions and take actions in a cognitive loop (see figure 1). Thus, all incoming information needs to be weighted for relevance and tested for reliability before being integrated into a mental model [3].
The diagnostic process starts with the activation of a first mental model. Using this mental structure, based on schemas stored in long-term memory, a physician may evaluate the consequences of each possible choice (diagnostic or therapeutic interventions) in order to plan future actions, choose scenarios, or even review the active mental model.
Figure 1 here
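To make the loop in figure 1 concrete for readers who think in procedural terms, the short sketch below caricatures it in Python. Every cue name, weight and threshold is a hypothetical illustration invented for this example; the snippet is a didactic toy, not a clinical algorithm and not part of the model's formal specification.

    # Didactic sketch of the cognitive loop in figure 1: weigh cues, integrate
    # them into a mental model, then either commit or go back for more data.
    # Cue names, weights and the threshold are hypothetical illustrations.
    def cognitive_loop(cues, prior_weights, threshold=0.8):
        mental_model = {}
        for cue in cues:
            # weight for relevance: familiar cues inherit a prior, others a low default
            mental_model[cue] = prior_weights.get(cue, 0.1)
        confidence = sum(mental_model.values()) / max(len(mental_model), 1)
        if confidence >= threshold:
            return "commit to the current scenario", mental_model
        return "gather more data and revise the model", mental_model

    # Hypothetical usage: familiar cues make the model look 'good enough',
    # so the loop closes early -- exactly the risk discussed later in the paper.
    priors = {"epigastric pain": 0.9, "history of gastritis": 0.9}
    print(cognitive_loop(["epigastric pain", "history of gastritis"], priors))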
A number of studies have highlighted the complex nature of
making medical decisions, which cannot be considered a
cognitive exercise completely based on rational and technical
skills [4;5]. In particular, cognitive research has shown that the
clinical setting is influenced by heuristic processes, intuition
and a number of biases, or cognitive illusions, that can lead a
physician far from ideal clinical reasoning [6]. Recent studies
have shown that it is possible to understand and prevent errors in internal medicine, starting from the recognition of the interaction between cognitive-related and system-related causes [7] and learning to detect early warning signs [8].
A Cognitive Balanced Model
In previous work [9] we defined a cognitive balanced model to describe how the clinical decision setting should be represented by a functional balance between analysis and intuition, that is, between the two basic components of the human mind [10]. The cognitive balanced model is based on the assumption that the use of concepts and logical reasoning should be developed in medical education along with specific training of intuitive skills. In particular, it emphasizes the importance of developing awareness of the need for balance, since a lack of awareness will inevitably expose physicians and patients to clinical hazards. Indeed, overconfidence in analytical skills or underestimation of the importance of implicit thought will increase the likelihood of falling into cognitive traps [11] and of failing to understand the origin of many errors.
Of course, the development of analytical skills and of intuition follows different paths. To follow logical and analytical schemas
it is necessary to learn general methods, specific concepts and techniques, as well as how to apply them in particular domains. Intuition, in contrast, develops with experience, essentially during everyday activity, and it is therefore difficult to plan training specifically aimed at developing intuitive skills. However, it is possible to design education programs compatible with the needs of the intuitive mind. Generally speaking, a strong learning environment [12], characterized by consistency, regularity, timely feedback and meta-cognitive moments, can be considered "pro-intuitive" [13].
Medical practitioners must learn to trust their intuition, but also to know its limits. In particular, intuition is far more powerful and reliable when functioning within the specific context in which it was developed: a doctor's expert eye should not be transferred automatically from one medical context to another.
The cognitive balanced model highlights how these meta-skills should belong to the cognitive background of a doctor. Without this background, error prevention protocols and techniques to reduce biases will always be partial solutions [14]. The cognitive balanced model also implies that doctors should be properly supported by specific decision aids, both in their training and in their everyday clinical practice. However, these support systems, too, should be designed to balance the strength of analytical methods with the need for intuitive
evaluation. In contrast, most existing support systems have a
cognitive architecture mainly based on analytical algorithms
and static knowledge structures such as decision trees and
deterministic decision-making methods.
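To make the contrast concrete, the hypothetical sketch below compares a static, decision-tree style rule with a "balanced" aid that surfaces disagreement with the clinician's intuitive impression for explicit review rather than silently enforcing its own output. The rules, labels and outputs are invented for illustration and do not describe any existing system or the article's own proposal.

    # Hypothetical contrast between a deterministic, decision-tree style rule and
    # a decision aid that keeps the clinician's intuitive impression in the loop.
    # All rules and labels are illustrative only.
    def deterministic_rule(findings):
        # static branching: one input pattern, one fixed output
        if "epigastric pain" in findings and "normal ECG" in findings:
            return "gastritis"
        return "unspecified abdominal pain"

    def balanced_aid(findings, clinician_impression):
        # same analytical suggestion, but disagreement with the intuitive
        # impression is flagged for review instead of being overridden
        suggestion = deterministic_rule(findings)
        if clinician_impression and clinician_impression != suggestion:
            return {"suggestion": suggestion,
                    "impression": clinician_impression,
                    "action": "review both hypotheses before closing the case"}
        return {"suggestion": suggestion, "action": "proceed, re-check if new cues appear"}

    print(balanced_aid(["epigastric pain", "normal ECG"], "biliary colic"))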
Furthermore, our perspective proposes focusing on general processes that can be analyzed as a whole, instead of addressing simple and elementary "mind bugs". In particular, in agreement with the literature [15], we argue that two general conditions often lead to an unbalanced decision-making process and to potential adverse events: overconfidence bias and premature closure.
Premature closure is the tendency to avoid considering other possibilities after reaching a diagnosis, while overconfidence bias is the tendency to overestimate one's judgment ability. Premature closure can stop the diagnostic process even before a favoured diagnosis has been confirmed by appropriate clinical examination. It may derive from a strong cognitive load, which depends on several factors (personal, inter-personal and contextual) and is time-dependent. More specifically, premature closure may result from the combination of an individual's need for cognitive closure with certain contextual factors. It is clear that intuitive thinking is strongly associated with premature closure, even if specific training could teach physicians both to trust their
intuition and to activate subsequent meta-cognitive control over it.
Overconfidence, too, is a consequence of a number of direct and indirect drivers, including age and experience. In particular, overconfidence bias seems especially significant for expert doctors, since they have developed sound competences and confidence in them.
Interestingly, experienced physicians are as likely as novices to exhibit premature closure, and indeed senior physicians may be particularly predisposed to both premature closure and overconfidence, probably because of the development of age-related cognitive constraints [16].
The special importance that overconfidence and premature closure seem to have in the diagnostic process probably lies in some basic mental processes. Overconfidence bias leads to the creation of a conservative, ready-to-use mental model, and the need for closure exerts pressure to confirm that same mental model in order to avoid cognitive and emotional overload. A given mental model may itself contain complex analytical processes and procedures, incorporating both intuitive and analytical knowledge. Nevertheless, a lack of awareness about decision-making mechanisms may lead to the use of unbalanced models.
To avoid these cognitive pitfalls, it would be desirable to implement an unbiased process in which incoming data are
organized into a mental model that highlights essential information and leads to a safer diagnostic process.
However, we propose that a balanced process cannot really be effective if it is built in isolation from the clinical context. The medical scenario includes different actors, in particular physicians and patients. To avoid errors and to strengthen the power of cognitive processes, mental structures should be shared and the related intelligence distributed.
Patient involvement and error prevention
Although progress has been made in a number of specific areas of error prevention, the patient's role in protecting and promoting his or her own safety has long been neglected.
The scientific literature on this topic is scarce, despite some positive cognitive studies suggesting that this may be a fruitful area for reducing errors. Indeed, it has been observed that patients seem to be quite efficient at detecting errors and reporting risk situations.
Patient involvement in error detection and prevention has been
recommended, for example, by the US Institute of Medicine,
the American Hospital Association and by some clinical
experts [17;18;19]. In particular, the involvement of patients is
thought to be vital in avoiding errors in administration of
drugs, which cause many adverse events. Different studies in
internal medicine departments have shown that prescription-related errors are not rare and that the error rate has not changed substantially in recent decades. In particular, errors originate from incomplete, duplicative or contradictory orders, or from failure to adjust dosages for comorbid conditions.
For instance, in a two-year prospective study on nurses [20], 141 drug administration errors were found in 4,752 hospitalizations. Forty-one percent of these errors were errors in planning (omission of the administration and measurement checks), 21% were errors in writing or transcribing prescriptions, and 38% were errors in dispensing drugs (mainly wrong dose or wrong medication). Most researchers suggest adopting corrective measures through the implementation of computerized physician order entry systems, which eliminate transcription errors and permit the introduction of alarm systems. However, we argue that active patient involvement is equally important.
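As a rough illustration of the kind of alarm a computerized order entry system can raise, the sketch below checks a new order against the active prescription list for duplicates, for a configured daily dose limit, and for a comorbidity that calls for dose review. Drug names, limits and the comorbidity rule are hypothetical placeholders, not prescribing guidance and not drawn from the studies cited above.

    # Hypothetical CPOE-style checks: duplicate orders, a configured dose limit,
    # and a comorbidity flag. All names and limits are illustrative placeholders.
    def check_order(new_order, active_orders, comorbidities, max_daily_dose):
        alarms = []
        if any(o["drug"] == new_order["drug"] for o in active_orders):
            alarms.append("duplicate order for " + new_order["drug"])
        limit = max_daily_dose.get(new_order["drug"])
        if limit is not None and new_order["daily_dose_mg"] > limit:
            alarms.append("daily dose above the configured limit")
        if "renal impairment" in comorbidities:
            alarms.append("comorbidity on record: review dose adjustment")
        return alarms

    # Hypothetical usage: a duplicate, over-limit order for a patient with a
    # comorbidity triggers three alarms instead of being transcribed silently.
    print(check_order({"drug": "drug_x", "daily_dose_mg": 900},
                      [{"drug": "drug_x", "daily_dose_mg": 300}],
                      ["renal impairment"],
                      {"drug_x": 600}))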
Administration of medication in cancer patients is particularly critical, given the narrow therapeutic window of cytotoxic drugs. Patients undergoing cycles of chemotherapy may be able to participate in error prevention, since repeated exposure to similar procedures may enable them to develop clinical abilities. These skills might be considered the result of the development of clinical competences (knowledge of the therapy,
pharmacodynamic principles, methods and procedures) acquired by undergoing continuous clinical testing, inspections and treatments. Repeated experiences over time allow patients to acquire a set of adaptive strategies for taking control of their therapeutic journey [21;22]. If properly informed and motivated, patients have the capability to play an important role in the prevention, or at least the reduction, of adverse events. Yet little research has been undertaken on how to effectively engage patients in this role of "watchful partner" in their own healthcare [23;24].
In one study [25], doctors were asked to assess the perceived effectiveness of fourteen recommended actions to prevent errors. Results indicated that most actions were considered effective. However, respondents also indicated that they were unlikely to be involved in such actions in their daily activity. Greater self-efficacy in preventing medical errors is significantly correlated with a higher probability of reporting errors and of engaging in preventive actions. For instance, a physician who feels able to involve patients in medical decision making and adverse event prevention will be more likely actually to involve each subsequent patient in their medical journey. To improve patient involvement, therefore, physicians need specific training to increase self-efficacy, not just a general set of guidelines.
Clearly, the desire for involvement and participation in the
healthcare process and shared medical decision making also
depends on certain characteristics of patients. Age, gender,
level of education and personality traits are all factors capable
of modulating the need for involvement and the ability to be a
pro-active patient. Generally speaking, younger patients tend to
report a greater desire for involvement than older patients.
Women seem to prefer a more active role than men, as do patients with more years of higher education. Younger and more
educated patients have a greater ability to obtain and
understand health-related information and thus they will be
more likely to become involved in health-related decisions
[26].
Past experience and the specific disease characteristics are
other components to be considered in understanding a patient’s
ability to be involved [26]. For instance, patients who have had
a recent myocardial infarction, coronary angioplasty or bypass
surgery are more prone to seek involvement in medical
decisions, compared to those patients who have no history of
cardiovascular disease.
The relationship between patient involvement and
cognitive traps: an example.
Patients can be critical to the efficiency of the diagnostic process, as illustrated by the following example.
Mr. Smith went to the emergency department with severe abdominal pain. The pain had arisen suddenly, starting from the solar plexus and spreading through to his back, and had persisted for several hours before he decided to go to the nearest hospital. There, after giving a complete history, including a careful description of his pain, he had an electrocardiogram and blood tests to rule out a myocardial infarction. This having been excluded by normal results, he was given analgesic drugs and a proton pump inhibitor, and discharged to the care of his family doctor.
After a second acute episode, the family doctor referred Mr. Smith for an esophagogastroduodenoscopy. The endoscopic examination did not indicate gastroesophageal reflux, and the histological results of the biopsies were also negative. A diagnosis of moderate chronic gastritis was made, and he was advised to continue the same therapy plus small doses of butylscopolamine as required. The patient continued to suffer painful episodes, which became increasingly difficult to control with his prescribed drugs, so Mr. Smith decided to consult a specialist in gastroenterology. After a brief history and description of symptoms, the specialist read the report of the previous endoscopy and then, having briefly examined the patient,
confirmed the diagnosis of chronic gastritis, agreed with the existing medications and suggested some dietary changes.
Despite good general health, Mr. Smith continued to experience episodes of acute pain, only partially controlled by butylscopolamine, and began to lose weight, which was put down to the dietary restrictions and physical exercise. Eventually, after four referrals to the emergency department in eighteen months for acute pain, each time being recommended the same painkiller and sedative treatments and being refused further investigation, he was seen by a surgeon who suspected that the pain might be due to biliary colic. A careful ultrasound confirmed the existence of gallstones, and after hospitalisation he was found to have signs of hepatitis and pancreatitis. The patient was subsequently operated on for removal of his gall bladder and stones, and discharged from hospital free of pain for the first time in two years.
The case of Mr. Smith is not uncommon and has various interesting facets, most critically the total absence of involvement of the patient in the diagnostic process. Mr. Smith was considered by the many doctors who met him (his family doctor, specialists, emergency department internists) as a mere carrier of symptoms and signs. It is obvious that, after the first diagnosis, the mental models of the subsequent doctors were built instantly upon it (the so-called anchoring process) [27].
Indeed, the first question when encountering a patient coming to the emergency department with a painful condition is often "Have you already suffered from stomach problems?". A positive response ("I have been diagnosed with gastritis") will automatically elicit the activation of a mental model that will guide the subsequent actions.
Once the pain goes away, the patient brings no more information and may be discharged in order to close the case rapidly (indeed, the emergency department internist will not see the patient again). The patient will not be asked for further information about his state of health (for example, with respect to weight loss), nor will his request for further investigation be heeded.
In practice, the use of a stand-alone mental model, not shared with the patient, gave rise to an overconfidence bias; subsequently, the need for premature closure determined the course, and the end, of clinical reasoning among Mr. Smith's caring professionals.
The last surgeon observed the patient without the filter of the initial framing information, since the clinical history was collected anew from the patient, independent of the previous fat file of case notes. In this way, the surgeon was guided only by his own background and expertise, which led him to build a mental model specifically focused on the symptoms, which he ascribed to biliary colic. We argue that greater patient involvement in
medical decision-making would have allowed the internists to doubt their own mental model and to share a more flexible one with the patient. In this way, Mr. Smith would not have suffered a two-year delay in diagnosis.
Of course, this case could be construed as a classic case of delayed diagnosis due to a biased thought process; however, in our opinion it is essential also to consider patient involvement in order to improve our ability to appraise the whole situation. In fact, overconfidence bias and premature closure acted not only on the individual doctor, preventing a balanced logical thought process, but also on the relationship with others, in this case the patient.
Lacking technical expertise and specific knowledge, the patient was overwhelmed by each physician's confidence in their decisions. Subsequently, premature closure precluded the use of, or the search for, further data, such as marginal signs and the persistence of the patient's complaints despite what should have been appropriate medication.
A doctor-patient relationship based on openness and sharing would have allowed the patient to declare his doubts, to better describe the characteristics of his pain (as well as his other symptoms), and to have the confidence to make effective requests for alternative management. In this way, Mr. Smith's legion of doctors would have been able to observe their own mental models from a different perspective and to assess more
objectively the next steps to be performed. Any of them could then have activated a balanced thought process, avoiding the insidious cognitive traps, particularly the overconfidence bias (see figure 2), into which the internists in our example inevitably fell.
Figure 2 here
Conclusion
The cognitive balanced model is not only a theoretical framework that allows us to create a sort of metaphorical description of clinical reasoning; it also helps focus attention on all those mechanisms (at both the individual and the social level) capable of unbalancing clinical reasoning in one way or another. A lack of awareness about the functioning of the mind, together with the failure to elicit a shared context, gives rise to a double imbalance in the diagnostic process, which is consequently doubly dangerous.
The active involvement of the patient can be a powerful mechanism for balancing the cognitive course within the diagnostic process in general, and not only in oncology or chronic diseases. According to our model, the involvement of the patient represents a solid anchor for clinical reasoning in many clinical scenarios. However, this entails a change of
paradigm, since the physician must be able to shift from the construction of an individual (stand-alone) mental model, which is much simpler to build and to manage, to a shared mental model, according to the paradigm of distributed cognition [28]. The patient and the doctor would in this way work as a small team, sharing information, purpose and decisions [29]. The sharing and co-construction of mental models to be used during the diagnostic journey is a mechanism for increasing collective intelligence (in this case the community is made up of the doctor-patient dyad), which acts as a tool for pre-empting errors [29;30;31]. However, building a distributed cognitive model is much more complex and requires skills that many doctors have not had the opportunity to develop in their education or professional experience. It will therefore be a task for higher education agencies to help clinicians in this direction, in order to contribute significantly to reducing medical errors. Even if it is possible to plan the use of sophisticated decision support systems to prevent misdiagnoses (such as the one we described), we argue that the first step toward a safer diagnostic path is the active involvement of patients. Any other device will be a useful tool within a paradigm of shared cognition between patients and physicians, within a balanced cognitive model.
References
[1] Mamede S, Schmidt HG, Rikers R. Diagnostic errors and reflective practice in medicine. J Eval Clin Pract 2007;13:138-45.
[2] de Vries EN, Ramrattan MA, Smorenburg SM, Gouma DJ, Boermeester MA. The incidence and nature of in-hospital adverse events: a systematic review. Qual Saf Health Care 2008;17:216-23.
[3] Blendon RJ, DesRoches CM, Brodie M, et al. Views of practicing physicians and the public on medical errors. N Engl J Med 2002;347:1933-40.
[4] Norman GR, Eva KW. Diagnostic error in clinical reasoning. Med Educ 2010;44:94-100.
[5] Croskerry P. Achieving quality in clinical decision making: cognitive strategies and detection of bias. Acad Emerg Med 2002;9:1184-204.
[6] Croskerry P. A universal model of diagnostic reasoning. Acad Med 2009;84:1022-8.
[7] Graber ML, Franklin N, Gordon R. Diagnostic error in internal medicine. Arch Intern Med 2005;165:1493-9.
[8] Balla J, Heneghan C, Goyder C, Thompson M. Identifying
early warning signs for diagnostic errors in primary care: a qualitative study. BMJ Open 2012;2.
[9] Lucchiari C, Pravettoni G. Cognitive balanced model: a conceptual scheme of diagnostic decision making. J Eval Clin Pract 2012;18:82-8.
[10] Stanovich K. Who Is Rational? Studies of Individual Differences in Reasoning. Mahwah, NJ: Lawrence Erlbaum Associates, 1999.
[11] Klein JG. Five pitfalls in decisions about diagnosis and prescribing. BMJ 2005;330:781-3.
[12] Hogarth RM. Educating Intuition. Chicago: University of Chicago Press, 2001.
[13] Klein G. Naturalistic decision making. Hum Factors 2008;50:456-60.
[14] Norman G. Dual processing and diagnostic errors. Adv Health Sci Educ 2009;14:37-49.
[15] Berner ES, Graber ML. Overconfidence as a cause of diagnostic error in medicine. Am J Med 2008;121(5 Suppl):S2-23.
[16] Choudhry NK, Fletcher RH, Soumerai SB. Systematic review: the relationship between clinical experience and quality of health care. Ann Intern Med 2005;142:260-73.
[17] Schwappach DLB, Wernli M. Medication errors in chemotherapy: incidence, types and involvement of patients in prevention. A review of the literature. Eur J Cancer Care 2010;19:285-92.
[18] Coulter A. Patient safety: what role can patients play? Health Expect 2006;9:205-6.
[19] Vincent CA, Coulter A. Patient safety: what about the patient? Qual Saf Health Care 2002;11:76-80.
[20] Ford CD, Killebrew J, Fugitt P, Jacobsen J, Prystas EM. Study of medication errors on a community hospital oncology ward. J Oncol Pract 2006;2:149-54.
[21] Franneby U, Sandblom G, Nyren O, Nordin P, Gunnarsson U. Self-reported adverse events after groin hernia repair, a study based on a national register. Value Health 2008;11:927-32.
[22] Schwappach DL. "Against the silence": development and first results of a patient survey to assess experiences of safety-related events in hospital. BMC Health Serv Res 2008;8:59.
[23] Weissman JS, Schneider EC, Weingart SN, Epstein AM, David-Kasdan J, Feibelmann S, et al. Comparing patient-reported hospital adverse events with medical record review: do patients know something that hospitals do not? Ann Intern Med 2008;149:100-8.
[24] Unruh KT, Pratt W. Patients as actors: the patient's role in detecting, preventing, and recovering from medical errors. Int J Med Inform 2007;76 Suppl 1:S236-44.
[25] Hibbard JH, Peters E, Slovic P, Tusler M. Can patients be part of the solution? Views on their role in preventing medical errors. Med Care Res Rev 2005;62:601-16.
[26] Davis RE, Jacklin R, Sevdalis N, Vincent CA. Patient involvement in patient safety: what factors influence patient participation and engagement? Health Expect 2007;10:259-67.
[27] Rottenstreich Y, Tversky A. Unpacking, repacking, and anchoring: advances in support theory. Psychol Rev 1997;104:406-15.
[28] Perry M. Distributed cognition. In: Carroll JM, editor. HCI Models, Theories, and Frameworks: Toward an Interdisciplinary Science. Burlington: Morgan Kaufmann, 2003.
[29] Woolf SH. Shared decision-making: the case for letting patients decide which choice is best. J Fam Pract 1997;45:205-8.
[30] Epstein RM. Whole mind and shared mind in clinical decision-making. Patient Educ Couns 2012. Epub 2012/08/14.
[31] Cote S, Lopes PN, Salovey P, Miners CTH. Emotional intelligence and leadership emergence in small groups. Leadersh Q 2010;21:496-508.