Cognitive Balanced Model: a conceptual scheme of diagnostic decision making
Authors: Claudio Lucchiari (PhD), Prof. Gabriella Pravettoni
Institution: Università degli Studi di Milano, Via Passione 7, 20122 Milano, Italy
Running title: A conceptual scheme of diagnostic decision making
Corresponding author:
Claudio Lucchiari
Dipartimento di Studi Sociali e Politici
Università degli Studi di Milano
Via Conservatorio 7,
20122 Milano, Italy.
Tel.: 02 50321237; Fax: 02 50321240
Diagnostic reasoning is a critical aspect of clinical performance, having a high impact on quality
and safety of care. Although diagnosis is fundamental in medicine, we still have a poor
understanding of the factors that determine its course.
According to the traditional approach, all information used in diagnostic reasoning is objective
and logically derived. However, these conditions are not always met. Although we would be less
likely to miss a diagnosis when following rational decision making, as described by normative
models, the real diagnostic process works in a different way. Recent work has described the major
cognitive biases in medicine as well as a number of strategies for reducing them, collectively
called debiasing techniques. Although cognitive science has given rise to an extensive empirical
literature on cognitive biases in medical decision making, these advances have encountered
obstacles in actually entering medicine. While the traditional approach to clinical reasoning
failed to consider contextual factors, most debiasing techniques seem to fail to foster sounder
and safer medical praxis. Technological solutions, being data-driven, are fundamental in
increasing care safety, but they need to interact with the human factor. Thus, balanced models,
cognitively driven and technology based, are needed in day-to-day applications to actually
improve the diagnostic process.
The purpose of this article is to provide insight into the cognitive influences that result in
wrong, delayed or missed diagnoses. We will use a cognitive approach to describe the basis of
medical error, with particular emphasis on diagnostic error, and then propose a conceptual scheme
of the diagnostic process based on fuzzy cognitive maps.
Key words: medical decision making, diagnostic error, cognitive biases, heuristics, cognitive maps,
clinical reasoning
Safety and quality of care are not only a matter of technical skills, scientific knowledge and
guidelines. On the contrary, contemporary medicine often faces obstacles that lie outside the
realm of medicine, being intrinsically due to the way the human mind works. In this sense, current
medical practice represents an intriguing field for applying cognitive science. This is
particularly true of the diagnostic process. Indeed, the analysis of cognitive processes within
medicine requires sophisticated models to explain the complex path that leads from signs and
symptoms to a diagnosis.
Cognitive science may thus help physicians to properly apply their knowledge in searching and
filtering clinical and laboratory data, in developing a differential diagnosis and, finally, in
making a rational decision without falling into cognitive traps.
Much research has been devoted to this important topic, but little is still known about the deep
nature of the diagnostic process, both when it succeeds and when it fails. Diagnostic error
accounts for a substantial fraction of all medical errors and has received increasing attention in
the last 30 years [1], starting from the pioneering work of Elstein and colleagues [2].
Since then, interest in diagnostic processes and errors has increased, and related studies have
called previous data and traditional medical knowledge into question. Particularly interesting is
the current debate concerning the importance of competence and of the board certification process
in minimizing the risk of diagnostic errors.
At least four areas of competence were identified as necessary [3]: medical interviewing, physical
examination, data analysis and clinical judgment.
We agree that a physician must be competent in these areas to minimize errors. However, we do not
consider this to be enough, as errors are not caused only by a lack of competence or knowledge. In
this sense, cognitive science may contribute by uncovering other pathways to understanding the
nature of diagnostic error [4].
In this article we will first briefly review error rates before describing the cognitive roots of
diagnostic errors. A cognitive approach will be presented and suggestions for future research will
be made.
Diagnostic errors
When discussing medical errors, many possible patterns, causes, and covariant factors must be
considered. The field of medical errors is broad and many different scenarios can be analyzed from
different perspectives.
In his classic studies of clinical reasoning, Elstein [5] estimated the rate of diagnostic errors
to be approximately 15%, fairly consistent with the 10% to 15% error rate determined in autopsy
studies [6]. However, other researchers have found more alarming data. For instance, Leape [7]
reported astonishing numbers in the domain of diagnosis, quoting several autopsy studies with
rates of missed diagnoses as high as 35-40%.
Shojania and colleagues [8] reviewed autopsy studies focused on the diagnosis of pulmonary
tuberculosis, highlighting that almost 50% of these diagnoses had not been made ante mortem. A
similar percentage was found by Pidenda et al. [9] in an analysis of cases of fatal pulmonary
embolism. Finally, Schiff and colleagues [10] analyzed 583 physician-reported errors by surveying
physicians from 22 hospitals across the US. The results indicated that 69% of reported errors were
moderate or severe and that the most frequently missed or delayed diagnoses were pulmonary
embolism and drug reactions, while stroke and coronary syndrome had a significant, though lower,
frequency.
It is interesting to note that most reported errors were attributed to information analysis.
Surveyed physicians admitted failures or delays in identifying significant clues and in
prioritizing clinical information, pointing to cognitive-related difficulties.
Framing the medical choice
Framing is an immediate process that leads a physician to search for information, both general and
specific, coming from a variety of sources: patients, family, colleagues, laboratories, nursing
staff, specific databases and tacit knowledge. All this information needs to be weighed for
relevance and reliability before being integrated into a given template or, more properly, a
mental model [11]. Doctors need to build one as soon as possible because it simplifies the
situation, deleting “unnecessary” information and directing future actions.
This first mental model can be described as the starting point of the diagnostic process. Starting
from this mental structure, which is based on schemes already present in long-term memory,
physicians are able to evaluate the consequences of each possible choice (diagnostic or
therapeutic interventions), in order to plan future actions, choosing options and possibly
entirely revising the mental model first created. These processes only apparently follow a
formally defined logical path. In reality they call into question several disturbing factors
(personal, relational or context-related factors such as working conditions, time, availability of
beds, diagnostic tools and general resources), which can significantly affect the linearity of the
process and make it, in some unfortunate cases, a real trap for both doctors and patients. The
initial data processing constitutes a particularly insidious trap, since cognitive failures have
been found to occur especially in the information-synthesis stage, where a physician has to
process and combine incoming data [12].
Probably owing to a poor mental model [11], the diagnostic process is vulnerable to so-called
cognitive traps: implicit mechanisms that contaminate reasoning and decision making, generally
referred to as biases. Physicians are thus implicitly guided by cognitive simplifications.
These activities of the mind can be characterized as a top-down process based on specific
cognitive functions called heuristics [13, 14, 15, 16, 17, 18]. Heuristics are rules of thumb that
facilitate our cognitive work, providing inexpensive and effective ways of solving complex
problems in a short time. Although they produce good results in many situations, they do not
always work properly in the clinical setting.
Several heuristics that apply in medicine [19] are reported in the literature, which shows that
more than 40 cognitive biases may affect clinical reasoning [20, 21, 22]. We highlight here the
most commonly described (see Table 1).
Table 1: Heuristics and cognitive biases affecting medical decision making.

Availability: The shortcut taken to avoid a formal estimation of disease frequency in the
"population" defined by the set of signs and symptoms a given patient presents, replacing it by
considering only the ease with which a particular diagnostic hypothesis comes to mind. The
availability heuristic can lead to flawed clinical judgments through various mechanisms, for
example by overweighting a physician's past experience or by underestimating the actual rates of
disease presentation.

Representativeness: Judging the likelihood of a condition on the basis of the typical clinical
picture, without considering that, within a given context, the whole set of atypical presentations
is much more likely than the typical one. Basing judgment on memory excludes considerations of
probability and leads to a fast diagnosis that may suffer from a severe evaluation error. The
representativeness heuristic, in fact, assigns the probability of an event based on its similarity
to the traditional framework, without considering the prior probability.

Anchoring: The tendency not to take into account the most appropriate diagnostic information. In
particular, the anchoring effect occurs when clinicians faithfully adhere to an initial impression
even as conflicting and contradictory data accumulate.

Confirmation bias: When a diagnostic hypothesis is developed on a weak or ambiguous basis, the
tendency to seek confirmation rather than refutation leads physicians to ignore information that
may conflict with the first diagnosis.

Premature closure: Physicians formulate a quick diagnosis (often based on pattern recognition),
fail to consider other possible diagnoses and stop collecting data (jumping to conclusions), often
even when the suspected diagnosis has not been confirmed by appropriate testing.

Overconfidence bias: The tendency of decision makers to trust their judgment ability beyond any
rational consideration. Well-calibrated judgments are actually the exception; generally speaking,
physicians have been shown to be overconfident in their own judgment ability.

Attribution error: Negative stereotypes lead clinicians to ignore or minimize the possibility of
serious disease. For example, clinicians might assume that an unresponsive patient with an odor of
alcohol is “just another drunk” and miss hypoglycemia or intracranial injury.

Self-serving bias: The tendency to consider problems (such as a diagnostic puzzle) only from one's
own perspective. This bias poses important limits on communication and information exchange
between experts.

Affect heuristic: Emotional cues may heavily affect judgment, introducing contextual disturbance
factors. These factors may play a major role in forming the first impression and in guiding the
judgment process. Specific clinical situations, for example, provoke lesser or greater degrees of
affective valence, thus shaping the physician's mindwork.
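The representativeness entry in Table 1 turns on neglect of the prior probability. A minimal
numeric sketch of this point, using Bayes' rule with invented illustrative numbers (not real
epidemiological data), shows how a highly "typical" presentation can still leave a rare diagnosis
unlikely:

```python
def posterior(prior, sensitivity, false_positive_rate):
    """Bayes' rule: P(disease | picture), given the base rate (prior), the
    probability of the picture when the disease is present (sensitivity)
    and when it is absent (false_positive_rate)."""
    p_picture = sensitivity * prior + false_positive_rate * (1.0 - prior)
    return sensitivity * prior / p_picture

# Invented numbers: a rare disease (1 in 1,000) whose typical picture is
# almost always present with the disease, yet also occurs in 5% of
# patients without it.
p = posterior(prior=0.001, sensitivity=0.95, false_positive_rate=0.05)
# The posterior stays below 2%: despite the typical presentation, the
# rare diagnosis remains unlikely once the base rate is considered.
```

A physician reasoning by representativeness effectively equates P(disease | picture) with the 95%
sensitivity, whereas the base rate pulls the true posterior down by almost two orders of magnitude.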
Cognitive heuristics can lead to systematic and predictable errors, or biases, that resemble
optical illusions. Systematic and predictable cognitive biases in judgment and decision making may
explain the persistence of a core of diagnostic errors and the difficulty of preventing them.
Physicians' use of heuristics stems from a basic principle of the human mind: the existence of two
separate ways of thinking.
Two ways of thinking
Dual process theories describe the existence of two parallel thinking systems. System 1 is described
as intuitive, automatic, implicit and associative [23, 24]. System 2 corresponds to serial and slow
processes activated when necessary in order to approach new and/or particularly complex problems.
It is analytical in nature, it requires conscious effort to be activated and it consumes resources.
System 1 has also been described as having a phase-locked activation, while system 2 shows a more
tonic activity [25]. Roughly speaking, the human mind uses either intuitive or analytical
processes to solve a problem, formulate a judgment or make a decision.
Novices seem to rely more on the analytical system than experts do, since they have yet to develop
the valid tacit knowledge needed to make intuitive decisions. For instance, Gabbay and le May [26]
describe the need for novices to openly use guidelines, a process that requires the activation of
system 2, whereas expert doctors rely more on "mindlines", subtle strategies developed with
experience and heavily, though not exclusively, based on system 1. However, not all empirical
results converge on this topic [27]. Furthermore, the success or failure of a diagnosis does not
inherently depend on the use of system 1 or system 2. Even when following formal and standardized
procedures, a physician may fail to arrive at a good diagnosis because of incompetence (e.g., lack
of knowledge of the use of probability), cognitive limitations or the impossibility of collecting
all the information needed in a given spatial/temporal context (e.g., because epidemiological data
or clinical tests are not perfectly accurate). We may then conclude that both system 1 and system
2 are essential to the human decision maker. However, they serve different aims and are affected
by different biases as well as by emotional and contextual interferences [16].
System 2, in particular, processes information in a more abstract form, so it is relatively
independent of contingency and marginal clues. System 2 is thus the natural locus of
hypothetico-deductive reasoning. By contrast, system 1 is strongly affected by contextual factors
(e.g., the particular context in which a consultation occurs) as well as by the emotional valence
of the situation [14, 16]. This does not mean, however, that system 1 is purely emotion-driven or
that system 2 is isolated from the emotional system.
On many occasions a decision maker is not even aware that system 1 has already started to work and
that the final decision will be taken on the basis of implicit processes. This is because system 1
is phase-locked, operating in a bottom-up and rapid fashion. In this way implicit cognitive
distortions, similarly to perceptual illusions, may affect the decision process [15, 28].
The cognitive balanced model
Over the last 10 years a number of studies have addressed the problem of diagnostic errors.
However, Elstein [29] pointed out that the actual contribution of the various biases described in
the literature to diagnostic error is not clear. Furthermore, little improvement has been observed
since these studies, suggesting that further research as well as intervention strategies are
needed to reduce diagnostic errors. The failure to advance in the field may well be linked to the
intrinsic difficulty of conducting research in it. In particular, Wears and Nemeth [30] pointed
out that most scientific research on medical error is based on retrospective studies, which start
from a failure or a medical error and try to identify simple, well-defined sources of error. In so
doing, our judgment is strongly affected by the hindsight bias, that is, the tendency to evaluate
a process starting from its outcome. Hindsight may guide researchers' work in framing hypotheses
and in collecting and interpreting data.
However, following several authors [12, 31, 32], a limited number of functioning principles may be
called on to explain a large proportion of diagnostic errors: premature closure (the tendency to
stop considering other possibilities after reaching a diagnosis) and overconfidence (the tendency
to overestimate one's own judgment ability). Interestingly, experienced physicians are as likely
as novices to exhibit premature closure, and older physicians may be particularly predisposed to
both premature closure and overconfidence, probably because of age-related cognitive constraints
and the development of expertise [33].
Premature closure is a goal-directed, general tendency of the cognitive system. It can be
described as the result of our experiences, our limits (both cognitive and behavioural) and our
goals. We can thus trace the route to a premature closure in basic cognitive, motivational and
relational aspects. Many drivers push towards premature closure in a clinical setting, at various
levels; some of these levels may be controlled, others cannot. For example, the need for premature
closure may derive from a strong need for emotional and cognitive discharge. In fact, when a
physician approaches a new case, cognitive load increases. If this load remains active for a long
time it may lead to cognitive stress, since it requires resources and may subtract time from other
tasks. At the same time, a physician may experience an emotional charge when a diagnosis appears
difficult.
Overconfidence, too, is the consequence of a number of direct and indirect drivers, including age
and experience. Overconfidence and premature closure may constitute two basic elements of
so-called dysrationalia, a term coined by Stanovich [34] to indicate the relative independence of
judgment performance and judgment ability. Intelligent and competent physicians may well know the
best (mental) path to take even while taking another one [4, 35].
We believe that the reason these two complex biases are reported as playing a key role in
diagnostic error lies in the very functioning of the mind.
As described above, the human mind makes use of two ways of thinking: an intuitive, automatic
system and an analytical, controlled one. Both premature closure and overconfidence seem to derive
from a complex interaction between these two global systems. In particular, premature closure may
be described as the result of unbalanced cognitive functioning, in which the need to activate
system 2, combined with a lack of the resources to do so (time, emotional/cognitive load and so
on), forces the cognitive work to close even though the diagnostic process is still in progress.
In this mechanism, overconfidence may play an important role. In fact, biased self-confidence may
boost premature closure by preventing the activation of system 2, thus leading to unbalanced
cognitive functioning.
We propose here that a rational diagnostic process should make use of both system 1 and 2, in order
to achieve a balanced mix of intuition and analysis, “mindlines” and guidelines.
Figure 1. The Cognitive Balanced Model.
In this model, the set of information provided by a clinical case activates the mental processes
of the physician, starting with rapidly reacting systems (value and emotional structures). This
fast reaction enables the pre-activation of procedural and declarative memories. In a second step
a thought system is activated. System 1 is faster and more sensitive to emotional and
physiological activity. In fact, system 1 works as a phase-locked system and shows rapid
variability when provided with input. System 1 may lead to judgments and decisions, but it may
also activate or inhibit system 2 for further evaluation. System 2 is slower and always starts
after system 1, which may in turn be hyper-activated or inhibited by system 2 in a circular link.
Education and experience should lead a physician to balance the relationship between system 1 and
system 2, but this interplay often degenerates. In fact, most of the time a decision, e.g. a
diagnosis, is the consequence of tacit, repetitive rules. In other situations analytical, formal
reasoning is forced by the application of rigid procedures or decision supports.
In both cases the diagnostic course is the result of an unbalanced process. In many cases, expert
decision makers trust their intuition too much, thus preventing system 2 from monitoring the
process and engaging in analytical reasoning when required. At the same time, novices may fail to
use intuition, relying too heavily on guidelines, procedures or (true or false) authorities.
However, the picture is not that simple, since contextual, cultural and organizational factors may
intervene to push decisions in an analytical or intuitive direction, again leading to unbalanced
processes.
We think that an unbalanced process of reaching a diagnosis is the rule, since tools such as
decision aids, training and education courses that directly address the need to use information in
a cognitively balanced way are lacking.
Researchers generally suggest that physicians should learn to use debiasing techniques such as
meta-cognition (teaching physicians to ask themselves "What alternatives should be considered?"
before closing a case), high-fidelity simulation and critical reflection on their own practice in
order to develop and maintain medical expertise throughout life [1, 33]. The use of systematic
checklists and computer-based aids able to suggest alternatives and to highlight relevant clues or
incoherent choices will increase diagnostic accuracy and will certainly help physicians reduce the
likelihood of making a wrong diagnosis in the near future [36].
However, debiasing methods cannot replace physicians' expertise and/or their intuition. Most of
the debiasing systems proposed and tested have yielded limited results. As recently pointed out by
Norman and Eva [37], who reviewed the diagnostic error literature, physicians seem to be
particularly vulnerable to error when they try to be analytical, i.e. when they force the use of
system 2 instead of letting system 1 work naturally. Indeed, most successful diagnoses are
reported to be based on intuitive judgment rather than formal reasoning.
It is then clear that we have to address the unbalanced diagnosis in order to develop
health-related technology that is able not only to provide information and hints but also to
generate a strong learning environment [38] in which sound decision skills can be developed by
using both system 1 and system 2. In particular, we suggest that a broader use of fuzzy cognitive
maps [39], a soft computing technique that combines human expertise and analytical algorithms,
would enhance physicians' awareness of the diagnosis-related cognitive flow. At the moment the use
of these techniques is limited, since research is directed toward simpler debiasing models or
fully automatic data mining processes.
Fuzzy cognitive maps, referred to here as cognitive fuzzy maps (CFMs), are conceptual graphic
representations similar to a graph with nodes linked by branches. In contrast to a standard graph
(such as a decision tree), a CFM represents a complex system without a given direction, since each
node may have several unidirectional or reciprocal connections with other nodes. These connections
may have a positive or a negative effect, activating or inhibiting the linked node. Furthermore, a
node is not an on/off switch: each node may be partially active or inactive, varying between 0 and
1. A CFM may run on a computer using software able to simulate different situations. In this sense
a CFM may be an important decision aid, since it can simulate the interactions between different
parameters in a given clinical case.
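The simulation step can be sketched in a few lines. The following is a minimal illustration, not
the authors' implementation: the three clinical nodes and their weights are invented for the
example, and the update rule follows the common sigmoid formulation of fuzzy cognitive maps, in
which each node's next activation is the squashed weighted sum of the activations of the nodes
pointing at it.

```python
import math

def squash(x):
    """Sigmoid squashing keeps every activation in the open interval (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def run_cfm(weights, state, steps=20):
    """Iterate the map: each node's next activation is the squashed weighted
    sum of its incoming activations. This simple variant has no self-memory,
    so nodes without incoming links settle at squash(0) = 0.5."""
    nodes = list(state)
    for _ in range(steps):
        state = {
            n: squash(sum(weights.get((src, n), 0.0) * state[src] for src in nodes))
            for n in nodes
        }
    return state

# Hypothetical three-node map: two signs and one diagnosis node.
# Positive weights excite the target node; negative weights would inhibit it.
weights = {
    ("dyspnea", "pulmonary_embolism"): 0.8,
    ("chest_pain", "pulmonary_embolism"): 0.5,
    ("pulmonary_embolism", "chest_pain"): 0.3,  # a re-entry link
}
initial = {"dyspnea": 1.0, "chest_pain": 0.2, "pulmonary_embolism": 0.0}
final = run_cfm(weights, initial)
```

Running the same map with different initial activations simulates different clinical
presentations, which is what makes the map usable as a decision aid.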
However, what matters most to us is the construction process of a CFM. Unlike analytical models
and decision trees, a CFM is constructed through a cognitive as well as a social working process.
A panel of experts is called to consider a general diagnostic problem. Each expert brings specific
(past experience) and general (education) knowledge, as well as intuitive expertise and skills.
Together they are able to develop a first static picture of the problem, defining each node and
its connections. This process may be thought of as similar to the construction of a scenario [40].
In a second step, a positive or negative value is assigned to each link, and re-entry links may be
added to give rise to a dynamic representation. Finally, different CFMs may be compared and
discussed so that a final CFM can be constructed. This CFM may be tested on a computer and
validated by comparing its actual output with the expected results in well-known clinical
pictures.
In this way both analytical and intuitive knowledge are used to suggest optimal clinical
decisions. Interestingly, the construction of a CFM does not require quantitative knowledge or
mathematical skills; instead, it always starts with a pattern of linguistic, qualitative
descriptions of a situation. Physicians are not required to quantify the strength of a connection
or other parameters; they only need to state an intuitive comprehension of the conceptual model.
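The translation from verbal judgments to the numeric weights the map needs might look like the
sketch below. The seven-level scale, its anchor values and the averaging rule are our own
illustrative assumptions, not a standard; in practice the panel would agree on its own anchors and
aggregation method.

```python
# Hypothetical mapping from panel language to signed edge weights.
LINGUISTIC_WEIGHTS = {
    "strongly increases": 0.75,
    "moderately increases": 0.5,
    "weakly increases": 0.25,
    "no influence": 0.0,
    "weakly decreases": -0.25,
    "moderately decreases": -0.5,
    "strongly decreases": -0.75,
}

def edge_weight(description):
    """Translate one expert's verbal judgment about a causal link into the
    signed weight used by the map; unknown phrasings are rejected explicitly."""
    try:
        return LINGUISTIC_WEIGHTS[description.lower().strip()]
    except KeyError:
        raise ValueError(f"unrecognized linguistic label: {description!r}")

def aggregate(judgments):
    """Combine several experts' judgments on the same link by simple
    averaging, one plausible way to merge panel opinions into one weight."""
    weights = [edge_weight(j) for j in judgments]
    return sum(weights) / len(weights)

# Three experts describe how one sign influences the suspicion of a disease.
w = aggregate(["strongly increases", "moderately increases", "strongly increases"])
```

The key design point is that experts never see the numbers: they argue in words, and the numeric
layer remains an internal detail of the map.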
As shown in experimental studies, a cognitively balanced decision with potentially positive
effects on the diagnostic process can be reached by using specific CFMs in particularly complex
situations [41]. CFMs may strongly contribute to balancing intuition and analysis, since they are
built to integrate an operator's (e.g. a physician's) personal experience into a system able to
derive decisions. The great advantage of this approach is that it makes it possible to use
heuristics and intuitive knowledge within a well-stated conceptual scheme [39]. In this way the
analytical and the intuitive parts, often described as divergent lines in a decision process, find
a natural converging trajectory toward a balanced diagnosis.
We believe that the systematic use of CFMs may be particularly important in giving rise to a
strong learning environment, both in education and in daily clinical activity. Furthermore, this
activity will help to balance the physician's cognitive decision system. In fact, the use of CFMs
implies the integration of intuitive and explicit knowledge, thanks to the need to give a verbal,
and thus explicit, form to intuitive expertise. The cognitive model that links clinical
information to a diagnosis is logically expressed in a CFM and may thus be confirmed or partially
disconfirmed by specific practices. The comparison between actual and optimal outcomes may lead
both to a tuning of the CFM used and to an increase in the physician's knowledge thanks to
explicit feedback.
Furthermore, the construction of CFMs includes useful social activities, as experts working in a
given environment (e.g. a hospital or a ward) can share their experiences and knowledge, thereby
giving rise to a shared intelligence of specific situations and problems. In this way,
idiosyncratic rules constructed in daily activity may be exposed and openly discussed.
We believe that the diagnostic problem needs to be approached through an interdisciplinary effort.
The use of CFMs may be important in helping physicians achieve a balanced process towards a
diagnosis, though not necessarily an optimal one.
Research on diagnostic error prevention will necessarily pass through the development of health
technology that can support medical decision making. Physicians will have to learn to interact
with technological decision aids and artificial expert systems. At the same time, health settings
should favor the development of tacit natural knowledge, in order to facilitate the development of
sound expertise and intuition.
1.Mamede, S., Schmidt, H.G. and Rikers, R. (2007). Diagnostic errors and reflective practice in
medicine. J Ev Clin Pract,13,138–145.
2.Wachter, R.M., Holmboe, E.S (2009). Diagnostic errors and patient safety. JAMA, 15, 258.
3. Elstein, A.S. (1976). Clinical judgment, psychological research and medical practice. Science,
4. Croskerry, P. (2009). Clinical cognition and diagnostic error, applications of a dual process
model of reasoning. Adv Health Sci Educ Theory Pract, 1, 27-35
5. Elstein, A.S. (1995). Beyond multiple-choice questions and essays, the need for a new way to
assess clinical competence. Acad Med 1995, 68, 23-45.
6. Kirch, W., Schafii, C. (1996). Misdiagnosis at a university hospital in 4 medical eras. Medicine,
7. Leape, L.L. (1994). Error in medicine. JAMA, 21,1851-7.
8. Shojania, K., Burton, E., McDonald, K., et al. (2002). The autopsy as an outcome and
performance measure, evidence report/technology assessment. Agency for Healthcare Research and
Quality. AHRQ Publication No. 03-E002.
9. Pidenda, L.A., Hathwar, V.S., Grand, B.J. (2001). Clinical suspicion of fatal pulmonary
embolism. Chest,120, 791–795.
10. Schiff, G.D., Kim, S., Abrams, R., Cosby, K., Lambert, B., Elstein, A.S., Hasler, S., Krosnjar,
N., Odwazny, R., Wisniewski, M.F., McNutt, R.A. (2005). Diagnosing Diagnosis Errors, Lessons
from a Multi-institutional Collaborative Project. In, Henriksen, K., Battles, J.B., Marks, E.S.,
Lewin, D.I. (eds). Advances in Patient Safety, From Research to Implementation. Rockville (MD),
Agency for Healthcare Research and Quality (US)..
11. Reason, J. (1990). Human Error. New York, NY: Cambridge University Press.
12. Graber, M.L., Franklin, N., Gordon, R. (2005). Diagnostic error in internal medicine. Arch
Intern Med, 165, 1493-9.
13. Kahneman, D., Tversky, A. (1979). Prospect Theory, An analysis of decision under risk.
Econometrica, 47,111-132.
14. Kahneman, D., Tversky, A. (1973). On the psychology of prediction. Psych Rev, 80, 237-251.
15. Kahneman, D., Tversky, A. (1972). Subjective probability: A judgment of representativeness. Cog Psych, 3, 430-454.
16. Slovic, P., Finucane, M., Peters, E., MacGregor, D.G. (2006). The affect heuristic. In T. Gilovich, D. Griffin, D. Kahneman (Eds.), Heuristics and biases: The psychology of intuitive judgment (pp. 548-558). Cambridge, England: Cambridge University Press.
17. Gigerenzer, G., Todd, P.M., and the ABC Research Group (1999). Simple Heuristics That Make
Us Smart. New York: Oxford University Press.
18. Tversky, A., Kahneman, D. (1981). The framing of decisions and the psychology of choice.
Science, 211, 453-458.
19. Croskerry, P., Abbass, A.A., Wu, A.W. (2008). How doctors feel: affective issues in patients' safety. Lancet, 372, 1205-6.
20. Croskerry, P. (2009). A universal model of diagnostic reasoning. Acad Med, 84, 1022-8.
21. Andre, M., Borgquist, L., Foldevi, M., Molstad, S. (2002). Asking for rules of thumb: a way to discover tacit knowledge in general practice. Fam Pract, 19, 617-622.
22. Croskerry, P. (2003). The importance of cognitive errors in diagnosis and strategies to minimize them. Acad Med, 78, 775-780.
23. Stanovich, K. (1999). Who Is Rational? Studies of Individual Differences in Reasoning. Mahwah, NJ: Lawrence Erlbaum Associates.
24. Epstein, S., Pacini, R., Denes-Raj, V., Heier, H. (1996). Individual differences in intuitive-
experiential and analytical-rational thinking styles. J Pers Soc Psych, 71, 390-405.
25. Lucchiari, C., Pravettoni, G. (2010). Mind The Gap. Milan: Unicopli.
26. Gabbay, J., le May, A. (2009). Practice made perfect? Discovering the role of a community of general practice. In A. le May (ed), Communities of practice in health and social care. Oxford:
27. Elstad, E.A., Lutfey, K.E., Marceau, L.D., Campbell, S.M., von dem Knesebeck, O., McKinlay,
J.B. (2010). What do physicians gain (and lose) with experience? Qualitative results from a cross-
national study of diabetes. Soc Sci Med, 70, 1728-36.
28. Croskerry, P. (2002). Achieving quality in clinical decision making: cognitive strategies and detection of bias. Acad Emerg Med, 9, 1184-204.
29. Elstein, A.S. (2009). Thinking about diagnostic thinking: a 30-year perspective. Adv Health Sci Educ Theory Pract, 1, 7-18.
30. Wears, R.L., Nemeth, C.P. (2007). Replacing hindsight with insight: Toward better understanding of diagnostic failures. Ann Emerg Med, 49, 206-206.
31. Norman, G. (2009). Dual processing and diagnostic errors. Adv Health Sci Educ, 14, 37-49.
32. Berner, E.S., Graber, M.L. (2008). Overconfidence as a cause of diagnostic error in medicine. Am J Med, 121, 22-23.
33. Choudhry, N.K., Fletcher, R.H., Soumerai, S.B. (2005). Systematic review: the relationship between clinical experience and quality of health care. Ann Intern Med, 142, 260-273.
34. Stanovich, K. (1993). Dysrationalia: A new specific learning disability. J Learn Disabil,
35. Croskerry, P. (2011). Commentary: Lowly interns, more is merrier, and the Casablanca Strategy. Acad Med, 86, 8-10.
36. Newman-Toker, D.E., Pronovost, P.J. (2009). Diagnostic errors: the next frontier for patient safety. JAMA, 301, 1060-1062.
37. Norman, G.R., Eva, K.W. (2010). Diagnostic error in clinical reasoning. Med Educ, 44, 94-100.
38. Hogarth, R.M. (2001). Educating intuition. Chicago: University of Chicago Press.
39. Kosko, B. (1986). Fuzzy cognitive maps. Int J Man-Machine Studies, 24, 65-75.
40. Kok, K. (2009). The potential of Fuzzy Cognitive Maps for semi-quantitative scenario development, with an example from Brazil. Glob Environ Change, 19, 122-133.
41. Papageorgiou, E.I., Spyridonos, P.P., Glotsos, D.T., Stylios, C.D., Ravazoula, P., Nikiforidis, G.N., Groumpos, P.P. (2008). Brain tumor characterization using the soft computing technique of fuzzy cognitive maps. Appl Soft Comput, 8, 820-828.