
Striving for Certainty: Epistemic Motivations and (Un)Biased Cognition

Authors: Małgorzata Kossowska, Gabriela Czarnek, Ewa Szumowska and Paulina Szwed
DOI: 10.4324/9781003111474-11
Introduction
In this chapter, we will focus on how the quest for certainty drives cognition
and thereby affects knowledge formation and usage. Traditionally, this quest has
been linked to closed-minded cognition, that is, to forming rigid knowledge
and belief systems resistant to change (Kruglanski, 1989). Closed-mindedness
leads people to believe they are in possession of an absolute truth, which is
why they uncritically ignore, discount, or reject evidence that is discrepant with
their important beliefs (usually linked to identity). This usually drives inaccurate
and biased cognition and implies a tendency to maintain in one’s mind a single
perspective along with the conviction of its unquestionable correctness, which
results in the rejection of other perspectives. This also leads to knowledge resistance, that is, a failure to accept available and established knowledge.
The motivation to achieve certainty is, however, not always associated with closed-minded (and biased) cognition, and in this chapter we will put forward an alternative view to account for this. More specifically, we claim that the quest for certainty is a goal that can be attained by various means. These means may be chosen from a range of cognitive strategies, either biased and identity-protective or accuracy-oriented, depending on how useful (i.e. instrumental) they are perceived to be for the overarching goal of epistemic certainty. Epistemic certainty about the past and present state of the world refers to what we know. Epistemic uncertainty, however, arises because of what we do not know but could know in theory (e.g. uncertainty due to limitations of the sample or methodology) (van der Bles et al., 2019). When identity-protective strategies are adopted, the beliefs that a person holds remain unchanged, or are even strengthened, due to the rejection of claims with good evidence against one’s view or the endorsement of claims with no credible evidence that support one’s beliefs or identity. However, when accuracy-oriented strategies are adopted, existing beliefs may be altered by the incoming information. This implies the capacity to retain diverse perspectives in one’s mind, to accept their diversity, and to review them critically. In consequence, it becomes possible to change one’s beliefs and judgements whenever new and more credible information is revealed.
Cognition is Motivated1
The construction of new knowledge is a persistent human activity. For activities ranging from the relatively simple and mundane to the highly complex, new knowledge is essential to ensure confident decisions and reasoned actions. Given the prevalence of the knowledge formation process, and its essential psychological relevance to human thoughts, feelings, and actions, understanding how knowledge is formed and changed is a task of considerable importance for psychological science (Kruglanski, 2004). According to lay epistemic theory (Kruglanski, 1989), contrary to popular belief, individuals do not gather information in a chaotic and random manner. Research has rather shown that knowledge formation is a process of hypothesis generation and validation, which is quite orderly and follows logical rules, such as “if – then”, from premise to conclusion (Kruglanski et al., 2009). The conclusion is knowledge: an opinion, a belief, or a judgement. This process occurs regardless of the quality of the information acquired (evidence may be reliable or unreliable). It also occurs regardless of the engagement of the person involved (one may wish to know what the truth is, or simply to confirm their initial expectations). It emerges whenever an individual learns of something that is sufficiently important to initiate the motivational process that underlies cognition.
Kruglanski et al. (2009) demonstrate that the manner in which people generate hypotheses is reliant on cognitive resources. These may be modified by exhaustion and by people’s readiness to engage in cognitive activity. The more cognitive resources available, the more alternative hypotheses can be generated. However, cognitive exhaustion (e.g. when several activities are conducted at once, when too much similar information is given, or when there is outright information chaos) or high epistemic motivation (i.e. the desire to develop and maintain a rich and thorough understanding of a situation) usually limits the scope of the hypothesis generation process. As a result, people tend to generate only a small number of hypotheses about the event.
The process of hypothesis validation, however, depends on prior knowledge and its level of activation, as well as on the quality and strength of the available evidence. These factors work together in shaping the processes of selecting and evaluating information and, in effect, the adoption or rejection of a hypothesis, and thus the formation of knowledge. A further factor that plays a crucial role here is epistemic motivation. This factor affects the degree of confidence in one’s knowledge and influences the propensity to continue or stop searching for information. It also impacts decisions concerning which information can be considered “evidence”. It shapes readiness to update one’s beliefs in the light of emerging new evidence (Kruglanski et al., 2009). This epistemic process may manifest in knowledge resistance or in openness to updating based on credible evidence.
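How motivation can regulate the decision to continue or stop searching can be illustrated with a small simulation. This is a minimal sketch under our own assumptions (the function name, threshold values, and diagnosticity parameter are hypothetical, not taken from the lay-epistemics literature): confidence in the leading hypothesis is updated with each new piece of evidence, and the search ends once a motivation-dependent confidence threshold is crossed.

```python
import random

def search_until_confident(threshold, diagnosticity=0.7, max_steps=50, seed=None):
    """Toy model: sample evidence until confidence in the leading
    hypothesis crosses a motivation-dependent threshold.

    threshold     -- confidence required before the search stops
                     (higher = motivation to keep searching longer)
    diagnosticity -- P(supporting evidence | hypothesis is true)
    """
    rng = random.Random(seed)
    belief = 0.5  # flat prior: hypothesis equally likely true or false
    for step in range(1, max_steps + 1):
        supports = rng.random() < diagnosticity  # observe one piece of evidence
        lt = diagnosticity if supports else 1 - diagnosticity
        # Bayesian update of the belief in the hypothesis
        belief = lt * belief / (lt * belief + (1 - lt) * (1 - belief))
        confidence = max(belief, 1 - belief)  # confidence in the leading answer
        if confidence >= threshold:
            return step, round(belief, 3)  # "certain enough": stop searching
    return max_steps, round(belief, 3)

# A modest threshold is satisfied quickly; a stricter one keeps the search open.
print(search_until_confident(threshold=0.80, seed=7))
print(search_until_confident(threshold=0.99, seed=7))
```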
Epistemic motivation is usually initiated under uncertainty, i.e. when there is a lack of information (or access merely to low-quality, incomplete, or conflicting information) about whether, where, when, how, or why an event has occurred, or will occur (Knight, 1921). Uncertainty can be reduced by means of the acquisition of precise, unambiguous knowledge, whether of a specific content tied to one’s beliefs and preferences or regardless of its specificity. Thus, this type of motivation influences different epistemic behaviors, including the active search for information that is subjectively considered relevant and valid. Such information can serve as “evidence”. The behaviors initiated under epistemic motivation can also encompass the active avoidance of information subjectively considered nonrelevant or nonvalid. In addition, epistemic motivation itself can generally be classified into two kinds: the need for non-specific certainty, and the need for specific certainty (Kruglanski, 1989). Whereas the former reflects the need to possess any certain answer on a topic (e.g. whether vaccination against Covid-19 is safe and effective), the latter refers to the need to attain a concrete judgement, opinion, and/or assessment (e.g. that the vaccination against Covid-19 is indeed safe and effective). The need for specific certainty has an influence on cognition which has often been interpreted as a directional bias toward a favored conclusion (e.g. anti-vaccination advocates can interpret the side effects of a vaccine as proof that they were right). This particular motive has been the focus of much classic motivational work on attribution (e.g. Miller, 1976) as well as cognitive dissonance (Cooper & Fazio, 1984).
Moreover, the primary assumption of a great deal of traditional work on motivated reasoning is that the whole process of knowledge formation is motivated by prior beliefs (Kunda, 1990). It has been suggested that people form their current beliefs based both on prior beliefs and on the cogency of the new relevant evidence (Kruglanski et al., 2020). In this view, prior beliefs serve as (internal) models of (external) reality, and are used to make predictions about the world. However, any actions or perceptions are subject to optimization, and the explanations accounting for the new evidence need to be as accurate as possible. Consequently, there are two ways of accounting for the new evidence: (1) updating one’s model or (2) acting on and sampling evidence so that it fits the model (Kruglanski et al., 2020). Taking the first of these paths, people construct mental models that enable them to predict and interpret subsequent experiences. This also provides them with a sense of understanding, even meaning (Proulx, Inzlicht, & Harmon-Jones, 2012). People become committed to the models they adopt, but may also change them. This process is defined as a change of expectations toward new stimuli that renders them consistent with what was already known. The second path, in turn, entails accounting for new evidence by searching for, interpreting, favoring, and recalling information in such a way as to confirm one’s preexisting beliefs or hypotheses (Nickerson, 1998). In this way, people may start out overconfident in an initial belief, fail to give proper consideration to alternative hypotheses, or interpret ambiguous information in favor of a firmly held belief (Klayman, 1995).
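The two paths can be stated compactly in Bayesian terms. Kruglanski et al. (2020) themselves draw the mapping between epistemic motivation and Bayesian computation; the notation below is our gloss on that idea, not a formula from the chapter’s sources:

\[
P(H \mid E) = \frac{P(E \mid H)\,P(H)}{P(E \mid H)\,P(H) + P(E \mid \neg H)\,P(\neg H)}
\]

Here the prior \(P(H)\) plays the role of the existing belief (the internal model), and the likelihood ratio \(P(E \mid H)/P(E \mid \neg H)\) captures the cogency of the new evidence. Path (1) amounts to letting the posterior \(P(H \mid E)\) replace the prior; path (2) amounts to selecting or reweighting \(E\) so that the posterior stays close to the prior.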
We now turn to the second class of epistemic motivation, the need for non-specific certainty, which reflects the need to arrive at any conclusion whatsoever that would serve the focal goal of achieving certainty (Kruglanski, 1989). In other words, the need for non-specific certainty drives the possession of any opinion, judgement, or belief, regardless of its content. This knowledge needs to provide a sense of certainty and adequacy, and to be subjectively sufficient to understand a given phenomenon. This type of epistemic motivation boils down to such things as (1) reducing the scope of information processing and hypothesis generation, (2) concentrating the process of seeking information on prototypical rather than diagnostic parameters, and (3) using the first available information. All these lead to a tendency to focus on evidence or facts that are presented earlier than others (primacy effect) and then to judge subsequent information against them (anchoring), as well as to the activation of stereotypical content and a preference for consensual and general knowledge (for an overview, see Roets et al., 2015).
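A minimal sketch of how effects (1) to (3) combine into a primacy-dominated judgement, in Python. The function and the numbers are purely illustrative assumptions of ours, not a model taken from Roets et al. (2015): an agent that “seizes and freezes” on early input stops integrating evidence once an answer is in hand.

```python
def impression(evidence, freeze_after=None):
    """Average evidence values in [-1, 1]; with 'seizing and freezing',
    everything after the first `freeze_after` items is ignored."""
    used = evidence if freeze_after is None else evidence[:freeze_after]
    return sum(used) / len(used)

stream = [0.9, 0.8, -0.6, -0.7, -0.8]      # early positive, later negative evidence
print(impression(stream))                   # open processing: -0.08
print(impression(stream, freeze_after=2))   # first-available information only: 0.85
```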
According to Kruglanski et al. (2020), by taking into account the need for specific or non-specific certainty, we are in a position to explicate diverse epistemic phenomena, such as seeking, avoiding, or biasing new information, and revising and updating, or protecting, one’s beliefs when confronted with new evidence. These processes are crucial to understanding knowledge formation and its usage.
Cognitive Effects of the Need for Specific Certainty
Among the best-documented effects of the need for certainty are confirmation (or myside) bias and disconfirmation bias (for an overview, see Nickerson, 1998). The former occurs when people accept evidence confirming their (important) beliefs without criticism, whereas the latter occurs when people try to undermine evidence contrary to their beliefs. It follows that one type of evidence that might be perceived as supporting one’s stance is mixed findings. In a classic study, Lord et al. (1979) found that people were more skeptical toward research that presented conclusions inconsistent with their beliefs (about the efficacy of the death penalty as a deterrent to murder). Specifically, people perceived the studies presented as more reliable and convincing when the results therein supported their own stance on the topic compared to those that did not. Intriguingly, the study methods themselves were presented to participants after the procedures were completed. The authors called this process biased assimilation. They concluded that, as a result of this process, when people are provided with mixed, inconclusive, or random evidence, biased assimilation leads to a further polarization of opinions. Similarly, in a study by Ditto and Lopez (1992; Studies 2–3), when people were presented with undesirable (vs. desirable) results of a medical test, it took them longer to decide whether their test result was complete, they were more likely to retest the validity of their result, and they rated the test’s accuracy as lower. This indicates that people were less skeptical of desirable than of undesirable evidence. Another study looking at the effects of mixed evidence was that of Bastardi et al. (2011), who analyzed responses to scientific evidence from would-be parents who deemed home care to be superior to day care with regard to a child’s future prospects. They compared two groups: conflicted parents (who were planning to use day care, although convinced that home care is superior) and unconflicted ones (who were planning to use home care only). Participants were presented with two studies with different research designs (based either on random assignment or on statistically matched samples) showing evidence for the superiority of one form of childcare or the other. The parents’ evaluations of the studies’ methodology favored the study that supported their desire (day care for the conflicted; home care for the unconflicted), but the effects were stronger for the conflicted group. Also, people in the conflicted group changed their beliefs about day care dramatically on being provided with the (mixed) evidence. Those in the unconflicted group changed their opinion only slightly. The authors concluded that “evaluations of purported scientific evidence were shaped more by what participants desired to be true than by what they had initially believed to be true” (p. 732).
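The polarization dynamic that Lord et al. describe can be made concrete with a toy computation, a sketch under our own assumed numbers rather than a model fitted to their data: two readers with opposite prior odds evaluate the same pair of equally strong, opposing studies, but each inflates the evidential weight of the congenial study and deflates the uncongenial one, so their beliefs diverge.

```python
def evaluate_mixed_evidence(prior_odds, bias=1.0):
    """Update prior odds on a claim with two opposing studies of equal force.

    Each study is a likelihood ratio (2.0 supports the claim, 0.5 opposes it).
    bias > 1 exponentially inflates the congenial study's weight and deflates
    the uncongenial one's, a crude rendering of biased assimilation.
    """
    pro, con = 2.0, 0.5
    pro_is_congenial = prior_odds > 1.0
    odds = prior_odds
    odds *= pro ** (bias if pro_is_congenial else 1 / bias)
    odds *= con ** (1 / bias if pro_is_congenial else bias)
    return round(odds, 3)

# Unbiased readers: the mixed evidence cancels out and priors stay unchanged.
print(evaluate_mixed_evidence(3.0), evaluate_mixed_evidence(1 / 3))              # 3.0, 0.333
# Biased assimilation: identical evidence pushes the two camps further apart.
print(evaluate_mixed_evidence(3.0, bias=2.0), evaluate_mixed_evidence(1 / 3, bias=2.0))
```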
These ndings illustrate how prior beliefs inuence knowledge formation.
However, not all beliefs are valued to the same extent. Hence, not all beliefs
exercise the same power to drive cognition. The sorts of beliefs that especially
inuence the way people search for and process information are those that
are directly linked to their identity, both personal and social. Indeed, there is
mounting evidence to suggest that identity-relevant beliefs are more than just
tools to achieve external goals. Rather, these beliefs are a source of value in
and of themselves, such that people are motivated to hold particular beliefs.
For example, people generally prefer to believe they are correct rather than
incorrect, they prefer to believe the future is bright rather than dark, and they
prefer to hold beliefs with certainty rather than uncertainty. The researchers
propose that the more identity-relevant a perception of behavior, the more
likely functional these beliefs are, thus, the more successful self-regulation
will occur. It is worth highlighting here that there is an overlap of brain regions
involved in self- related and reward processing, which is in line with a suggestion
that behavior or information that is self- or identity-relevant would have high
subjective value (Berkman et al., 2017).
A vast body of research has demonstrated that beliefs related to social identity hold greater subjective value than beliefs irrelevant to this identity (Ellemers et al., 2002). This stems from findings that while personal identity informs the beliefs that are important to oneself (for instance, beliefs related to one’s height, proficiency in foreign languages, or intelligence), social identity refers to a person’s knowledge pertaining to their belonging to a social category or group (Hogg & Abrams, 1988). The social categorization of self and others generates a sense of in-group identification and belonging. It regulates perception, inference, feelings, behavior, and interaction to conform to the best representation of a given category (prototype-based knowledge) that one possesses about one’s own group and relevant outgroups. Moreover, because group prototypes and representations are shared (“we” are like this, “they” are like that), one’s world view and self-concept are consensually validated by the overt and verbal behavior of fellow group members. Social categorization thus makes one’s own and others’ behavior predictable, and allows one to avoid harm, plan effective action, and know how one should feel and behave. Thus, under uncertainty,
being motivated by the specific need for certainty, people become more involved in identity-defensive cognitions (e.g. right-wing adherents tend to be stricter and surer about an abortion ban when uncertainty is present). This is especially the case when taking into consideration evidence that is suffused with culturally divisive meanings. In these circumstances, the pressure to adhere to group-congruent beliefs will often dominate over the ‘right answer’ standpoint (Kahan, 2017). Thus, espousing and holding beliefs that are aligned with one’s social identity is a higher priority than achieving accuracy. The latter motive is too inconsequential to affect the level of risk that a person faces, or to determine the outcome of any public debate. However, the consequences of getting the ‘wrong answer’ in terms of what is expected by members of the affinity group are much more serious for the person, ranging from a loss of trust among peers to stigmatization within their community. Indeed, Kahan (2017) claims that the social incentives for holding and expressing beliefs that are congenial to one’s group are, in most instances, of higher value than producing accurate responses.
Still, it is worth noting that uncertainty itself, and various sorts of threats posed to one’s identity, make the protection of identity-relevant beliefs stronger. An interesting example comes from a study by Rothmund et al. (2015), showing that when an important value is put in jeopardy (e.g. by informing pacifists about real-life violence), people are more likely to believe scientific and political claims regarding any further threat to this value (e.g. that violent games are harmful). Colombo et al. (2016) looked into the role of morality in the perception of scientific hypotheses. They found that when a scientific hypothesis is offensive to one’s moral values (e.g. the hypothesis that attending religious services makes people healthier could be offensive to dogmatic atheists, or the hypothesis that growing up with non-heteronormative parents leads to developmental disorders could be offensive to members of LGBT+ communities), then the assessment of the hypothesis is biased. Of interest is the fact that providing incentives (money) for more accurate evaluations did not improve subjects’ accuracy, and these effects held even after controlling for the prior credibility of the hypothesis (e.g. when participants were informed that the scientific community had reached a consensus about a given hypothesis). Furthermore, Washburn and Skitka (2018) asked participants to interpret the results and conclusions of scientific evaluations of a public policy (e.g. CO2 vehicle emission standards). Although participants were informed of the correct interpretation afterwards, their ratings of agreement with these interpretations, their perception of being knowledgeable, and their trust in the research’s interpretation depended on their own political ideology. Significantly, liberals and conservatives alike disagreed with interpretations of the scientific findings that contradicted their own beliefs. Also, Kossowska et al. (2017), studying religious orthodoxy, demonstrated that the threat posed by value-violators (e.g. atheists) leads to negative attitudes toward these groups among highly religious people. In this case, the threat experienced in response to the outgroup was operationalized via cardiovascular reactivity, i.e. heart rate (HR); the higher the HR index, the higher the threat. The results showed that people who hold high (vs. low)
levels of orthodox belief responded with increased HR after they were exposed to atheistic worldviews. However, the authors observed decreased HR after the expression of prejudice toward atheists among highly orthodox participants compared to the control condition. They did not find this effect among people holding low levels of orthodox belief. Thus, the researchers revealed that prejudice may, in fact, serve as an efficient strategy to protect oneself from sources of threat. This reasoning is consistent with research suggesting that prejudice and discrimination directed toward members of groups that violate important values, norms, and traditions can be used to diminish (or resist) these groups’ informational influence on the person. This further bolsters one’s cultural worldview, and thus reduces threat levels (for an overview, see Burke et al., 2010). In a similar vein, across three studies, Kossowska et al. (2020) showed that ideology is linked to the misperception of politically sensitive facts (e.g. What percentage of all people who died in Auschwitz were Jews? or What percentage of Polish society are LGBT?). This was especially true under conditions conducive to a higher salience of political identity (i.e. during the outbreak of the Covid-19 pandemic). The researchers explain this effect by positing that politically relevant facts, especially highly politicized facts associated with membership in a political group, trigger the goal of protecting one’s identity. As with other social-identity processes, ideology powerfully motivates perceptual processes toward making assessments in line with beliefs held by one’s group (and resisting, i.e. ignoring or discounting, information in opposition to the beliefs held by the group). Other researchers also claim that shared ideological commitments intertwined with group membership furnish individuals with important forms of support, emotional and psychological as well as material (e.g. Green et al., 2002). If a proposition about some policy-relevant fact comes to be commonly associated with membership in such a group, the prospect that one might form a contrary position can threaten one’s standing within the group. Thus, these individuals may be motivated to resist empirical assertions (e.g. that gun control does or does not reduce crime) if they run contrary to the dominant belief within their groups. Any erroneousness that individuals display regarding the facts is seen as negligible in its impact, provided that the assessments (however wrong) are in line with their group commitments. Of note is the finding that the effects of identity on information processing are observed under uncertainty conditions, which are conducive to a higher salience of political identity. Uncertainty may lead individuals to display a strong tendency to conform their understanding of different issues, especially complex ones, to the position of the authorities, or of groups that they support or belong to (e.g. Kahan, 2017). This stems from the fact that uncertainty (threat, anxiety, and related negative feelings) causes ideological identity to become more salient, and in that fashion, identity-related beliefs shape social perception.
Although most studies have demonstrated the negative effects of identity-protective cognitions on accurate perception, judgements, and attitudes, it should be pointed out that there is some evidence showing that, under certain conditions, identity bias can be reduced or even overcome. For example, prompting an accuracy goal to reach a correct conclusion can elicit greater cognitive effort toward that goal, which can be translated into accurate cognition (e.g. Baumeister & Newman, 1994). Other studies show that identity-biased cognition is reduced when people are asked to form accurate opinions about a policy (Bolsen et al., 2014). Also, curiosity toward science has been shown to reduce partisan polarization around science: people with high levels of curiosity about science were willing to consume news that was not in line with their political identity (Kahan, 2017). Similarly, helping people to realize their own ignorance about policy details, thereby countering the illusion of explanatory depth, can reduce political polarization; by contrast, derogating one’s political opponents tends to increase polarization (Fernbach et al., 2013; Suhay et al., 2018). Finally, Porter and Schumann (2018), investigating intellectual humility (i.e. recognizing the limits of one’s knowledge and appreciating others’ intellectual strengths), experimentally demonstrated that this factor could make disagreements more constructive. Specifically, it turned out that making salient a growth mindset of intelligence (i.e. by asserting that intelligence can be developed) boosted intellectual humility and, in turn, openness to opposing views.
Cognitive Effects of the Need for Non-specific Certainty
The need for non-specific certainty implies the search for a firm, precise answer to a question, regardless of its specific content. Thus, under this motivation, one just wants to know, rather than to confirm a specific belief. Many studies have demonstrated that the motivation to attain certainty can manifest psychologically in vigilance toward threats and opportunities. It also unfolds in impulsive reactions, wherein a person responds rapidly, with little deliberation (e.g. one makes a decision based on scarce, readily available information instead of engaging in a more extensive search). It is also manifested in the capture of immediate benefits, even when greater benefits could be obtained later (Jonas et al., 2014). This gives rise to a number of cognitive, motivational, and behavioral implications, including risk aversion, attentional biases, and impaired performance on a variety of working memory and decision-making tasks (e.g. Jameson et al., 2004). It also leads to narrow, selective attention focused on threatening stimuli that, under many circumstances, results in suboptimal performance
(Easterbrook, 1959; Kossowska, 2007). For example, a sizeable majority of previous studies have demonstrated that the motivation to reduce uncertainty promotes simplistic cognition relying mainly on stereotypes and heuristics, that is, simple rules that lead to fast yet at times suboptimal decisions (Kruglanski, 2004). Some studies have shown that people who are highly motivated to reduce uncertainty make more stereotypical judgments, prefer homogeneous over diverse groups, prefer consistent over inconsistent images, prefer realistic over abstract art, and prefer normative over deviant stimuli. Moreover, this motivation is related to heightened resistance to altering conclusions once drawn and greater reliance on the default mode of decision-making (for a review, see Roets et al., 2015). To conclude, under the motivation for non-specific certainty, knowledge systems become rigid, closed to new evidence, resistant to change, and biased in the face of fragmented information.
While research clearly demonstrates the link between uncertainty and simplistic cognition, leading to biases and the neglect of a large portion of important evidence, there are some contradictory findings, revealing that this motivation may also drive people to complex, effortful, and unbiased cognition. For example, there is substantial evidence that people attend to novel, unexpected events that might disconfirm their expectancies, but only when these events are relevant to their goals (e.g. when individuals desire to understand the event and be accurate in their cognition). Other studies have also shown that disconfirmations of important expectancies lead to increased attention to, and processing of, the inconsistent information. Additionally, people are willing to consider and incorporate new information in order to improve their predictive ability. This motivation can also foster an exploratory mode in which people tend to be open to, seek, and incorporate new information so as to be accurate or to avoid mistakes. These effects are reviewed by Kossowska et al. (2018).
A Goal (Versus Means) Perspective on the Quest for Certainty
So far, we have outlined the cognitive effects of the quest for certainty (specific or non-specific), which can usually be described as limiting openness to new evidence and thus biasing cognition. However, we have mentioned that this epistemic motivation may also lead to more open-minded and unbiased cognition (i.e. all evidence is processed, regardless of its consistency with one’s views). This dichotomy presents us with the challenge of distinguishing the conditions under which the quest for certainty leads to open-minded cognition from those under which it leads to simplistic, bias-prone cognition. Given the apparent need for theoretical refinement in this area, we have proposed a framework that allows for a re-examination of the abovementioned findings.
Specifically, we take a goal-means perspective and differentiate between cognitive goals and the means (i.e. actions) undertaken to satisfy these goals (Kruglanski et al., 2002). Goals represent desirable states of affairs to whose attainment one is personally committed, and means are instrumental actions serving the attainment of one’s goals. We posit that the need for certainty (whether
specic or non-specic) is no different from any other goal. In this case, people
aim to achieve certainty, they seek an answer to an important question, they
desire to uphold a certain belief, and/or they wish to make condent decisions.
These motivational states may initiate various epistemic actions to fulll these
underlying motives. For instance, people may consult other people’s opinions
to obtain external validation of their views, or they may simply depend on their
own epistemic authority to form a condent judgment (Kossowska et al., 2018).
Moreover, they may thoroughly scrutinize the attributes of all the available
alternatives before making a decision, or they may be satised with choosing
the rst option that passes their personal threshold (Schwartz, 2004). While
people will sometimes act skeptically and seek out information that contradicts
their own knowledge, in other cases, they will actively avoid information if that
helps them to protect a valued belief (Golman et al., 2017). Lastly, while they are
sometimes ready to reach accurate conclusions, very often they form biased but
identity-protective judgements (Kahan, 2017).
The above shows that even when the goal stays the same (one wants to attain certainty, either specific or non-specific), the means (cognitive strategies) can differ: on some occasions people select “closed-minded” means, whereas at other times they opt for “open-minded” ones. And it is the distinction at the level of means, rather than goals, that determines whether people will resist new or contradictory facts or let them influence their belief systems. This proposition has important theoretical and practical implications, as it allows for identifying conditions under which certainty-seeking individuals, otherwise prone to knowledge resistance, are more open to processing belief-inconsistent facts.
To this end, Kossowska et al. (2018) proposed that processing strategies, or means, are chosen according to their perceived instrumentality in accomplishing a particular goal, and their relations with other means and goals (Kruglanski et al., 2002). Instrumental means are ones that afford a high probability (expectancy) of attaining a given goal (e.g. studying is an instrumental means to the goal of passing an exam, whereas partying is not) (Bélanger et al., 2016). In addition, a means is less likely to be chosen if it can be substituted by other means (i.e. equifinality), and is more likely to be selected if it serves additional coactivated goals (i.e. multifinality). A parallel line of research in the cognitive neuroscience of motivation (e.g. Berridge et al., 2009) found that goal-directed behavior is associated with neuropsychological states linked to wanting and seeking, and with the activation of brain areas associated with reward processing (e.g. the cortico–basal ganglia–thalamic loop), as well as sympathetic nervous system reactivity (Gendolla et al., 2019). Together, these functions optimize goal striving and effort.
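To illustrate the selection logic, here is a toy scoring of candidate means along the three properties just described: expectancy, equifinality, and multifinality. The function, weights, and numbers are illustrative assumptions of ours, not parameters from goal-systems theory.

```python
def means_value(expectancy, substitutes, goals_served):
    """Toy score for choosing a means:
    - expectancy: probability the means attains the focal goal (instrumentality)
    - substitutes: number of interchangeable means (equifinality lowers the
      odds of any single one being chosen)
    - goals_served: number of coactivated goals served (multifinality)
    """
    return expectancy * (1.0 / substitutes) * goals_served

# Two cognitive strategies competing to serve the goal of certainty:
strategies = {
    "rely on a stereotype": means_value(0.6, substitutes=3, goals_served=1),
    "extensive information search": means_value(0.8, substitutes=1, goals_served=2),
}
best = max(strategies, key=strategies.get)
print(best, strategies[best])  # -> extensive information search 1.6
```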
Following this thread of reasoning, Kossowska et al. (2018) proposed a model that allows clear predictions to be made about when and why people who are epistemically motivated to reduce (non-specific) uncertainty tend to perceive open-minded cognitive strategies as more instrumental than closed-minded strategies for reaching their goal of certainty. Specifically, the researchers suggested that this may happen when: (a) cues present in a situation suggest that open-minded
means are more useful for attaining the goal, (b) the closed-minded means are unknown or unavailable, or (c) general trust in closed-minded options is undermined. In an extensive research program, the researchers found support for these assumptions. For example, Jaśko et al. (2015), investigating decision-making processes, demonstrated that people motivated to achieve certainty searched for more information (i.e. they opened more boxes containing relevant information) before they made a decision and spent more time on decision-making than did those not in search of certainty, which attests to their openness to new information. What is more, it turned out that when a clue appeared in the task informing the participant of techniques conducive to its completion, people needing certainty followed it more frequently than those low in this need. In particular, when there was a clear rule by which seeking a greater amount of information turned out to be more beneficial in terms of goal achievement (i.e. participants were told that the majority of people open most boxes to attain high results), people highly motivated to achieve certainty engaged in information-seeking to a greater degree. These findings have important implications for understanding how certainty-seeking individuals process information more generally. Specifically, they suggest that such individuals can be more open or closed (i.e. resistant) to new facts, depending on the situation. When, in a given context, there is a clue suggesting that certainty could best be attained by engaging in an unbiased, more extensive information search (e.g. nudges prompting fact-checking or verifying information with different sources), people motivated to attain certainty will exhibit more “open” epistemic behaviors, even when this may lead to a change in their initial view.
A further example of a condition inducing open-minded cognition among people epistemically motivated to achieve certainty comes from the classic study by Kruglanski et al. (1991). The experiment they conducted showed that when participants’ initial certainty about their decisions was high, the need for certainty was indeed associated with a lower amount of information being sought. However, when participants were not certain of their initial decision, this epistemic motivation expanded the scope of the data sought out.
An illustration of cognition occurring under conditions where general trust in closed-minded options is undermined comes from a study by Kossowska and Bar-Tal (2013). In this study, the researchers demonstrated that low trust in one’s own capacity to achieve certainty may lead to cognition typically associated with openness, such as reduced bias in the formation of impressions of others, the taking of complex rather than simple decisions, and reduced stereotyping. In addition, these open-minded effects were also found in studies in which confidence in previously obtained knowledge was experimentally undermined (Dragon & Kossowska, 2019). In these situations, individuals lost faith in themselves and their knowledge, which, in turn, resulted in this knowledge (i.e. opinions, beliefs, stereotypes) no longer serving as the basis for formulating judgements, and ultimately led to it shedding its potential for guaranteeing certainty. As a consequence, the individuals were forced to employ alternative strategies to achieve certainty. Such a situation turns out to be particularly difficult for people for whom certainty plays a fundamental role. On the one hand, they feel a strong need to obtain certainty, while on the other, they are deprived of their existing means of achieving it. This potentially leaves them more motivated to revise their previous expectations and views, and to look for new information on a given subject. In other words, they can be more epistemically motivated to engage in open-minded cognition, thus counteracting resistance to new and inconsistent facts.
Final Thoughts
The research mentioned above reveals that open-minded cognition is preferred (a) when a situation provides clues that “open” strategies are likely to be the most effective in achieving certainty, (b) when simplified inference is not possible, or (c) when people begin to doubt their previous modes of inference, whether as a result of a threat to the self, the experience of a loss of power or control over the situation, or an encounter with credible (and, by the same token, impossible to ignore) information that is inconsistent with the individual’s existing knowledge and previous experiences.
However, these research efforts were mostly devoted to describing fundamental cognition (measured at the physiological and neuropsychological levels). Thus, the open cognition that the researchers focused on refers to the readiness to select more complex, difficult, and effortful cognitive activity. It may include seeking out new information, posing new hypotheses, taking care to meet the standards given in instructions, and forming an impression of others based not on stereotypes but rather on non-stereotypical information received in “real time”. While all of the abovementioned examples referred to the non-specific motivation to reduce uncertainty, we feel that this model could also be fruitfully applied to cognition motivated by the specific epistemic motivation. Moreover, researchers have traditionally focused on identity-relevant cognitions as the best means to achieve certainty. However, there are many accuracy-oriented means that may also serve this goal (see Jonas et al., 2014). For example, for particular groups (e.g. journalists, scientists, etc.), ensuring accuracy may be a better way of obtaining certainty (van Bavel & Pereira, 2018). The adoption of accuracy-oriented (i.e. open-minded, extensive, and effortful) strategies as a means of achieving certainty can be promoted with incentives, and through education systems that cultivate curiosity, accuracy, and accountability. We believe this could eventually lead to less tribe-like and polarized discussions than many societies experience nowadays.
Finally, we have focused here on processing strategies rather than on knowledge per se. However, it is information selection and processing that leads to forming, changing, or maintaining existing beliefs (i.e. knowledge). If one accesses only a limited number of pieces of information, most likely restricted to those consistent with one’s beliefs, there is little chance that these beliefs will be revised if incorrect. Furthermore, one’s views may further solidify, which will make them more resistant to change in the future. It is therefore crucial to identify conditions that will prevent this from happening.
Note
1 Recent literature uses the terms motivated cognition or motivated reasoning in a narrow sense, that is, when one’s prior beliefs act to bias information processing so as to make any conclusions congenial to these beliefs. This suggests that motivated implies biased and precludes rational cognition (e.g. Druckman & McGrath, 2019). However, this is also related to the old but ongoing debate on whether biases in reasoning are due to motivation or cognition. We take the position that cognitive and motivational influences are both present in virtually any epistemic activity. Thus, all cognitive activity is motivated by its very nature (see Kruglanski et al., 2020).
References
Bastardi, A., Uhlmann, E., & Ross, L. (2011). Wishful thinking: Belief, desire, and the motivated evaluation of scientific evidence. Psychological Science, 22(6), 731–732.
Baumeister, R., & Newman, L. (1994). Self-regulation of cognitive inference and decision
processes. Personality and Social Psychology Bulletin, 20(1), 3–19.
Berkman, E., Hutcherson, C., Livingston, J., Kahn, L., & Inzlicht, M. (2017). Self-control
as value-based choice. Current Directions in Psychological Science, 26(5), 422–428.
Berridge, K., Robinson, T., & Aldridge, J. (2009). Dissecting components of reward:
‘liking’, ‘wanting’, and learning. Current Opinion in Pharmacology, 9(1), 65–73.
Bolsen, T., Druckman, J., & Cook, F. (2014). The influence of partisan motivated reasoning on public opinion. Political Behavior, 36(2), 235–262.
Burke, B., Martens, A., & Faucher, E. (2010). Two decades of terror management theory: A meta-analysis of mortality salience research. Personality and Social Psychology Review, 14(2), 155–195.
Colombo, M., Bucher, L., & Inbar, Y. (2016). Explanatory judgment, moral offense and value-free science. Review of Philosophy and Psychology, 7(4), 743–763.
Cooper, J., & Fazio, R. H. (1984). A new look at dissonance theory. Advances in
Experimental Social Psychology, 17, 229–266.
Ditto, P. H., & Lopez, D. F. (1992). Motivated skepticism: Use of differential decision
criteria for preferred and nonpreferred conclusions. Journal of Personality and Social
Psychology, 63(4), 568–584.
Dragon, P., & Kossowska, M. (2019). Need for closure and compensatory rule-based
perception: The role of information consistency. European Journal of Social Psychology,
49(1), 127–140.
Druckman, J. N., & McGrath, M. C. (2019). The evidence for motivated reasoning in
climate change preference formation. Nature Climate Change, 9(2), 111–119.
Easterbrook, J. A. (1959). The effect of emotion on cue utilization and the organization
of behavior. Psychological Review, 66(3), 183–201.
Ellemers, N., Spears, R., & Doosje, B. (2002). Self and social identity. Annual Review of
Psychology, 53(1), 161–186.
Fernbach, P., Rogers, T., Fox, C., & Sloman, S. (2013). Political extremism is supported
by an illusion of understanding. Psychological Science, 24(6), 939–946.
Gendolla, G., Wright, R., & Richter, M. (2019). Advancing issues in motivation intensity
research: Updated insights from the cardiovascular system. In R. M. Ryan (Ed.), Oxford
handbook of human motivation (2nd ed., pp. 373–392). Oxford University Press.
Golman, R., Hagmann, D., & Loewenstein, G. (2017). Information avoidance. Journal of
Economic Literature, 55(1), 96–135.
Green, D., Palmquist, B., & Schickler, E. (2002). Partisan hearts and minds: Political
parties and the social identities of voters. Yale University Press.
Hogg, M., & Abrams, D. (1988). Social identifications: A social psychology of intergroup relations and group processes. Routledge.
Jameson, T., Hinson, J., & Whitney, P. (2004). Components of working memory and
somatic markers in decision making. Psychonomic Bulletin & Review, 11(3), 515–520.
Jaśko, K., Czernatowicz-Kukuczka, A., Kossowska, M., & Czarna, A. (2015). Individual differences in response to uncertainty and decision making: The role of behavioral inhibition system and need for closure. Motivation & Emotion, 39(4), 541–552.
Jonas, E., McGregor, I., Klackl, J., Agroskin, D., Fritsche, I., Holbrook, C., Nash, K., Proulx, T., & Quirin, M. (2014). Threat and defense: From anxiety to approach. In J. M. Olson & M. P. Zanna (Eds.), Advances in experimental social psychology (Vol. 49, pp. 219–286). Academic Press.
Kahan, D. (2017). Misconceptions, misinformation, and the logic of identity-protective
cognition. Cultural Cognition Project Working Paper Series No. 164; Yale Law School,
Public Law Research Paper No. 605; Yale Law & Economics Research.
Klayman, J. (1995). Varieties of confirmation bias. Psychology of Learning and Motivation, 32, 385–418.
Knight, F. H. (1921). Risk, uncertainty, and profit. Houghton Mifflin Company.
Kossowska, M. (2007). Motivation towards closure and cognitive processes: An individual
differences approach. Personality and Individual Differences, 43(8), 2149–2158.
Kossowska, M., & Bar-Tal, Y. (2013). Need for closure and heuristic information processing: The moderating role of the ability to achieve the need for closure. British Journal of Psychology, 104(4), 457–480.
Kossowska, M., Szwed, P., Czernatowicz-Kukuczka, A., Sekerdej, M., & Wyczesany, M.
(2017). From threat to relief: Expressing prejudice toward atheists as a self-regulatory
strategy protecting the religious orthodox from threat. Frontiers in Psychology, 8, 1–8.
Kossowska, M., Szumowska, E., Dragon, P., Jaśko, K., & Kruglanski, A. W. (2018). Disparate roads to certainty: Processing strategy choices under need for closure. European Review of Social Psychology, 29(1), 161–211.
Kossowska, M., Szwed, P., & Czarnek, G. (2020). Ideologically motivated perception:
The role of political context and active open-mindedness.
Kruglanski, A. W. (1989). The psychology of being “right”: The problem of accuracy in
social perception and cognition. Psychological Bulletin, 106(3), 395–409.
Kruglanski, A. W., Peri, N., & Zakai, D. (1991). Interactive effects of need for closure and initial confidence on social information seeking. Social Cognition, 9(2), 127–148.
Kruglanski, A. W. (2004). The psychology of closed mindedness. Psychology Press.
Kruglanski, A. W., Dechesne, M., Orehek, E., & Pierro, A. (2009). Three decades of lay
epistemics: The why, how, and who of knowledge formation. European Review of Social
Psychology, 20(1), 146–191.
Kruglanski, A. W., Jasko, K., & Friston, K. (2020). All thinking is ‘wishful’ thinking.
Trends in Cognitive Sciences, 24(6), 413–424.
Kruglanski, A. W., Shah, J. Y., Fishbach, A., Friedman, R., Chun, W. Y., & Sleeth-Keppler, D. (2002). A theory of goal systems. In M. P. Zanna (Ed.), Advances in experimental social psychology (Vol. 34, pp. 331–378). Academic Press.
Kunda, Z. (1990). The case for motivated reasoning. Psychological Bulletin, 108(3), 480–498.
Lord, C. G., Ross, L., & Lepper, M. R. (1979). Biased assimilation and attitude polarization: The effects of prior theories on subsequently considered evidence. Journal of Personality and Social Psychology, 37(11), 2098–2109.
Miller, A. G. (1976). Constraint and target effects in the attribution of attitudes. Journal
of Experimental Social Psychology, 12(4), 325–339.
Nickerson, R. (1998). Confirmation bias: A ubiquitous phenomenon in many guises. Review of General Psychology, 2(2), 175–220.
Porter, T., & Schumann, K. (2018). Intellectual humility and openness to the opposing view. Self and Identity, 17(2), 139–162.
Proulx, T., Inzlicht, M., & Harmon-Jones, E. (2012). Understanding all inconsistency
compensation as a palliative response to violated expectations. Trends in Cognitive
Sciences, 16(5), 285–291.
Roets, A., Kruglanski, A. W., Kossowska, M., Pierro, A., & Hong, Y. (2015). The motivated gatekeeper of our minds: New directions in need for closure theory and research. In J. Olson & M. Zanna (Eds.), Advances in experimental social psychology (Vol. 52, pp. 221–283). Academic Press.
Rothmund, T., Bender, J., Nauroth, P., & Gollwitzer, M. (2015). Public concerns about violent video games are moral concerns—How moral threat can make pacifists susceptible to scientific and political claims against violent video games. European Journal of Social Psychology, 45(6), 769–783.
Schwartz, B. (2004). The paradox of choice. HarperCollins.
Suhay, E., Bello-Pardo, E., & Maurer, B. (2018). The polarizing effects of online partisan
criticism: Evidence from two experiments. The International Journal of Press/Politics,
23(1), 95–115.
van Bavel, J., & Pereira, A. (2018). The partisan brain: An identity-based model of political
belief. Trends in Cognitive Sciences, 22(3), 213–224.
van der Bles, A., van der Linden, S., Freeman, A., Mitchell, J., Galvao, A., Zavala, L., &
Spiegelhalter, D. (2019). Communicating uncertainty about facts, numbers and science.
Royal Society Open Science, 6(5), 181870.
Washburn, A., & Skitka, L. (2018). Science denial across the political divide: Liberals
and conservatives are similarly motivated to deny attitude-inconsistent science. Social
Psychological and Personality Science, 9(8), 972–980.
We tested whether conservatives and liberals are similarly or differentially likely to deny scientific claims that conflict with their preferred conclusions. Participants were randomly assigned to read about a study with correct results that were either consistent or inconsistent with their attitude about one of several issues (e.g., carbon emissions). Participants were asked to interpret numerical results and decide what the study concluded. After being informed of the correct interpretation, participants rated how much they agreed with, found knowledgeable, and trusted the researchers’ correct interpretation. Both liberals and conservatives engaged in motivated interpretation of study results and denied the correct interpretation of those results when that interpretation conflicted with their attitudes. Our study suggests that the same motivational processes underlie differences in the political priorities of those on the left and the right.