
Tribal bias from the wild to the laboratory

By Cory J Clark
“War is older than the human species. It is found in every region of the world, among all the
branches of humankind. It is found throughout human history, deeply and densely woven into its
causal tapestry. It is found in all eras, and in earlier periods no less than later… War is reflected
in the most fundamental features of human social life.”
Tooby & Cosmides, 2010, p. 191
Human evolution has been powerfully shaped by our history of war and intergroup conflict. For
groups to survive, reproduce, and pass their genes on to later generations of humans, they had to
coordinate and cooperate within their own groups in order to defeat other groups. Because group
loyalty and commitment were so important for group survival, humans would have evolved to
reward loyal and cooperative members of their ingroup—those who fervidly support the group’s
cause and contribute to the group’s success. Humans also would have evolved to punish and
ostracize disloyal members of the group—those who oppose or harm the goals of the group. In
these environments, where loyalty was rewarded with status and resources (and perhaps more
crucially, where disloyalty was highly costly), individuals would have evolved traits that enhance
coalitional commitment and tendencies to signal those traits to other group members. Modern
humans evolved from these highly group-loyal and cooperative individuals. In other words,
human evolution selected for traits that enhance and signal ingroup loyalty.
Group loyalty and cooperation sound like positive human tendencies—and they certainly can be
positive. But these group commitments can also lead to more problematic psychological
propensities, particularly when the goal is intergroup cooperation and coordination.
One of these propensities is ingroup favoritism. Indeed, even children arbitrarily assigned to
relatively meaningless groups (e.g., red t-shirts or blue t-shirts) display tendencies to be more
favorable toward ingroup than outgroup members. Ingroup favoritism comes in many varieties.
People not only like members of their ingroup (whether a religious, political, or ethnic
ingroup) better than members of other groups, but they also tend to treat the ideas and
behaviours of ingroup members more favorably than those exact same ideas and behaviours from
outgroup members. For example, the immoral actions of a political ingroup member are evaluated
as more morally permissible than the exact same immoral actions performed by a political
outgroup member.
But another, and perhaps more problematic, type of tribal psychological tendency is ingroup
bias, or tribal bias. Tribal biases concern not merely how we feel about other people or how we
evaluate individual actions, but how we evaluate empirical information. Our apparent pursuit of
truth and understanding of human nature and the world can be warped by our desires to conform
to the ingroup and to gain status within the ingroup.
There are two primary levels of these tribal biases. The first is called selective exposure.
Selective exposure concerns how we approach and avoid information in the world. Specifically,
people have tendencies to seek out information that supports their group’s goals and to avoid
information that opposes their group’s goals. An immigration restrictionist might seek out
information regarding the downfalls and costs of relatively open borders and avoid information
regarding the benefits of more liberal immigration policies, whereas a proponent of immigration
might avoid information regarding the costs and challenges of immigration and seek out
information on the benefits.
Perhaps the most obvious way in which people engage in selective exposure is in their media
consumption. They are inclined to read the newspapers, watch the news programmes, and visit
the news websites that support the beliefs of their ingroup and are similarly inclined to avoid any
media that opposes their group's beliefs. But people also engage in selective exposure in their
social worlds. We choose friends, on Facebook, on Twitter, and in the real world, who belong to
our various ingroups and who tend to exchange information with us that conforms to our group's
beliefs. We selectively put ourselves in media environments and social
environments that are likely to bolster our pre-existing beliefs about the world—the beliefs that
support our ingroup—and selectively avoid media and social environments that put us at risk of
confronting discordant information. Thus, we are exposing ourselves and others to a biased set of
potentially relevant information.
The second level of tribal bias occurs after people are exposed to new information (whether they
purposefully sought out that information or failed to avoid it). The two critical biases here are
called motivated scepticism and motivated credulity. Motivated scepticism refers to the tendency
for people to be highly critical and unaccepting of information that opposes their own group’s
interests, whereas motivated credulity refers to the tendency for people to be highly credulous
toward and uncritical of very similar information that supports their own group’s interests.
A classic example of these tendencies comes from a 1979 paper by the social psychologists Lord,
Ross, and Lepper. These researchers had death penalty proponents and opponents evaluate the
scientific methods of an ostensibly real study that tested whether the death penalty does or does
not appear to deter crime. For example, participants read a paragraph describing a scientific
study that compared murder rates in 14 states in the United States before and after the adoption
of capital punishment. In one experimental condition (which half of the participants saw), the
study results demonstrated that murder rates were lower after adoption of the death penalty and
thus supported the deterrent efficacy of the death penalty; in the other experimental condition
(which the other half of the participants saw), the study results demonstrated that murder rates
were higher after adoption of the death penalty and thus opposed the deterrent efficacy of the
death penalty.
Participants were then asked to evaluate how well-conducted the study was. Note that the study
was conducted in exactly the same way in the two experimental conditions, and only the results
of the study differed. Lord, Ross, and Lepper found that participants who were proponents of the
death penalty evaluated the study as better conducted when the results indicated the death
penalty does deter homicide than when the results opposed the deterrent efficacy of the death
penalty, and the exact opposite pattern was observed for participants who opposed the death
penalty, such that they evaluated the study as better conducted when the results indicated that the
death penalty does not deter homicide than when the results supported the deterrent efficacy of
the death penalty. Thus, when people are exposed to new information, they tend to discount that
information as low quality when it challenges the beliefs of their political ingroup.
These types of biases appear to be particularly problematic in the political sphere. Politics is one
of the most salient modern tribal conflicts. We generally no longer kill our tribal opponents, but
we argue and debate in an effort to advance the success of our own political ingroup and to
squash our political opponents.
There are at least three main reasons why the political sphere elicits substantial tribal biases.
(1) Political arguments are highly consequential. Many political disagreements centre on who
should receive status and resources within society (for example, who should receive welfare
benefits and who should pay for them), and so group success is very important for individual
success. There is a strong motivation to win political disagreements.
(2) Political disagreements are often morally significant. Moral disagreements signal an
unwillingness to conform to the same rules—rules that are often set in place to advance the
interests of the ingroup (or to oppose disadvantages). For example, gun control is a major source
of political conflict in the United States. When Democrats, who are less likely to own guns than
Republicans, morally oppose certain gun rights, and Republicans reject this moral opposition,
Republicans signal an unwillingness to follow the same rules as Democrats (specifically, to give
up certain rights to gun ownership). Moreover, this results in at least one sort of power imbalance
between the two groups: in the US, Republicans are more armed than Democrats. This,
understandably, creates conflict between the two groups.
(3) Lastly, ambiguity exacerbates bias, and political issues are often if not always ambiguous.
Even experts disagree on many political issues. For example, experts disagree on how large pay
discrepancies are between men and women and they disagree on which factors contribute to such
discrepancies. On top of factual ambiguities, political issues often relate to opinions about what
ought to be the case based on that fuzzy understanding of the facts. One might think that
ambiguity would compel humility and open-mindedness, but when ambiguity occurs in the
context of political conflict, it appears to make people more biased and more dogmatic. Why?
Because there is more room for argument. People do not argue about obvious truths. It would be
pretty challenging to argue over whether animals must be killed to obtain meat, but it is easier to
argue about the costs and benefits of meat consumption. They are difficult to quantify; even
animal experts do not know how animals experience physical and emotional pain; even health
experts do not know whether there are long-term health consequences of avoiding animal
products. Given these unknowns, it is even easier to argue about whether the difficult-to-quantify
costs outweigh the difficult-to-quantify benefits. Nobody knows the answer for certain, so
compelling arguments can make the difference between whether one’s preferred policy is
supported or opposed.
Thus far, I have suggested that tribal conflict and biases are a fundamental feature of politics
because humans share an evolutionary history of intergroup conflict—and politics is the most
salient modern form of intergroup conflict. However, the social sciences have long emphasized
the shortcomings of more right-leaning or politically conservative ideologies, arguing that the
cognitive tendencies of political conservatives (e.g., threat avoidance, cognitive rigidity) likely
predict more bias in conservatives relative to liberals. In recent years, however, the social
sciences have been criticized for their left-leaning political homogeneity. Nearly all social
scientists identify as political liberals. It is possible then, that the overemphasis on the flaws and
biases of conservatives in the social sciences is merely a reflection of the ingroup biases of a left-
leaning field. In other words, the very scientists who have been exploring political biases may
have mischaracterized the cognitive tendencies of political conservatives due to their own
political biases.
This possibility inspired my colleagues and me to test whether conservatives are indeed more
biased than liberals, as many scholars have contended, or whether liberals and conservatives
are more similar in their ingroup bias tendencies. We conducted two studies in the United States.
First, we simply asked Republicans and Democrats whether Republicans or Democrats are more
biased. The results, displayed in the table below, demonstrated that Republicans reported that
Democrats are more biased and Democrats reported that Republicans are more biased. This is a
nice foreshadowing of the results to come.
We then conducted a meta-analysis of political bias research. A meta-analysis essentially
combines the results of all studies that measure a particular effect. In this case, we combined the
results of studies that were very similar to the classic Lord, Ross, and Lepper study described
above. We combined the results of 51 separate studies that presented political partisans with
virtually identical information with conclusions that either supported or opposed their political
ingroup and then asked participants to evaluate the validity of that information. Topics spanned
dozens of political issues, including capital punishment, gun control, abortion,
welfare, healthcare, climate change, same-sex marriage, affirmative action, immigration,
education, taxes, and marijuana. When all results were combined and averaged, we found
near-perfect symmetry between liberals and conservatives. That is, across 51 studies, both
liberals and conservatives evaluated information that supports their political ingroup as more
valid than that exact same information when it opposes their political ingroup, and to virtually
equal degrees.
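For readers curious about the mechanics, the "combining and averaging" step of such a meta-analysis can be sketched in a few lines. The sketch below is a minimal fixed-effect combination of correlation effect sizes using Fisher's r-to-z transform; the per-study values are invented for illustration and are not the actual effect sizes from the 51 studies (which also used a more sophisticated procedure).

```python
import math

# Hypothetical per-study effect sizes (correlation r) and sample sizes.
# These numbers are illustrative only, not values from the actual meta-analysis.
studies = [
    (0.30, 120),  # (r, n)
    (0.22, 250),
    (0.18, 90),
    (0.27, 400),
]

# Convert each r to Fisher's z, weight it by the inverse of its variance
# (which for Fisher's z is 1 / (n - 3)), then average and back-transform.
num = 0.0
den = 0.0
for r, n in studies:
    z = math.atanh(r)  # Fisher's r-to-z transform
    w = n - 3          # inverse-variance weight for z
    num += w * z
    den += w

z_mean = num / den
r_mean = math.tanh(z_mean)  # back-transform to a correlation
print(f"Combined effect size r = {r_mean:.3f}")
```

Larger studies, whose effect sizes are estimated more precisely, pull the average toward their results more strongly than small ones; that is the point of the inverse-variance weighting.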
In recent years, inspired by the realization that political homogeneity in the social sciences may
have biased conclusions about conservatives, many scholars have now been probing other
domains of ingroup favoritism among liberals and conservatives. The emerging trend has been
that liberals and conservatives are far more similar in their ingroup tendencies than
previously thought. Whereas social scientists long believed support for authoritarianism to be
mainly a feature of right-wing ideologies, more recent work has found that left-wing
authoritarianism both exists and predicts left-wing prejudice against right-wing targets.
Similarly, scholars long thought that conservatives were particularly intolerant toward
outgroups, but researchers have recently expanded the set of outgroups considered to include
those that liberals oppose, and found that liberals are as intolerant and prejudiced as
conservatives. Many scholars had also suggested that conservatives are particularly avoidant
and sceptical of scientific conclusions that oppose their political positions, but more recent
work suggests that liberals and conservatives are similarly prone to avoiding and denying
dissonant scientific findings.
Given the shared human evolutionary histories of individuals within different political groups
and parties, it seems plausible that all political groups and parties would be similarly tribal and
similarly prone to concomitant tribal biases. If the members of a particular political group
were not tribal, it seems unlikely that they would be able to compete with and survive alongside
other political groups with more loyal and tribal group members.
It may seem puzzling, then, that scholars have long placed far greater emphasis on the tribal
cognitive tendencies among political conservatives. However, the politically liberal homogeneity
of the social sciences suggests one possible explanation: Perhaps the very people attempting to
measure political biases have biases of their own. Specifically, because social scientists who
study bias are politically liberal, they may have been unable to perceive the biases of their own
ingroup. Indeed, research has shown that people struggle to perceive their own biases. For these
reasons, we should be somewhat sceptical of social scientific claims that the opposing political
group is relatively flawed or has particularly unflattering cognitive tendencies. And perhaps we
should be somewhat sceptical of social scientific claims in general.
Tribalism is a natural and quite possibly an ineradicable element of human social groups. It is
likely that all groups, and all people within those groups, are susceptible to tribalism and tribal
biases. It is usually quite easy to convince people of this point. But people seem easily convinced
because they consider this claim as it relates to other groups and other people, and not to their
own group or to their own self. People seem to think to themselves, “Ah, yes, this explains why
that other group has such ludicrous beliefs!” But tribalism and tribal biases do not only affect our
outgroups. They likely affect all groups: mine, yours, and everyone else’s. We are all humans,
and thus we are all susceptible to these kinds of biases.
Tribalism and tribal biases are not all bad. Group loyalty and commitment can be beautiful
things, and biases often serve as useful heuristics. However, they can be toxic when the goal is to
make compromises between groups, which is so often the goal of modern groups and
governments. Moreover, and more troubling, tribalism and tribal biases can steer us away from a
true and accurate understanding of human nature. For those of us who care deeply about truth
and understanding, we should be vigilant in acknowledging and combating these tendencies in
our own groups and in ourselves.
*This article is a chapter forthcoming in Past and Present, based on a talk given at the
Engelsberg Seminar in Engelsberg, Sweden.
Berreby, D. (2005). Us and them: Understanding your tribal mind. New York, NY: Time Warner
Book Group.
Brandt, M. J., Reyna, C., Chambers, J. R., Crawford, J. T., & Wetherell, G. (2014). The
ideological-conflict hypothesis: Intolerance among both liberals and
conservatives. Current Directions in Psychological Science, 23, 27-34.
Claassen, R. L., & Ensley, M. J. (2016). Motivated reasoning and yard-sign-stealing partisans:
Mine is a likable rogue, yours is a degenerate criminal. Political Behavior, 38, 317-335.
Clark, C. J., Liu, B. S., Winegard, B. M., & Ditto, P. H. (2019). Tribalism is Human Nature.
Current Directions in Psychological Science.
Conway, L. G., Houck, S. C., Gornick, L. J., & Repke, M. A. (2018). Finding the Loch Ness
Monster: Left-Wing Authoritarianism in the United States. Political Psychology, 39,
DeMarree, K. G., Clark, C. J., Wheeler, S. C., Briñol, P., & Petty, R. E. (2017). On the pursuit of
desired attitudes: Wanting a different attitude affects information processing and
behavior. Journal of Experimental Social Psychology, 70, 129-142.
Ditto, P. H., Clark, C. J., Liu, B. S., Wojcik, S. P., Chen, E. E., Grady, R. H., ... & Zinger, J. F.
(2019). Partisan bias and its discontents. Perspectives on Psychological Science, 14, 304-
Ditto, P. H., Liu, B., Clark, C. J., Wojcik, S. P., Chen, E. E., Grady, R. H., Celniker, J. & Zinger,
J. F. (2018). At least bias is bipartisan: A meta-analytic comparison of partisan bias in
liberals and conservatives. Perspectives on Psychological Science, 14, 273-291.
Duarte, J. L., Crawford, J. T., Stern, C., Haidt, J., Jussim, L., & Tetlock, P. E. (2015). Political
diversity will improve social psychological science. Behavioral and Brain Sciences, 38,
Geary, D. C. (2005) Origin of mind: evolution of brain, cognition, and general intelligence.
Washington DC: American Psychological Association.
Jost, J. T., Glaser, J., Kruglanski, A. W., & Sulloway, F. J. (2003). Political conservatism as
motivated social cognition. Psychological Bulletin, 129, 339-375.
Lord, C. G., Ross, L., & Lepper, M. R. (1979). Biased assimilation and attitude polarization: The
effects of prior theories on subsequently considered evidence. Journal of Personality and
Social Psychology, 37, 2098-2109.
Munro, G. D., Weih, C., & Tsai, J. (2010). Motivated suspicion: Asymmetrical attributions of the
behavior of political ingroup and outgroup members. Basic and Applied Social
Psychology, 32, 173-184.
Nisbet, E. C., Cooper, K. E., & Garrett, R. K. (2015). The partisan brain: How dissonant science
messages lead conservatives and liberals to (dis) trust science. The ANNALS of the
American Academy of Political and Social Science, 658, 36-66.
Pronin, E., Lin, D. Y., & Ross, L. (2002). The bias blind spot: Perceptions of bias in self versus
others. Personality and Social Psychology Bulletin, 28, 369-381.
Sierksma, J., Spaltman, M., & Lansu, T. A. (2019). Children tell more prosocial lies in favor of
in-group than out-group peers. Developmental Psychology.
Stroud, N. J. (2010). Polarization and partisan selective exposure. Journal of Communication, 60,
Tajfel, H. & Turner, J. C. (1979). An integrative theory of intergroup conflict. The Social
Psychology of Intergroup Relations, 33, 33-47.
Tooby, J., & Cosmides, L. (2010). Groups in mind: The coalitional roots of war and
morality. Human morality and sociality: Evolutionary and comparative perspectives, 91-
Washburn, A. N., & Skitka, L. J. (2017). Science denial across the political divide: liberals and
conservatives are similarly motivated to deny attitude-inconsistent science. Social
Psychological and Personality Science, 1-9.
Wetherell, G. A., Brandt, M. J., & Reyna, C. (2013). Discrimination across the ideological
divide: The role of value violations and abstract values in discrimination by liberals and
conservatives. Social Psychological and Personality Science, 4, 658-667.
Winegard, B. M., Clark, C. J., Hasty, C. R., & Baumeister, R. F. (2018). Equalitarianism: A
source of liberal bias. Manuscript in preparation.
Recent work suggests that in addition to actual attitudes, people often have desired attitudes that can vary in their congruence with their actual attitudes. We explored whether desired attitudes motivate goal-congruent outcomes by impacting people's evaluative responses over the effects of actual attitudes. Across four studies, we demonstrated that desired attitudes independently predicted behavioral intentions (Study 1), information seeking (Study 2), information processing (Study 3), and overt behavior (Study 4). Further, consistent with the idea that desired attitudes reflect attitudinal goals, these effects were strongest among people who reported that they were highly committed to the pursuit of their desired attitudes (Studies 3 and 4). Last, meta-analyses of the effects of desired attitudes and the desired × commitment to desired attitudes interaction revealed significant evidence for these effects across the four studies. Implications of the results for research on attitudes and persuasion, motivated reasoning, and goal pursuit are discussed.