Tribal bias from the wild to the laboratory
By Cory J Clark
“War is older than the human species. It is found in every region of the world, among all the
branches of humankind. It is found throughout human history, deeply and densely woven into its
causal tapestry. It is found in all eras, and in earlier periods no less than later… War is reflected
in the most fundamental features of human social life.”
Tooby & Cosmides, 2010, p. 191
Human evolution has been powerfully shaped by our history of war and intergroup conflict. For
groups to survive, reproduce, and pass their genes on to later generations of humans, they had to
coordinate and cooperate within their own groups in order to defeat other groups. Because group
loyalty and commitment were so important for group survival, humans would have evolved to
reward loyal and cooperative members of their ingroup—those who fervidly support the group’s
cause and contribute to the group’s success. Humans also would have evolved to punish and
ostracize disloyal members of the group—those who oppose or harm the goals of the group. In
these environments, where loyalty was rewarded with status and resources (and perhaps more
crucially, where disloyalty was highly costly), individuals would have evolved traits that enhance
coalitional commitment and tendencies to signal those traits to other group members. Modern
humans evolved from these highly group-loyal and cooperative individuals. In other words,
human evolution selected for traits that enhance and signal ingroup loyalty.
Group loyalty and cooperation sound like positive human tendencies—and they certainly can be
positive. But these group commitments can also lead to more problematic psychological
propensities, particularly when the goal is intergroup cooperation and coordination.
One of these propensities is ingroup favoritism. Indeed, even children arbitrarily assigned to
relatively meaningless groups (e.g., red t-shirts or blue t-shirts) display tendencies to be more
favorable toward ingroup than outgroup members. Ingroup favoritism comes in many varieties.
People not only like members of their own ingroup (whether a religious, political, or ethnic
ingroup) better than members of other groups; they also tend to treat the ideas and behaviours of
ingroup members more favorably than those exact same ideas and behaviours from outgroup
members. For example, the immoral actions of a political ingroup member are evaluated as more
morally permissible than those exact same immoral actions when performed by a political
outgroup member.
But another, and perhaps more problematic, tribal psychological tendency is ingroup bias, or
tribal bias. Tribal biases concern not merely how we feel about other people or how we evaluate
individual actions, but how we evaluate empirical information. Our apparent pursuit of truth and
understanding of human nature and the world can be warped by our desires to conform to the
ingroup and to gain status within it.
There are two primary levels of these tribal biases. The first is called selective exposure.
Selective exposure refers to how we approach and avoid information in the world. Specifically,
people have tendencies to seek out information that supports their group’s goals and to avoid
information that opposes their group’s goals. An immigration restrictionist might seek out
information regarding the downfalls and costs of relatively open borders and avoid information
regarding the benefits of more liberal immigration policies, whereas a proponent of immigration
might avoid information regarding the costs and challenges of immigration and seek out
information on the benefits.
Perhaps the most obvious way in which people engage in selective exposure is in their media
consumption. They are inclined to read the newspapers, watch the news programmes, and visit
the news websites that support the beliefs of their ingroup and are similarly inclined to avoid any
media that opposes their group’s beliefs. But they also engage in selective exposure in their
social worlds. We choose to be friends with users on Facebook, on Twitter, and in the real world,
who are part of our various ingroups and who tend to exchange information with us that
conforms to our group’s beliefs. We selectively put ourselves in media environments and social
environments that are likely to bolster our pre-existing beliefs about the world—the beliefs that
support our ingroup—and selectively avoid media and social environments that put us at risk of
confronting discordant information. Thus, we are exposing ourselves and others to a biased set of
potentially relevant information.
The second level of tribal bias occurs after people are exposed to new information (whether they
purposefully sought out that information or failed to avoid it). The two critical biases here are
called motivated scepticism and motivated credulity. Motivated scepticism refers to the tendency
for people to be highly critical and unaccepting of information that opposes their own group’s
interests, whereas motivated credulity refers to the tendency for people to be highly credulous
toward and uncritical of very similar information that supports their own group’s interests.
A classic example of these tendencies comes from a 1979 paper by the social psychologists
Lord, Ross, and Lepper. These researchers had death penalty proponents and opponents evaluate the
scientific methods of an ostensibly real study that tested whether the death penalty does or does
not appear to deter crime. For example, participants read a paragraph describing a scientific
study that compared murder rates in 14 states in the United States before and after the adoption
of capital punishment. In one experimental condition (which half of the participants saw), the
study results demonstrated that murder rates were lower after adoption of the death penalty and
thus supported the deterrent efficacy of the death penalty; in the other experimental condition
(which the other half of the participants saw), the study results demonstrated that murder rates
were higher after adoption of the death penalty and thus opposed the deterrent efficacy of the
death penalty.
Participants were then asked to evaluate how well-conducted the study was. Note that the study
was conducted in exactly the same way in the two experimental conditions, and only the results
of the study differed. Lord, Ross, and Lepper found that proponents of the death penalty
evaluated the study as better conducted when the results indicated that the death penalty does
deter homicide than when the results opposed its deterrent efficacy. The exact opposite pattern
was observed for opponents of the death penalty: they evaluated the study as better conducted
when the results indicated that the death penalty does not deter homicide than when the results
supported its deterrent efficacy. Thus, when people are exposed to new information, they tend to
discount that information as low quality when it challenges the beliefs of their political ingroup.
These types of biases appear to be particularly problematic in the political sphere. Politics is one
of the most salient forms of modern tribal conflict. We generally no longer kill our tribal
opponents, but we argue and debate in an effort to advance the success of our own political
ingroup and to squash our political opponents.
There are at least three main reasons why the political sphere elicits substantial tribal biases.
(1) Political arguments are highly consequential. Many political disagreements centre on who
should receive status and resources within society (for example, who should receive welfare
benefits and who should pay for them), and so group success is very important for individual
success. There is a strong motivation to win political disagreements.
(2) Political disagreements are often morally significant. Moral disagreements signal an
unwillingness to conform to the same rules—rules that are often set in place to advance the
interests of the ingroup (or to counteract its disadvantages). For example, gun control is a major source
of political conflict in the United States. When Democrats, who are less likely to own guns than
Republicans, morally oppose certain gun rights, and Republicans reject this moral opposition,
Republicans signal an unwillingness to follow the same rules as Democrats (specifically, to give
up certain rights to gun ownership). Moreover, this results in at least one sort of power imbalance
between the two groups: in the US, Republicans are more armed than Democrats. This,
understandably, creates conflict between the two groups.
(3) Lastly, ambiguity exacerbates bias, and political issues are often, if not always, ambiguous.
Even experts disagree on many political issues. For example, experts disagree on how large pay
discrepancies are between men and women and they disagree on which factors contribute to such
discrepancies. On top of factual ambiguities, political issues often relate to opinions about what
ought to be the case based on that fuzzy understanding of the facts. One might think that
ambiguity would compel humility and open-mindedness, but when ambiguity occurs in the
context of political conflict, it appears to make people more biased and more dogmatic. Why?
Because there is more room for argument. People do not argue about obvious truths. It would be
pretty challenging to argue over whether animals must be killed to obtain meat, but it is easier to
argue about the costs and benefits of meat consumption. These costs and benefits are difficult to quantify; even
animal experts do not know how animals experience physical and emotional pain; even health
experts do not know whether there are long-term health consequences of avoiding animal
products. Given these unknowns, it is even easier to argue about whether the difficult-to-quantify
costs outweigh the difficult-to-quantify benefits. Nobody knows the answer for certain, so
compelling arguments can make the difference between whether one’s preferred policy is
supported or opposed.
Thus far, I have suggested that tribal conflict and biases are a fundamental feature of politics
because humans share an evolutionary history of intergroup conflict—and politics is the most
salient modern form of intergroup conflict. However, the social sciences have long emphasized
the shortcomings of more right-leaning or politically conservative ideologies, arguing that the
cognitive tendencies of political conservatives (e.g., threat avoidance, cognitive rigidity) likely
predict more bias in conservatives relative to liberals. In recent years, however, the social
sciences have been criticized for their left-leaning political homogeneity. Nearly all social
scientists identify as political liberals. It is possible, then, that the overemphasis on the flaws and
biases of conservatives in the social sciences is merely a reflection of the ingroup biases of a
left-leaning field. In other words, the very scientists who have been exploring political biases may
have mischaracterized the cognitive tendencies of political conservatives due to their own
political biases.
This possibility inspired my colleagues and me to test whether conservatives are indeed more
biased than liberals, as many scholars have contended, or whether liberals and conservatives are
more similar in their ingroup bias tendencies. We conducted two studies in the United States.
First, we simply asked Republicans and Democrats whether Republicans or Democrats are more
biased. The results displayed in the table below demonstrated that Republicans reported that
Democrats are more biased and Democrats reported that Republicans are more biased. This is a nice
foreshadowing of the results to come.
We then conducted a meta-analysis of political bias research. A meta-analysis essentially
combines the results of all studies that measure a particular effect. In this case, we combined the
results of studies that were very similar to the classic Lord, Ross, and Lepper study described
above. We combined the results of 51 separate studies that presented political partisans with
virtually identical information with conclusions that either supported or opposed their political
ingroup and then asked participants to evaluate the validity of that information. Topics spanned
dozens of political issues, including capital punishment, gun control, abortion,
welfare, healthcare, climate change, same-sex marriage, affirmative action, immigration,
education, taxes, and marijuana. When all the results were combined and averaged, we found
near-perfect symmetry between liberals and conservatives. That is, across 51 studies, both
liberals and conservatives evaluated information that supports their political ingroup as more
valid than that exact same information when it opposes their political ingroup, and to virtually
equal degrees.
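The pooling step of such a meta-analysis can be sketched as a minimal inverse-variance (fixed-effect) average, in which each study's effect size is weighted by the precision of its estimate. The effect sizes, standard errors, and function name below are hypothetical illustrations, not data or code from the actual 51 studies.

```python
import math

def fixed_effect_meta(effects, std_errors):
    # Weight each study's effect size by the inverse of its sampling
    # variance, so more precise studies count for more in the average.
    weights = [1.0 / se ** 2 for se in std_errors]
    pooled = sum(w * d for w, d in zip(weights, effects)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    return pooled, pooled_se

# Hypothetical standardized effect sizes (Cohen's d), where positive
# values mean "own-side information was rated as more valid".
liberal_effects = [0.40, 0.55, 0.30]
conservative_effects = [0.45, 0.50, 0.32]
std_errors = [0.10, 0.12, 0.08]

lib_d, _ = fixed_effect_meta(liberal_effects, std_errors)
con_d, _ = fixed_effect_meta(conservative_effects, std_errors)
print(round(lib_d, 2), round(con_d, 2))
```

With these made-up inputs, both pooled effects come out positive and nearly identical, which is the shape of the symmetry described above. A real meta-analysis would typically involve random-effects models and moderator analyses; this toy example only illustrates the weighted-average idea.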
In recent years, inspired by the realization that political homogeneity in the social sciences may
have biased conclusions about conservatives, many scholars have now been probing other
domains of ingroup favoritism among liberals and conservatives. The emerging trend has been
that liberals and conservatives are far more similar in their ingroup tendencies than previously
thought. Whereas social scientists long thought support for authoritarianism was mainly a feature
of right-wing ideologies, more recent work has found that left-wing authoritarianism both exists
and predicts left-wing prejudices against right-wing targets. Similarly, scholars long thought that
conservatives were particularly intolerant toward outgroups, but more recently, researchers have
expanded the set of outgroups considered to include groups that liberals oppose, and found that
liberals are just as intolerant and prejudiced as conservatives. Many scholars had also suggested
that conservatives are particularly avoidant and sceptical of scientific conclusions that oppose
their political positions, but more recent work suggests that liberals and conservatives are
similarly prone to avoidance and denial of dissonant scientific findings.
Given the shared human evolutionary histories of individuals within different political groups
and parties, it seems plausible that all political groups and parties would be similarly tribal and
similarly prone to concomitant tribal biases. If the members of a particular political group
were not tribal, it seems unlikely that they would be able to compete with and survive alongside
other political groups with more loyal and tribal group members.
It may seem puzzling, then, that scholars have long placed far greater emphasis on the tribal
cognitive tendencies among political conservatives. However, the politically liberal homogeneity
of the social sciences suggests one possible explanation: Perhaps the very people attempting to
measure political biases have biases of their own. Specifically, because social scientists who
study bias are politically liberal, they may have been unable to perceive the biases of their own
ingroup. Indeed, research has shown that people struggle to perceive their own biases. For these
reasons, we should be somewhat sceptical of social scientific claims that the opposing political
group is relatively flawed or has particularly unflattering cognitive tendencies. And perhaps, we
should be somewhat sceptical of social scientific claims in general.
Tribalism is a natural and quite possibly an ineradicable element of human social groups. It is
likely that all groups, and all people within those groups, are susceptible to tribalism and tribal
biases. It is usually quite easy to convince people of this point. But people seem easily convinced
because they consider this claim as it relates to other groups and other people, and not to their
own group or to their own self. People seem to think to themselves, “Ah, yes, this explains why
that other group has such ludicrous beliefs!” But tribalism and tribal biases do not only affect our
outgroups. They likely affect all groups: mine, yours, and everyone else’s. We are all humans,
and thus we are all susceptible to these kinds of biases.
Tribalism and tribal biases are not all bad. Group loyalty and commitment can be beautiful
things, and biases often serve as useful heuristics. However, they can be toxic when the goal is to
make compromises between groups, which is so often the goal of modern groups and
governments. Moreover, and more troubling, tribalism and tribal biases can steer us away from a
true and accurate understanding of human nature. For those of us who care deeply about truth
and understanding, we should be vigilant in acknowledging and combating these tendencies in
our own groups and in ourselves.
*This article is a chapter forthcoming in Past and Present, based on a talk given at the
Engelsberg Seminar in Engelsberg, Sweden.
References
Berreby, D. (2005). Us and them: Understanding your tribal mind. New York, NY: Time Warner
Book Group.
Brandt, M. J., Reyna, C., Chambers, J. R., Crawford, J. T., & Wetherell, G. (2014). The
ideological-conflict hypothesis: Intolerance among both liberals and
conservatives. Current Directions in Psychological Science, 23, 27-34.
Claassen, R. L., & Ensley, M. J. (2016). Motivated reasoning and yard-sign-stealing partisans:
Mine is a likable rogue, yours is a degenerate criminal. Political Behavior, 38, 317-335.
Clark, C. J., Liu, B. S., Winegard, B. M., & Ditto, P. H. (2019). Tribalism is human nature.
Current Directions in Psychological Science.
Conway, L. G., Houck, S. C., Gornick, L. J., & Repke, M. A. (2018). Finding the Loch Ness
monster: Left-wing authoritarianism in the United States. Political Psychology, 39,
1049-1067.
DeMarree, K. G., Clark, C. J., Wheeler, S. C., Briñol, P., & Petty, R. E. (2017). On the pursuit of
desired attitudes: Wanting a different attitude affects information processing and
behavior. Journal of Experimental Social Psychology, 70, 129-142.
Ditto, P. H., Clark, C. J., Liu, B. S., Wojcik, S. P., Chen, E. E., Grady, R. H., ... & Zinger, J. F.
(2019). Partisan bias and its discontents. Perspectives on Psychological Science, 14, 304-
316.
Ditto, P. H., Liu, B., Clark, C. J., Wojcik, S. P., Chen, E. E., Grady, R. H., Celniker, J. & Zinger,
J. F. (2018). At least bias is bipartisan: A meta-analytic comparison of partisan bias in
liberals and conservatives. Perspectives on Psychological Science, 14, 273-291.
Duarte, J. L., Crawford, J. T., Stern, C., Haidt, J., Jussim, L., & Tetlock, P. E. (2015). Political
diversity will improve social psychological science. Behavioral and Brain Sciences, 38,
1-13.
Geary, D. C. (2005) Origin of mind: evolution of brain, cognition, and general intelligence.
Washington DC: American Psychological Association.
Jost, J. T., Glaser, J., Kruglanski, A. W., & Sulloway, F. J. (2003). Political conservatism as
motivated social cognition. Psychological Bulletin, 129, 339-375.
Lord, C. G., Ross, L., & Lepper, M. R. (1979). Biased assimilation and attitude polarization: The
effects of prior theories on subsequently considered evidence. Journal of Personality and
Social Psychology, 37, 2098-2109.
Munro, G. D., Weih, C., & Tsai, J. (2010). Motivated suspicion: Asymmetrical attributions of the
behavior of political ingroup and outgroup members. Basic and Applied Social
Psychology, 32, 173-184.
Nisbet, E. C., Cooper, K. E., & Garrett, R. K. (2015). The partisan brain: How dissonant science
messages lead conservatives and liberals to (dis)trust science. The ANNALS of the
American Academy of Political and Social Science, 658, 36-66.
Pronin, E., Lin, D. Y., & Ross, L. (2002). The bias blind spot: Perceptions of bias in self versus
others. Personality and Social Psychology Bulletin, 28, 369-381.
Sierksma, J., Spaltman, M., & Lansu, T. A. (2019). Children tell more prosocial lies in favor of
in-group than out-group peers. Developmental Psychology.
Stroud, N. J. (2010). Polarization and partisan selective exposure. Journal of Communication, 60,
556-576.
Tajfel, H. & Turner, J. C. (1979). An integrative theory of intergroup conflict. The Social
Psychology of Intergroup Relations, 33, 33-47.
Tooby, J., & Cosmides, L. (2010). Groups in mind: The coalitional roots of war and
morality. Human morality and sociality: Evolutionary and comparative perspectives, 91-
234.
Washburn, A. N., & Skitka, L. J. (2017). Science denial across the political divide: Liberals and
conservatives are similarly motivated to deny attitude-inconsistent science. Social
Psychological and Personality Science, 1-9.
Wetherell, G. A., Brandt, M. J., & Reyna, C. (2013). Discrimination across the ideological
divide: The role of value violations and abstract values in discrimination by liberals and
conservatives. Social Psychological and Personality Science, 4, 658-667.
Winegard, B. M., Clark, C. J., Hasty, C. R., & Baumeister, R. F. (2018). Equalitarianism: A
source of liberal bias. Manuscript in preparation.