
Do Cultural Misbeliefs Cause Costly Behavior?

Abstract

Beliefs play a central role in our lives. They lie at the heart of what makes us human, they shape the organization and functioning of our minds, they define the boundaries of our culture, and they guide our motivation and behavior. Given their central importance, researchers across a number of disciplines have studied beliefs, leading to results and literatures that do not always interact. The Cognitive Science of Belief aims to integrate these disconnected lines of research to start a broader dialogue on the nature, role, and consequences of beliefs. It tackles timeless questions, as well as applications of beliefs that speak to current social issues. This multidisciplinary approach to beliefs will benefit graduate students and researchers in cognitive science, psychology, philosophy, political science, economics, and religious studies.
Do cultural misbeliefs cause costly behavior?
Hugo Mercier
Sacha Altay
Institut Jean Nicod, Département d’études cognitives, ENS, EHESS, PSL University,
CNRS, Paris France
Non-proofread version
To be published as
Mercier, H. & Altay, S. (in press) Do cultural misbeliefs cause costly behavior? In
Musolino, J., Hemmer, P. & Sommer, J. (Eds.) The Cognitive Science of Belief. Cambridge
University Press.
“Those who can make you believe absurdities can make you commit atrocities”
Voltaire (Torrey, 1961)
The epigraph above encapsulates a popular sentiment: when people hold misbeliefs (“a
false belief, or at least a belief that is not correct in all particulars,” McKay & Dennett,
2009, p. 493), personally or socially costly actions follow. The battle against fake news,
conspiracy theories, anti-vax rumors, and other popular misbeliefs is often justified in
those terms: if people believe all that hogwash, terrible consequences will ensue, as
they support demagogues, burn 5G towers, and refuse to vaccinate their children (see,
e.g., Douglas, 2021).
The link between misbelief and costly behavior is intuitively compelling for at least two
reasons:
(1) As a rule, beliefs should have behavioral consequences: this is what beliefs are
for, to guide our behaviors. Indeed, the vast majority of beliefs do have
behavioral consequences (if Luc believes it’ll rain, he’s more likely to take his
umbrella.)
(2) Costly actions are often accompanied by misbeliefs. Most Americans who
supported violent action following the results of the 2020 presidential election
believed the election to have been stolen (Pennycook & Rand, 2021). People
who burn 5G towers believe the towers will make them sick. People who refuse
to vaccinate their children believe, for instance, that vaccination causes autism.
We argue here that in fact cultural misbeliefs only rarely directly cause the costly
behaviors they are associated with--although they can have a variety of other
deleterious effects. We do not question that misbeliefs formed through perception and
individual inference can lead to costly actions: if Rosalie thinks the brakes on her car
have been fixed when in fact they haven’t, she will behave accordingly, with potentially
deadly consequences. Our focus here is on cultural misbeliefs, that is misbeliefs that
are shared by many individuals, and that result at least in part from social transmission,
as do fake news, conspiracy theories, and anti-vax misinformation (see, Sperber, 1996).
We first show that there are cases in which the link between beliefs and behavior can
be systematically broken: when people hold reflective beliefs (thereby contesting (1)).
We then argue that most cultural misbeliefs do not result in costly actions and that, even
when costly actions are accompanied by misbeliefs, the misbeliefs rarely directly cause
the actions (thereby contesting (2)).
Intuitive and reflective beliefs
Beliefs should guide our behavior. Typically, in humans at least, beliefs do so by being
free to interact with relevant inferential or action planning mechanisms. For example, if
Hiroshi believes that his baby has fallen asleep, that belief can trigger a variety of
inferences and behaviors: that he can leave the room, that he shouldn’t make too much
noise in doing so, and so on and so forth. Such beliefs have been dubbed by Dan
Sperber (Sperber, 1997; see also, Baumard & Boyer, 2013) intuitive beliefs, and we
largely follow him in the following exposition of the distinction between intuitive and
reflective beliefs.
Humans readily form metarepresentations: representations of
representations. We do so for instance when we use mentalizing (Simonetta believes
that Vittorio wants to leave the party) or ostensive communication (Simonetta believes
that Vittorio, by pretending to yawn, means that he thinks the movie is boring).
Metarepresentations allow us to have a variety of reflective attitudes towards beliefs.
Yuping can believe Xi thinks it’ll rain, even if she herself believes it’ll be sunny. Holding
such reflective attitudes towards beliefs is tremendously useful, as it allows us to
engage in mentalizing and communication without automatically endorsing the beliefs
we represent others to hold, and to practice hypothetical thinking.
If we can form skeptical reflective attitudes--Yuping can believe Xi is mistaken in
thinking it’ll rain--we can also have credal attitudes--Yuping can believe Xi is right in
thinking it’ll rain. When we hold beliefs such as “Xi is right in thinking it’ll rain,” it seems
we should also automatically form the belief “it’ll rain.” Clearly, in most cases, this is
exactly what we do; but not always.
Most of our beliefs rely on intuitive concepts (Sperber, 1997), for instance, the concept
of rain. We know what inferences to draw when we believe it’ll rain (that we’ll get wet if
we stay out, etc.). However, we can also use reflective concepts. Hamza doesn’t know much about physics, yet he accepts relativity theory. To him, concepts such as the
space-time continuum are reflective: he can form the belief “it is true that there is a
space-time continuum” while being unable to draw any inference from the belief that
there is a space-time continuum. Beliefs that rely on reflective concepts can only be
held reflectively.
Some cultural misbeliefs rely on reflective concepts: we do not have intuitive concepts
for an omnipotent being, the eternity we might spend in heaven or hell, or a single being that is in fact three consubstantial beings. In such cases, it is not surprising that beliefs
relying on these concepts remain reflective, and have no direct causal impact on our
behavior (e.g., Barrett, 1999; Barrett & Keil, 1996). Many cultural misbeliefs, however,
use intuitive concepts: that ballots were burnt, that Jewish people ritually murdered
children, that vaccines make you sick. We could form such beliefs through perception
and simple inference. Still, we presently argue that even when cultural misbeliefs rely on
intuitive concepts, they often remain reflective beliefs.
How to tell whether cultural misbeliefs are intuitive or reflective
Humans have the ability to hold beliefs reflectively, in a way that mostly insulates them
from inferential and action planning mechanisms. It is thus at least possible that
someone might hold a misbelief, and yet that this misbelief has few direct cognitive or
behavioral consequences. How can we tell whether that is the case? Several types of
evidence speak to this question.
Someone who would argue that a given misbelief is held intuitively, and not reflectively,
could point to its apparent behavioral consequences. Consider the humoral theory of
disease and the practice of bloodletting. For centuries, Western physicians have
endorsed the humoral theory of disease, and this theory appears to have driven the
practice of bloodletting--a behavior costly at least for the patients. Isn’t this
straightforward evidence that the belief in the humoral theory of disease was held
intuitively? Not necessarily. If bloodletting were practiced in cultures in which the
humoral theory of disease doesn’t exist, it would suggest that the practice of bloodletting
might be better explained by other factors. This is the case: bloodletting is or was a
popular therapy in at least a third of the world’s cultures (Miton et al., 2015). With the
exception of the Western tradition, none of these cultures entertained a belief in the
humoral theory of disease. Indeed, in many cultures very little explanation was given for
the practice of bloodletting. This does not entirely rule out the possibility that the practice of bloodletting was caused by other factors in every culture but the West, where it was due to the acceptance of the humoral theory of disease. This
interpretation, however, is obviously far-fetched. Instead, bloodletting appears to be
culturally attractive independently of any theory supporting it, theories that were instead
developed, in some cultures, to justify the practice. This removes the main argument
suggesting that the humoral theory of disease is held intuitively, as it appears that the
belief does not cause the behavior; instead the behavior (indirectly) causes the belief. A
first relevant type of evidence to tell whether a cultural misbelief is held intuitively or
reflectively thus consists in examining whether a behavior thought to be caused by this
misbelief is in fact better explained by other factors--for instance, if the behavior is cross-culturally recurrent even in the absence of that particular cultural misbelief.
The evidence above--that a given misbelief does not cause a given costly behavior--
leaves open the possibility that the misbelief is held reflectively, but it does not show it
conclusively. More direct evidence that a belief is held reflectively comes from the
observation (i) that this belief does not have the cognitive, affective, or behavioral
consequences that holding the belief intuitively does, and/or (ii) that this belief has
cognitive, affective, or behavioral consequences that holding the intuitive belief does not
have.
How can we tell what the cognitive, affective, or behavioral consequences of holding a
belief intuitively are? It depends on the belief. In some cases,
there is a strong link between belief and behavior--people who believe a pot is full of
sizzling oil consistently avoid dipping their hands in it. In other cases, the degree of
variability is much higher. Consider bystander intervention, often taken as a case study
of the power of the situation to shape the link between our belief (someone needs help)
and our actions (we help them). Early research showed that the number of people
witnessing a potential emergency situation had a strong effect on the individual
proclivity to help the victim: the more witnesses there were, the less likely each
individual was to intervene (Darley & Latané, 1968); recent research has also revealed that individual differences play a role in whether someone will help
(Hortensius & de Gelder, 2018). Although some of these behavioral differences stem
from the participants holding different beliefs (i.e. about the nature of the emergency),
even when the existence of an emergency situation is clear, behavioral responses vary.
Still, there are some clear patterns. First, nearly everyone is strongly affected by the
situation, even those who elect not to help (e.g., Hortensius & de Gelder, 2018; on the
effects of witnessing trauma more generally, see, e.g., Regehr et al., 2002). Second,
there is always a substantial share of participants that chooses to act--for instance, 62%
of participants in groups of five bystanders, in the original study (Darley & Latané,
1968). Thus, even cases taken as examples of the power of the situation to shape
behavior reveal strong regularities in the links between belief and behavior--when
people hold intuitive beliefs.
By contrast, for many misbeliefs we observe that (i) for most people who hold the
misbelief, it appears to have little or no cognitive, affective, or behavioral consequences,
and that (ii) the cognitive, affective, or behavioral consequences that do follow are not
those that would follow from holding the belief intuitively.
We will merely provide an illustration here, and review more evidence below. Consider
the belief in the so-called Pizzagate conspiracy theory, according to which high-level
Democrats were sexually abusing children in the basement of a restaurant in the suburbs of Washington, D.C. Although, according to polls, millions of Americans endorsed this conspiracy theory, for the overwhelming majority of them holding this horrendous
misbelief appeared to have little or no consequence. The exceptions are notable. One
individual--who might have held the belief intuitively--stormed the restaurant and
demanded at gunpoint that the children be freed. By contrast, all other behavioral
responses consisted in expressive behaviors, such as posting negative reviews of the
restaurant. Such behavior would be completely incongruous if the belief were held
intuitively (imagine having seen pictures of children being abused in a restaurant, and
reacting by posting a one-star review of the place). We have thus, for everyone but
maybe the lone individual who tried to free the supposed victims, either no
consequences from the belief, or consequences that would not follow if the belief were
held intuitively. This suggests that the vast majority of those who endorsed Pizzagate
did so reflectively. Note that doing so shielded these individuals from nearly all costly
consequences from holding that belief, by contrast with the single individual who might
have held it intuitively, and who served a multi-year jail sentence.
Before we move on to a brief review of the evidence regarding the status of cultural
misbeliefs, two clarifications. First, holding a belief reflectively does not mean that one is
being deceitful. When someone says they believe Pizzagate is real, and yet fails to do
anything about it, they are typically not lying--they are holding a reflective belief.
Second, how confidently a belief is held, and whether it is held intuitively or reflectively,
are orthogonal dimensions. Intuitive beliefs can be held with very little confidence (e.g. if
you think you might have recognized someone in a crowd), while reflective beliefs can
be held very confidently: a physicist’s belief in abstruse theoretical physics, or a priest’s
belief in abstract theological dogma can be absolute.
Diverse misbeliefs, common costly behaviors
As mentioned above, a crucial piece of evidence suggesting that a cultural misbelief
might be held reflectively and not intuitively is that some behavior (e.g. bloodletting)
associated with the belief in a given culture (the humoral theory of disease) can also be
found in many other cultures, even in the absence of that specific misbelief.
We used bloodletting and the humoral theory of disease as an illustration, but the
argument can be applied to other beliefs about therapies. Common therapies
nearly always take a form similar to bloodletting: removing something seen as bad from
the body, whether it is through laxatives, emetics, or even sudation (Coury, 1967). In
spite of the commonality in these practices, different cultures entertain varied, and more
or less sophisticated, beliefs justifying these practices, which suggests that these beliefs
do not play any direct causal role in explaining the success of the practices.
Still in the domain of therapies, a more contemporary example is offered by anti-
vaccination beliefs. The spread of specific anti-vaccination beliefs is often thought to
explain refusals to vaccinate (oneself or one’s children). However, extreme anti-vax
positions are strikingly regular: they are present in nearly all countries studied (always at
a very low level--typically, below 5%, see de Figueiredo et al., 2020), and they have
existed since the dawn of vaccination (e.g., Durbach, 2000). Resistance to vaccination
did not wait for the infamous Wakefield study suggesting a link between vaccination and
autism, and it has been accompanied by a wide variety of beliefs (see Figure 1). Indeed,
a study suggested that the media frenzy that initially surrounded the Wakefield study
had no effect on vaccination rates (Smith et al., 2008). It seems that there is an intuitive
resistance to vaccination (Miton & Mercier, 2015), resistance that takes an extreme form
in a few individuals, and that specific beliefs about the dangers of vaccination--whether
it is supposed to cause autism or sterility--are used to justify this resistance, but do not
directly cause it.
Figure 1. Non-exhaustive map of the variety of anti-vaccine arguments throughout the
world.
Beyond the domain of therapies, cross-culturally common practices accompanied by
very different beliefs throughout the world can be found in many domains. Maybe the
most prominent example is that of rituals: ritualized behavior (which is “recognizable by
its stereotypy, rigidity, repetition, and apparent lack of rational motivation,” Boyer &
Liénard, 2006, p. 595) is present in most, if not all, human societies (see, e.g.,
Rappaport & Rappaport, 1999). Rituals are rarely very costly, but they do require some
time and energy. Boyer and Liénard suggest that rituals are a culturally attractive
practice because they tap into an “evolved Precaution System” (2006, p. 595). If it is
true that some universal cognitive machinery largely explains the cultural success of
rituals, the myriad explanations offered for why such and such ritual must be performed
are likely rationalizations with relatively little causal power.
Another salient domain in which we observe striking cross-cultural regularities in
apparently costly behavior is that of so-called “truth-making institutions” (Mercier &
Boyer, 2020). These institutions--such as divination or the ordeal--are perceived as
delivering true statements about various issues, in particular (but not only) the ascription
of guilt. Belief in such institutions appears costly, at least for those who suffer from them
directly (e.g. the individual accused of being a witch, the proband who has to undergo
the ordeal), but also for the community at large (since these institutions do not, by
themselves, provide accurate answers or point to actual culprits). It has been suggested
that these institutions owe their success to specific beliefs, for instance a belief that God
would never punish the innocent (e.g., Leeson, 2012). However, the fact that these
institutions are found in many cultures, sharing similar features, suggests that instead
they owe their success to more universal cognitive and social mechanisms, and that the
culturally-specific beliefs attached to them are not sufficient to explain the appearance
and persistence of the institutions (Mercier & Boyer, 2020).
Consequences of holding intuitive vs. reflective beliefs
We have argued that some costly behaviors are universal, while the beliefs that justify the
behaviors vary widely (or don’t even exist at all in some cultures), suggesting that the
beliefs do not cause the behavior. Another line of argument contrasts the behavioral
consequences of comparable beliefs when held either intuitively or reflectively. We will
see that the behavioral consequences that follow from holding beliefs reflectively are (i)
weak or indirect by contrast with the strong and direct consequences that follow from
holding the same belief intuitively, and (ii) are sometimes completely different from the
consequences of holding the same belief intuitively. We will illustrate this with the
examples of rumors and conspiracy theories.
Rumors
Rumors have been famously defined as “improvised news” (Shibutani, 1966). They
flourish when there is a demand for information that official channels of communication
cannot meet. Contrary to common perception, in some contexts rumors are nearly
uniformly true (for review, see DiFonzo & Bordia, 2007)--in particular when they spread
in small networks, and when the object of the rumor is of direct practical relevance for
those who circulate it.
For example, in Iraq, rumors about Coalition airstrikes, fueled by propaganda and
disinformation campaigns, are rife. Despite the costs and uncertainty associated with
living close to the airstrikes, the closer people live to them, the less likely they are to believe false rumors about them (Silverman et al., 2021; see also, Diggory, 1956). For
these people, beliefs about the airstrikes have direct practical consequences: if the
airstrikes were indeed targeting civilians, they would need to leave the area and warn
their loved ones. They need to know the truth, as best as they can ascertain it, to draw
the appropriate inferences. On the other hand, for people living far away from the
airstrikes, incorrectly believing that airstrikes are targeting civilians does not have the
same practical consequences--it wouldn’t prompt them to move, for instance. This
suggests that false rumors are more likely to spread when they have no behavioral
consequences--and thus when they could only be held reflectively.
In other cases, however, false rumors appear to have dramatic consequences. For
instance, rumors often precede, and appear to precipitate, violent ethnic riots (Horowitz,
2001). False rumors are also associated with the resistance to beneficial practices, such
as vaccination or condom use. Finally, false rumors are rife in the political domain, notably
to defame political opponents, such as rumors that Barack Obama is a Muslim, which
might lead some people to make significant political choices on the basis of false
information. In each case, rumors are associated with costly behaviors, but do the rumors
cause these behaviors? We will argue that they only do so weakly or indirectly, suggesting
that belief in these rumors is largely reflective.
First, we notice that many false rumors appear to justify, rather than cause, behaviors. As
we saw above, vaccines have encountered resistance in most times and places, and this
resistance is nearly always accompanied by false negative rumors about vaccines. Yet,
if resistance to vaccination is a quasi-universal phenomenon (in that it is found in most
cultures), there is wide variability in the rumors used to justify the resistance, suggesting
that they mostly play a post-hoc role. The same pattern can be observed in other cases
of resistance to beneficial practices, such as condom use, which is rejected because,
variously, the condoms cause cancer, have holes, contain HIV, have worms, or have
HIV-giving worms (see, e.g., Siegler et al., 2012).
Turning to political rumors, it is also plausible that they are used to justify the dislike for a
politician, rather than cause such dislike. This has at least been shown in the case of the
rumor that Obama is a Muslim. Kim and Kim (2019) studied the causal impact of the
rumor with a longitudinal survey that captured people’s beliefs and behaviors both before
and after the rumor’s large cultural success. They found that being exposed to the rumor
increased people’s belief that Obama is a Muslim. However, this effect was “driven almost
entirely by those predisposed to dislike Obama” (p. 307). Importantly, the rumor had no
measurable effect on people’s behavior, such as their intent to vote for Obama. One could
imagine that, given the negative opinion of Muslims held by people who accepted the
rumor, if they had accepted it intuitively, it would have seriously bolstered their dislike of
Obama. That this doesn’t appear to have been the case suggests the rumor was held
reflectively.
Finally, we turn to the most dramatic example of what appear to be the costly
consequences of believing in false rumors: the rumors that precede ethnic riots. Do these
rumors play the crucial role of convincing would-be rioters of the evilness of a particular
ethnic group? Not necessarily. Instead, ethnic rumors have been hypothesized to be
signals sent to facilitate coordination (Horowitz, 2001). For collective actions to take place,
people need to solve the coordination problem: how can I know if a sufficient number of
people is willing to engage with me in a collective action? Rumors could play a central
role in solving this problem by creating common knowledge that a sufficient number of
people is willing to participate in the collective action (Mercier, 2020; Petersen, 2020).
What matters to solve the problem of coordination is not so much that people believe the
rumors but rather that they be willing to share and endorse them. As a result, instead of
the specific and accurate rumors with practical consequences we find in small networks,
these rumors tend to be very vague about who the perpetrators are, only mentioning their
ethnicity (e.g., gypsies are stealing children in white vans). This means that the rumors
are not useful guides to a specific action (e.g. arresting or exerting revenge on specific
culprits), but that they are useful justifications to aggress any member of that ethnic group.
The rumors that precede ethnic riots draw on similar tropes the world over (e.g. children being kidnapped and sometimes killed, food sources being poisoned; see Szegőfi, unpublished document), irrespective of the local circumstances. For instance, false rumors of gypsies stealing children date back to the Middle Ages in France, and have periodically resurfaced since, most recently in
2019.
Here again, we can see the contrast between intuitive and reflective beliefs. When a
member of the community is intuitively believed to have committed a crime, people take
targeted action towards that individual--whether they rely on the authorities or take
matters into their own hands. By contrast, in the case of the rumors that precede ethnic
riots, there is no intuitive connection between the purported atrocities committed by the
target group, and the reprisal, which often takes the form of looting, raping, and killing.
Moreover, such rumors tend to survive, in a more muted fashion, and without any
behavioral consequence, between episodes of ethnic riots. This suggests that rumors of
atrocities, even if they help coordinate ethnic riots, are only believed reflectively.
Conspiracy theories
Conspiracy theories are another type of misbelief that has people worried because of
their supposedly costly consequences, from attacking 5G towers to storming the U.S.
Capitol. Ironically, given that conspiracy theories are about powerful, hidden actors
exerting an outsized influence, they also worry very powerful institutions (e.g., Uscinski,
2020, p. vii). While conspiracy theories are sometimes defined as beliefs about conspiracies
that run afoul of “epistemological authorities” (such as the media, the government,
scientists, etc.) (e.g., Uscinski, 2020, p. 2), many beliefs about conspiracies (by contrast,
then, with conspiracy theories) are true--there are, indeed, (small) groups of powerful
people who conspire to turn things to their advantage. In order to argue that people’s
beliefs in conspiracy theories are usually reflective, we will contrast their consequences
with the consequences of intuitively believing that an actual conspiracy is taking place.
People sometimes notice that something appears to be amiss in their workplace.
Inconvenient documents disappear, fakes are fabricated, higher-ups act suspiciously, etc.
In such circumstances, people can develop an intuitive belief that their bosses are
engaged in some form of conspiracy. What do people do in such circumstances?
Overwhelmingly, they become anxious, and are afraid to speak up, since doing so would
jeopardize their position and potentially expose them to legal action (see, e.g., Santoro &
Kumar, 2018). As a result, whistleblowers will often try to remain anonymous (e.g. by
leaking documents to the press) or, if they speak up publicly, require guarantees from the state that they will be protected from retaliation.
The contrast with belief in conspiracy theories--which are, according to the definition
above, overwhelmingly false--is stark. As a rule, people do not hide their belief in the
conspiracy; indeed, in many cases they are very vocal about it. Some conspiracy
theorists have blogs, radio channels, and websites where they expose, together with their
identity, truths that the most powerful groups in the world are (supposedly) hiding.
Consider Infowars, a popular U.S. conspiracy website created by Alex Jones, which encourages viewers to tune in to “find out what the establishment is trying to hide” (https://banned.video/watch?id=5b92a1e6568f22455f55be2b). Jones has accused various branches of the U.S. government of horrendous conspiracies, such
as fabricating the Sandy Hook shooting as a false flag operation, or even orchestrating
9/11 as a controlled bombing. Apparently, it does not occur to him that if the U.S.
government could do things like this with impunity, they could very easily make an
inconvenient media personality disappear. What might look like courage--Jones is brave
enough to tell the truth to power--is merely a failure by Jones to hold his own beliefs
intuitively. As the journalist Jonathan Kay noted in his book on 9/11 Truthers: “one of the
great ironies of the Truth movement is that its activists typically hold their meetings in
large, unsecured locations such as college auditoriums—even as they insist that
government agents will stop at nothing to protect their conspiracy for world domination
from discovery” (Kay, 2011, p. 185). As a result of the asymmetry between how people
behave when they intuitively or reflectively believe in the existence of conspiracies, local
corporate or government malfeasance can persist for many years, with potential
whistleblowers remaining scared and silent, while beliefs in global but imaginary
conspiracies spread widely, from the JFK assassination to the moon landing.
A good case can thus be made that beliefs in conspiracy theories are usually held
reflectively. What about the apparently costly behaviors that conspiracy theories appear
to lead to? We admit that we do not have an explanation for all of these behaviors, but we can
make two observations. First, in each case, only a small minority of people supposed to
believe in the conspiracy engage in such costly behaviors. Second, even such behaviors
would not be consistent with an intuitive belief in the conspiracy: as argued above, if the
government were willing and able to poison us by installing 5G towers all over
the country, it could surely dispose of a few inconvenient rebels. In countries which are
actually ruled by ruthless governments bent on suppressing any opposition, people only
attempt such actions when they are part of a large-scale, organized movement that has
some chance of success.
Conclusion
The cultural success of misbeliefs--from fake news to conspiracy theories or beliefs in
quack remedies--is a longstanding object of concern, in large part because cultural
misbeliefs are thought to lead to behavior that is individually or socially costly. In this
chapter, we have argued that, in fact, most cultural misbeliefs are held reflectively, and
do not directly cause costly behavior. Although this is a broadly optimistic conclusion, we must note two big caveats.
First, unfortunately, many true cultural beliefs are also, we would argue, held reflectively, and have few (beneficial) behavioral consequences. For instance, belief in climate
change is rarely accompanied by commensurate action.
Second, even if misbeliefs are held reflectively, they can still be damaging in a variety of
ways: at the margin, they can help people engage in costly behavior by providing
justifications; they can help coordinate costly behaviors (as the rumors of atrocities that
precede ethnic riots); and they can lead to costly behavior if people are socially
compelled to act in a way that is perceived to be in line with their beliefs (e.g. people
might have felt pressured to take part in the riots of January 6, 2021, because they had
to walk the walk, and not just talk the talk, of widespread election fraud).
If cultural misbeliefs can have costly consequences--however indirectly--why does it
matter whether they are held intuitively or reflectively? We believe that this distinction is
crucial for both theoretical and practical reasons. Theoretically, the cognitive
mechanisms at play are very different for intuitive and reflective beliefs. Both types of
beliefs are accepted, held, and acted on through different processes (Sperber, 1997).
Practically, if it is true that cultural misbeliefs only have a limited, indirect role in
explaining costly behaviors, this suggests that attempting to stop the misbeliefs from
spreading will also only have a limited effect. Instead, the deeper causes that explain
both the spread of the misbeliefs and the costly behaviors--from ethnic antagonism to
lack of trust in government--must be addressed.
Acknowledgments
HM's work is supported by two ANR grants, to FrontCog (ANR-17-EURE-0017) and to
PSL (ANR-10-IDEX-0001-02). SA’s work is supported by a PhD grant from the Direction
Générale de L’Armement (DGA).
References
Barrett, J. L. (1999). Theological correctness: Cognitive constraint and the study of
religion. Method & Theory in the Study of Religion, 11(4), 325–339.
Barrett, J. L., & Keil, F. C. (1996). Conceptualizing a nonnatural entity:
Anthropomorphism in God concepts. Cognitive Psychology, 31(3), 219–247.
Baumard, N., & Boyer, P. (2013). Religious beliefs as reflective elaborations on
intuitions: A modified dual-process model. Current Directions in Psychological
Science, 22(4), 295–300.
Boyer, P., & Liénard, P. (2006). Why ritualized behavior? Precaution systems and
action parsing in developmental, pathological and cultural rituals. Behavioral and
Brain Sciences, 29(6), 595.
Coury, C. (1967). The basic principles of medicine in the primitive mind. Medical
History, 11(2), 111.
Darley, J. M., & Latané, B. (1968). Bystander intervention in emergencies: Diffusion of
responsibility. Journal of Personality and Social Psychology, 8(4p1), 377.
de Figueiredo, A., Simas, C., Karafillakis, E., Paterson, P., & Larson, H. J. (2020).
Mapping global trends in vaccine confidence and investigating barriers to vaccine
uptake: A large-scale retrospective temporal modelling study. The Lancet,
396(10255), 898–908.
DiFonzo, N., & Bordia, P. (2007). Rumor psychology: Social and organizational
approaches. American Psychological Association.
Diggory, J. C. (1956). Some consequences of proximity to a disease threat. Sociometry,
19(1), 47–53.
Douglas, K. M. (2021). Are conspiracy theories harmless? The Spanish Journal of
Psychology, 24.
Durbach, N. (2000). ‘They might as well brand us’: Working-class resistance to
compulsory vaccination in Victorian England. Social History of Medicine, 13(1),
45–63.
Horowitz, D. L. (2001). The deadly ethnic riot. University of California Press.
Hortensius, R., & de Gelder, B. (2018). From empathy to apathy: The bystander effect
revisited. Current Directions in Psychological Science, 27(4), 249–256.
Kay, J. (2011). Among the Truthers: A journey through America’s growing conspiracist
underground. Harper Collins.
Kim, J. W., & Kim, E. (2019). Identifying the effect of political rumor diffusion using
variations in survey timing. Quarterly Journal of Political Science, 14(3), 293–
311.
Leeson, P. T. (2012). Ordeals. The Journal of Law and Economics, 55(3), 691–714.
McKay, R. T., & Dennett, D. C. (2009). The evolution of misbelief. Behavioral and Brain
Sciences, 32(06), 493–510.
Mercier, H. (2020). Not Born Yesterday: The Science of Who We Trust and What We
Believe. Princeton University Press.
Mercier, H., & Boyer, P. (2020). Truth-making institutions: From divination, ordeals and
oaths to judicial torture and rules of evidence. Evolution and Human Behavior.
Miton, H., Claidière, N., & Mercier, H. (2015). Universal cognitive mechanisms explain
the cultural success of bloodletting. Evolution and Human Behavior, 36(4), 303–
312.
Miton, H., & Mercier, H. (2015). Cognitive obstacles to pro-vaccination beliefs. Trends In
Cognitive Sciences, 19(11), 633–636.
Pennycook, G., & Rand, D. G. (2021). Research note: Examining false beliefs about
voter fraud in the wake of the 2020 Presidential Election. Harvard Kennedy
School Misinformation Review. https://doi.org/10.37016/mr-2020-51
Petersen, M. B. (2020). The evolutionary psychology of mass mobilization: How
disinformation and demagogues coordinate rather than manipulate. Current
Opinion in Psychology, 35, 71–75.
Rappaport, R. A. (1999). Ritual and Religion in the Making of Humanity (Vol. 110).
Cambridge University Press.
Regehr, C., Goldberg, G., & Hughes, J. (2002). Exposure to human tragedy, empathy,
and trauma in ambulance paramedics. American Journal of Orthopsychiatry,
72(4), 505–513.
Santoro, D., & Kumar, M. (2018). Speaking truth to power: A theory of whistleblowing
(Vol. 6). Springer.
Shibutani, T. (1966). Improvised News: A Sociological Study of Rumor. Bobbs-Merrill
Company.
Siegler, A. J., Mbwambo, J. K., McCarty, F. A., & DiClemente, R. J. (2012). Condoms
“contain worms” and “cause HIV” in Tanzania: Negative Condom Beliefs Scale
development and implications for HIV prevention. Social Science & Medicine,
75(9), 1685–1691. https://doi.org/10.1016/j.socscimed.2012.07.010
Silverman, D., Kaltenthaler, K., & Dagher, M. (2021). Seeing Is Disbelieving: The
Depths and Limits of Factual Misinformation in War. International Studies
Quarterly.
Smith, M. J., Ellenberg, S. S., Bell, L. M., & Rubin, D. M. (2008). Media coverage of the
measles-mumps-rubella vaccine and autism controversy and its relationship to
MMR immunization rates in the United States. Pediatrics, 121(4), e836–e843.
Sperber, D. (1996). Explaining Culture: A Naturalistic Approach. Blackwell.
Sperber, D. (1997). Intuitive and reflective beliefs. Mind and Language, 12(1), 67–83.
Szegőfi, Á. (unpublished document). Blood Libels as Evolved Coalition Signals.
Torrey, N. L. (1961). Les Philosophes. The Philosophers of the Enlightenment and
Modern Democracy. Capricorn Books.
Uscinski, J. E. (2020). Conspiracy theories: A primer. Rowman & Littlefield Publishers.