Do cultural misbeliefs cause costly behavior?
Hugo Mercier
Sacha Altay
Institut Jean Nicod, Département d’études cognitives, ENS, EHESS, PSL University,
CNRS, Paris France
Non-proofread version
To be published as
Mercier, H. & Altay, S. (in press) Do cultural misbeliefs cause costly behavior? In
Musolino, J., Hemmer, P. & Sommer, J. (Eds.) The Science of Beliefs. Cambridge
University Press.
“Those who can make you believe absurdities can make you commit atrocities”
Voltaire (Torrey, 1961)
The epigraph above encapsulates a popular sentiment: when people hold misbeliefs (“a
false belief, or at least a belief that is not correct in all particulars,” McKay & Dennett,
2009, p. 493), personally or socially costly actions follow. The battle against fake news,
conspiracy theories, anti-vax rumors, and other popular misbeliefs is often justified in
those terms: if people believe all that hogwash, terrible consequences will ensue, as
they support demagogues, burn 5G towers, and refuse to vaccinate their children (see,
e.g., Douglas, 2021).
The link between misbelief and costly behavior is intuitively compelling for at least two
reasons:
(1) As a rule, beliefs should have behavioral consequences: this is what beliefs are
for, to guide our behaviors. Indeed, the vast majority of beliefs do have
behavioral consequences (if Luc believes it’ll rain, he’s more likely to take his
umbrella).
(2) Costly actions are often accompanied by misbeliefs. Most Americans who
supported violent action following the results of the 2020 presidential election
believed the election to have been stolen (Pennycook & Rand, 2021). People
who burn 5G towers believe the towers will make them sick. People who refuse
to vaccinate their children believe, for instance, that vaccination causes autism.
We argue here that in fact cultural misbeliefs only rarely directly cause the costly
behaviors they are associated with--although they can have a variety of other
deleterious effects. We do not question that misbeliefs formed through perception and
individual inference can lead to costly actions: if Rosalie thinks the brakes on her car
have been fixed, when in fact they haven't, she will behave accordingly, with potentially
deadly consequences. Our focus here is on cultural misbeliefs, that is misbeliefs that
are shared by many individuals, and that result at least in part from social transmission,
as do fake news, conspiracy theories, and anti-vax misinformation (see Sperber, 1996).
We first show that there are cases in which the link between beliefs and behavior can
be systematically broken: when people hold reflective beliefs (thereby contesting (1)).
We then argue that most cultural misbeliefs do not result in costly actions and that, even
when costly actions are accompanied by misbeliefs, the misbeliefs rarely directly cause
the actions (thereby contesting (2)).
Intuitive and reflective beliefs
Beliefs should guide our behavior. Typically, in humans at least, beliefs do so by being
free to interact with relevant inferential or action planning mechanisms. For example, if
Hiroshi believes that his baby has fallen asleep, that belief can trigger a variety of
inferences and behaviors: that he can leave the room, that he shouldn’t make too much
noise in doing so, and so on and so forth. Such beliefs have been dubbed by Dan
Sperber (Sperber, 1997; see also, Baumard & Boyer, 2013) intuitive beliefs, and we
largely follow him in the following exposition of the distinction between intuitive and
reflective beliefs.
Humans are adept at forming metarepresentations: representations of
representations. We do so for instance when we use mentalizing (Simonetta believes
that Vittorio wants to leave the party) or ostensive communication (Simonetta believes
that Vittorio, by pretending to yawn, means that he thinks the movie is boring).
Metarepresentations allow us to have a variety of reflective attitudes towards beliefs.
Yuping can believe Xi thinks it’ll rain, even if she herself believes it’ll be sunny. Holding
such reflective attitudes towards beliefs is tremendously useful, as it allows us to
engage in mentalizing and communication without automatically endorsing the beliefs
we represent others to hold, and to practice hypothetical thinking.
If we can form skeptical reflective attitudes--Yuping can believe Xi is mistaken in
thinking it’ll rain--we can also have credal attitudes--Yuping can believe Xi is right in
thinking it’ll rain. When we hold beliefs such as “Xi is right in thinking it’ll rain,” it seems
we should also automatically form the belief “it’ll rain.” Clearly, in most cases, this is
exactly what we do; but not always.
Most of our beliefs rely on intuitive concepts (Sperber, 1997), for instance, the concept
of rain. We know what inferences to draw when we believe it’ll rain (that we’ll get wet if
we stay out, etc.). However, we can also use reflective concepts. Hamza doesn’t know
much about physics, yet he accepts relativity theory. To him, concepts such as the
space-time continuum are reflective: he can form the belief “it is true that there is a
space-time continuum” while being unable to draw any inference from the belief that
there is a space-time continuum. Beliefs that rely on reflective concepts can only be
held reflectively.
Some cultural misbeliefs rely on reflective concepts: we do not have intuitive concepts
for an omnipotent being, the eternity we might spend in heaven or hell, or one being
that is in fact three consubstantial beings. In such cases, it is not surprising that beliefs
relying on these concepts remain reflective, and have no direct causal impact on our
behavior (e.g., Barrett, 1999; Barrett & Keil, 1996). Many cultural misbeliefs, however,
use intuitive concepts: that ballots were burnt, that Jewish people ritually murdered
children, that vaccines make you sick. We could form such beliefs through perception
and simple inference. Still, we presently argue that even when cultural misbeliefs rely on
intuitive concepts, they often remain reflective beliefs.
How to tell whether cultural misbeliefs are intuitive or reflective
Humans have the ability to hold beliefs reflectively, in a way that mostly insulates them
from inferential and action planning mechanisms. It is thus at least possible that
someone might hold a misbelief, and yet that this misbelief has few direct cognitive or
behavioral consequences. How can we tell whether that is the case? Several types of
evidence speak to this question.
Someone who would argue that a given misbelief is held intuitively, and not reflectively,
could point to its apparent behavioral consequences. Consider the humoral theory of
disease and the practice of bloodletting. For centuries, Western physicians
endorsed the humoral theory of disease, and this theory appears to have driven the
practice of bloodletting--a behavior costly at least for the patients. Isn’t this
straightforward evidence that the belief in the humoral theory of disease was held
intuitively? Not necessarily. If bloodletting were practiced in cultures in which the
humoral theory of disease doesn’t exist, it would suggest that the practice of bloodletting
might be better explained by other factors. This is the case: bloodletting is or was a
popular therapy in at least a third of the world’s cultures (Miton et al., 2015). With the
exception of the Western tradition, none of these cultures entertained a belief in the
humoral theory of disease. Indeed, in many cultures very little explanation was given for
the practice of bloodletting. This does not entirely rule out the possibility that the
practice of bloodletting was caused by other factors in every culture but the West,
where it was due to the acceptance of the humoral theory of disease. This
interpretation, however, is obviously far-fetched. Instead, bloodletting appears to be
culturally attractive independently of any supporting theory; such theories were instead
developed, in some cultures, to justify the practice. This removes the main argument
suggesting that the humoral theory of disease is held intuitively, as it appears that the
belief does not cause the behavior; instead the behavior (indirectly) causes the belief. A
first relevant type of evidence to tell whether a cultural misbelief is held intuitively or
reflectively thus consists in examining whether a behavior thought to be caused by this
misbelief is in fact better explained by other factors--for instance if the behavior is cross-
culturally recurrent even in the absence of any individual cultural misbelief.
The evidence above--that a given misbelief does not cause a given costly behavior--
leaves open the possibility that the misbelief is held reflectively, but it does not show it
conclusively. More direct evidence that a belief is held reflectively comes from the
observation (i) that this belief does not have the cognitive, affective, or behavioral
consequences that holding the belief intuitively does, and/or (ii) that this belief has
cognitive, affective, or behavioral consequences that holding the belief intuitively does
not have.
How can we tell what the cognitive, affective, or behavioral consequences of holding a
belief intuitively are? The degree of regularity varies. In some cases,
there is a strong link between belief and behavior--people who believe a pot is full of
sizzling oil consistently avoid dipping their hands in it. In other cases, the degree of
variability is much higher. Consider bystander intervention, often taken as a case study
of the power of the situation to shape the link between our belief (someone needs help)
and our actions (we help them). Early research showed that the number of people
witnessing a potential emergency situation had a strong effect on the individual
proclivity to help the victim: the more witnesses there were, the less likely each
individual was to intervene (Darley & Latané, 1968); recent research has also revealed
that some interpersonal differences play a role in deciding whether someone will help
(Hortensius & de Gelder, 2018). Although some of these behavioral differences stem
from the participants holding different beliefs (i.e. about the nature of the emergency),
even when the existence of an emergency situation is clear, behavioral responses vary.
Still, there are some clear patterns. First, nearly everyone is strongly affected by the
situation, even those who elect not to help (e.g., Hortensius & de Gelder, 2018; on the
effects of witnessing trauma more generally, see, e.g., Regehr et al., 2002). Second,
there is always a substantial share of participants who choose to act--for instance, 62%
of participants in groups of five bystanders, in the original study (Darley & Latané,
1968). Thus, even cases taken as examples of the power of the situation to shape
behavior reveal strong regularities in the links between belief and behavior--when
people hold intuitive beliefs.
By contrast, for many misbeliefs we observe that (i) for most people who hold the
misbelief, it appears to have little or no cognitive, affective, or behavioral consequences,
and that (ii) the cognitive, affective, or behavioral consequences that do follow are not
those that would follow from holding the belief intuitively.
We will merely provide an illustration here, and review more evidence below. Consider
the belief in the so-called Pizzagate conspiracy theory, according to which high-level
Democrats were sexually abusing children in the basement of a restaurant in the
suburbs of Washington D.C. Although, according to polls, millions of Americans
endorsed this conspiracy, for the overwhelming majority of them holding this horrendous
misbelief appeared to have little or no consequence. The exceptions are notable. One
individual--who might have held the belief intuitively--stormed the restaurant and
demanded at gunpoint that the children be freed. By contrast, all other behavioral
responses consisted in expressive behaviors, such as posting negative reviews of the
restaurant. Such behavior would be completely incongruous if the belief were held
intuitively (imagine having seen pictures of children being abused in a restaurant, and
reacting by posting a one star review of the place). We have thus, for everyone but
maybe the lone individual who tried to free the supposed victims, either no
consequences from the belief, or consequences that would not follow if the belief were
held intuitively. This suggests that the vast majority of those who endorsed Pizzagate
did so reflectively. Note that doing so shielded these individuals from nearly all costly
consequences from holding that belief, by contrast with the single individual who might
have held it intuitively, and who served a multi-year jail sentence.
Before we move on to a brief review of the evidence regarding the status of cultural
misbeliefs, two clarifications. First, holding a belief reflectively does not mean that one is
being deceitful. When someone says they believe Pizzagate is real, and yet fails to do
anything about it, they are typically not lying--they are holding a reflective belief.
Second, how confidently a belief is held, and whether it is held intuitively or reflectively,
are orthogonal dimensions. Intuitive beliefs can be held with very little confidence (e.g. if
you think you might have recognized someone in a crowd), while reflective beliefs can
be held very confidently: a physicist’s belief in abstruse theoretical physics, or a priest’s
belief in abstract theological dogma can be absolute.
Diverse misbeliefs, common costly behaviors
As mentioned above, a crucial piece of evidence suggesting that a cultural misbelief
might be held reflectively and not intuitively is that some behavior (e.g. bloodletting)
associated with the belief in a given culture (the humoral theory of disease) can also be
found in many other cultures, even in the absence of that specific misbelief.
We used bloodletting and the humoral theory of disease as an illustration, but the
argument can be applied to other beliefs about therapies. Common forms of therapies
nearly always take a form similar to bloodletting: removing something seen as bad from
the body, whether it is through laxatives, emetics, or even sudation (Coury, 1967). In
spite of the commonality in these practices, different cultures entertain varied, and more
or less sophisticated, beliefs justifying these practices, which suggests that these beliefs
do not play any direct causal role in explaining the success of the practices.
Still in the domain of therapies, a more contemporary example is offered by anti-
vaccination beliefs. The spread of specific anti-vaccination beliefs is often thought to
explain refusals to vaccinate (oneself or one’s children). However, extreme anti-vax
positions are strikingly regular: they are present in nearly all countries studied (always at
a very low level--typically, below 5%, see de Figueiredo et al., 2020), and they have
existed since the dawn of vaccination (e.g., Durbach, 2000). Resistance to vaccination
did not wait for the infamous Wakefield study suggesting a link between vaccination and
autism, and it has been accompanied by a wide variety of beliefs (see Figure 1). Indeed,
a study suggested that the media frenzy that initially surrounded the Wakefield study
had no effect on vaccination rates (Smith et al., 2008). It seems that there is an intuitive
resistance to vaccination (Miton & Mercier, 2015), resistance that takes an extreme form
in a few individuals, and that specific beliefs about the dangers of vaccination--whether
it is supposed to cause autism or sterility--are used to justify this resistance, but do not
directly cause it.
Figure 1. Non-exhaustive map of the variety of anti-vaccine arguments throughout the
world.
Beyond the domain of therapies, cross-culturally common practices accompanied by
very different beliefs throughout the world can be found in many domains. Maybe the
most prominent example is that of rituals: ritualized behavior (which is “recognizable by
its stereotypy, rigidity, repetition, and apparent lack of rational motivation,” Boyer &
Liénard, 2006, p. 595) is present in most, if not all, human societies (see, e.g.,
Rappaport & Rappaport, 1999). Rituals are rarely very costly, but they do require some
time and energy. Boyer and Liénard suggest that rituals are a culturally attractive
practice because they tap into an “evolved Precaution System” (2006, p. 595). If it is
true that some universal cognitive machinery largely explains the cultural success of
rituals, the myriad explanations offered for why such and such ritual must be performed
are likely rationalizations with relatively little causal power.
Another salient domain in which we observe striking cross-cultural regularities in
apparently costly behavior is that of so-called “truth-making institutions” (Mercier &
Boyer, 2020). These institutions--such as divination or the ordeal--are perceived as
delivering true statements about various issues, in particular (but not only) the ascription
of guilt. Belief in such institutions appears costly, at least for those who suffer from them
directly (e.g. the individual accused of being a witch, the proband who has to undergo
the ordeal), but also for the community at large (since these institutions do not, by
themselves, provide accurate answers or point to actual culprits). It has been suggested
that these institutions owe their success to specific beliefs, for instance a belief that god
would never punish the innocent (e.g., Leeson, 2012). However, the fact that these
institutions are found in many cultures, sharing similar features, suggests that instead
they owe their success to more universal cognitive and social mechanisms, and that the
culturally-specific beliefs attached to them are not sufficient to explain the appearance
and persistence of the institutions (Mercier & Boyer, 2020).
Consequences of holding intuitive vs. reflective beliefs
We have argued that some costly behaviors are universal, while the beliefs that justify the
behaviors vary widely (or don’t even exist at all in some cultures), suggesting that the
beliefs do not cause the behavior. Another line of argument contrasts the behavioral
consequences of comparable beliefs when held either intuitively or reflectively. We will
see that the behavioral consequences that follow from holding beliefs reflectively are (i)
weak or indirect by contrast with the strong and direct consequences that follow from
holding the same belief intuitively, and (ii) are sometimes completely different from the
consequences of holding the same belief intuitively. We will illustrate this with the
examples of rumors and conspiracy theories.
Rumors
Rumors have been famously defined as “improvised news” (Shibutani, 1966). They
flourish when there is a demand for information that official channels of communication
cannot meet. Contrary to the common perception, in some contexts rumors are nearly
uniformly true (for review, see DiFonzo & Bordia, 2007)--in particular when they spread
in small networks, and when the object of the rumor is of direct practical relevance for
those who circulate it.
For example, in Iraq, rumors about Coalition airstrikes, fueled by propaganda and
disinformation campaigns, are rife. Despite the costs and uncertainty associated with
living close to the airstrikes, the closer people live to them, the less likely they are
to believe false rumors about them (Silverman et al., 2021; see also, Diggory, 1956). For
these people, beliefs about the airstrikes have direct practical consequences: if the
airstrikes were indeed targeting civilians, they would need to leave the area and warn
their loved ones. They need to know the truth, as best as they can ascertain it, to draw
the appropriate inferences. On the other hand, for people living far away from the
airstrikes, incorrectly believing that airstrikes are targeting civilians does not have the
same practical consequences--it wouldn't prompt them to move, for instance. This
suggests that false rumors are more likely to spread when they have no behavioral
consequences--and thus when they could only be held reflectively.
In other cases, however, false rumors appear to have dramatic consequences. For
instance, rumors often precede, and appear to precipitate, violent ethnic riots (Horowitz,
2001). False rumors are also associated with the resistance to beneficial practices, such
as vaccination or condom use. Finally, false rumors are rife in the political domain, notably
to defame political opponents, such as rumors that Barack Obama is a Muslim, which
might lead some people to make significant political choices on the basis of false
information. In each case, rumors are associated with costly behaviors, but do the rumors
cause these behaviors? We will argue that they only do so weakly or indirectly, suggesting
that belief in these rumors is largely reflective.
First, we notice that many false rumors appear to justify, rather than cause, behaviors. As
we saw above, vaccines have encountered resistance in most times and places, and this
resistance is nearly always accompanied by false negative rumors about vaccines. Yet,
if resistance to vaccination is a quasi-universal phenomenon (in that it is found in most
cultures), there is wide variability in the rumors used to justify the resistance, suggesting
that they mostly play a post-hoc role. The same pattern can be observed in other cases
of resistance to beneficial practices, such as condom use, which is rejected because,
alternatively, the condoms cause cancer, have holes, contain HIV, have worms, or have
HIV-giving worms (see, e.g., Siegler et al., 2012).
Turning to political rumors, it is also plausible that they are used to justify the dislike for a
politician, rather than cause such dislike. This has at least been shown in the case of the
rumor that Obama is a Muslim. Kim and Kim (2019) studied the causal impact of the
rumor with a longitudinal survey that captured people’s beliefs and behaviors both before
and after the rumor’s large cultural success. They found that being exposed to the rumor
increased people’s belief that Obama is a Muslim. However, this effect was “driven almost
entirely by those predisposed to dislike Obama" (p. 307). Importantly, the rumor had no
measurable effect on people’s behavior, such as their intent to vote for Obama. One could
imagine that, given the negative opinion of Muslims held by people who accepted the
rumor, if they had accepted it intuitively, it would have seriously bolstered their dislike of
Obama. That this doesn’t appear to have been the case suggests the rumor was held
reflectively.
Finally, we turn to the most dramatic example of what appears to be the costly
consequences of believing in false rumors: the rumors that precede ethnic riots. Do these
rumors play the crucial role of convincing would-be rioters of the evilness of a particular
ethnic group? Not necessarily. Instead, ethnic rumors have been hypothesized to be
signals sent to facilitate coordination (Horowitz, 2001). For collective actions to take place,
people need to solve the coordination problem: how can I know if a sufficient number of
people is willing to engage with me in a collective action? Rumors could play a central
role in solving this problem by creating common knowledge that a sufficient number of
people is willing to participate in the collective action (Mercier, 2020; Petersen, 2020).
What matters to solve the problem of coordination is not so much that people believe the
rumors but rather that they be willing to share and endorse them. As a result, instead of
the specific and accurate rumors with practical consequences we find in small networks,
these rumors tend to be very vague about who the perpetrators are, only mentioning their
ethnicity (e.g., gypsies are stealing children in white vans). This means that the rumors
are not useful guides to a specific action (e.g. arresting or exerting revenge on specific
culprits), but that they are useful justifications to aggress any member of that ethnic group.
The rumors that precede ethnic riots rely on similar tropes the world over (e.g.
children being kidnapped and sometimes killed, food sources being poisoned; see
Szegőfi, unpublished document), irrespective of the local circumstances. For
instance, false rumors of gypsies stealing children date back
to the Middle Ages in France, and have periodically resurfaced since, most recently in
2019.
Here again, we can see the contrast between intuitive and reflective beliefs. When a
member of the community is intuitively believed to have committed a crime, people take
targeted action towards that individual--whether they rely on the authorities or take
matters into their own hands. By contrast, in the case of the rumors that precede ethnic
riots, there is no intuitive connection between the purported atrocities committed by the
target group, and the reprisal, which often takes the form of looting, raping, and killing.
Moreover, such rumors tend to survive, in a more muted fashion, and without any
behavioral consequence, between episodes of ethnic riots. This suggests that rumors of
atrocities, even if they help coordinate ethnic riots, are only believed reflectively.
Conspiracy theories
Conspiracy theories are another type of misbelief that has people worried because of
their supposedly costly consequences, from attacking 5G towers to storming the U.S.
Capitol. Ironically, given that conspiracy theories are about powerful, hidden actors
exerting an oversized influence, they also worry very powerful institutions (e.g., Uscinski,
2020, p. vii). If conspiracy theories are sometimes defined as beliefs about conspiracies
that run afoul of “epistemological authorities” (such as the media, the government,
scientists, etc.) (e.g., Uscinski, 2020, p. 2), many beliefs about conspiracies (by contrast,
then, with conspiracy theories) are true--there are, indeed, (small) groups of powerful
people who conspire to turn things to their advantage. In order to argue that people’s
beliefs in conspiracy theories are usually reflective, we will contrast their consequences
with the consequences of intuitively believing that an actual conspiracy is taking place.
People sometimes notice that something appears to be amiss in their workplace.
Inconvenient documents disappear, fakes are fabricated, higher-ups act suspiciously, etc.
In such circumstances, people can develop an intuitive belief that their bosses are
engaged in some form of conspiracy. What do people do in such circumstances?
Overwhelmingly, they become anxious, and are afraid to speak up, since doing so would
jeopardize their position and potentially expose them to legal action (see, e.g., Santoro &
Kumar, 2018). As a result, whistleblowers will often try to remain anonymous (e.g. by
leaking documents to the press), or, to speak up publicly, they require guarantees from
the state that they will be protected from retaliation.
The contrast with belief in conspiracy theories--which are, according to the definition
above, overwhelmingly false--is stark. As a rule, people do not hide their belief in the
conspiracy; indeed, in many cases they are very vocal about their belief. Some conspiracy
theorists have blogs, radio channels, and websites where they expose, together with their
identity, truths that the most powerful groups in the world are (supposedly) hiding.
Consider Infowars, a popular U.S. conspiracy website created by Alex Jones, which
encourages viewers to tune in to “find out what the establishment is trying to hide.”
(https://banned.video/watch?id=5b92a1e6568f22455f55be2b). Jones has accused various
branches of the U.S. government of horrendous conspiracies, such
as fabricating the Sandy Hook shooting as a false flag operation, or even orchestrating
9/11 as a controlled bombing. Apparently, it does not occur to him that if the U.S.
government could do things like this with impunity, it could very easily make an
inconvenient media personality disappear. What might look like courage--Jones is brave
enough to tell the truth to power--is merely a failure by Jones to hold his own beliefs
intuitively. As the journalist Jonathan Kay noted in his book on 9/11 Truthers: “one of the
great ironies of the Truth movement is that its activists typically hold their meetings in
large, unsecured locations such as college auditoriums—even as they insist that
government agents will stop at nothing to protect their conspiracy for world domination
from discovery” (Kay, 2011, p. 185). As a result of the asymmetry between how people
behave when they intuitively or reflectively believe in the existence of conspiracies, local
corporate or government malfeasance can persist for many years, with potential
whistleblowers remaining scared and silent, while beliefs in global but imaginary
conspiracies spread widely, from the JFK assassination to the moon landing.
A good case can thus be made that beliefs in conspiracy theories are usually held
reflectively. What about the apparently costly behaviors that conspiracy theories appear
to lead to? We admit not to have an explanation for all of these behaviors, but we can
make two observations. First, in each case, only a small minority of people supposed to
believe in the conspiracy engage in such costly behaviors. Second, even such behaviors
would not be coherent with an intuitive belief in the conspiracy: as argued above, if the
government were willing and able to poison us by installing 5G towers all over
the country, it could surely dispose of a few inconvenient rebels. In countries which are
actually ruled by ruthless governments bent on suppressing any opposition, people only
attempt such actions when they are part of a large-scale, organized movement that has
some chance of success.
Conclusion
The cultural success of misbeliefs--from fake news to conspiracy theories or beliefs in
quack remedies--is a longstanding object of concern, in large part because cultural
misbeliefs are thought to lead to behavior that is individually or socially costly. In this
chapter, we have argued that, in fact, most cultural misbeliefs are held reflectively, and
do not directly cause costly behavior. While this is a broadly optimistic conclusion, we
must note two important caveats.
First, unfortunately, many true cultural beliefs are also, we would argue, held reflectively
and have few (beneficial) behavioral consequences. For instance, belief in climate
change is rarely accompanied by commensurate action.
Second, even if misbeliefs are held reflectively, they can still be damaging in a variety of
ways: at the margin, they can help people engage in costly behavior by providing
justifications; they can help coordinate costly behaviors (as the rumors of atrocities that
precede ethnic riots); and they can lead to costly behavior if people are socially
compelled to act in a way that is perceived to be in line with their beliefs (e.g. people
might have felt pressured to take part in the riots of January 6 2021 because they had to
walk the walk, and not just talk the talk of widespread election fraud).
If cultural misbeliefs can have costly consequences (however indirectly), why does it
matter whether they are held intuitively or reflectively? We believe that this distinction is
crucial for both theoretical and practical reasons. Theoretically, the cognitive
mechanisms at play are very different for intuitive and reflective beliefs: the two types of
beliefs are accepted, held, and acted on through different processes (Sperber, 1997).
Practically, if it is true that cultural misbeliefs play only a limited, indirect role in
explaining costly behaviors, this suggests that attempting to stop the misbeliefs from
spreading will also have only a limited effect. Instead, the deeper causes that explain
both the spread of the misbeliefs and the costly behaviors (from ethnic antagonism to
lack of trust in government) must be addressed.
Acknowledgments
HM's work is supported by two ANR grants, to FrontCog (ANR-17-EURE-0017) and to
PSL (ANR-10-IDEX-0001-02). SA’s work is supported by a PhD grant from the Direction
Générale de L’Armement (DGA).
References
Barrett, J. L. (1999). Theological correctness: Cognitive constraint and the study of
religion. Method & Theory in the Study of Religion, 11(4), 325–339.
Barrett, J. L., & Keil, F. C. (1996). Conceptualizing a nonnatural entity:
Anthropomorphism in God concepts. Cognitive Psychology, 31(3), 219–247.
Baumard, N., & Boyer, P. (2013). Religious beliefs as reflective elaborations on
intuitions: A modified dual-process model. Current Directions in Psychological
Science, 22(4), 295–300.
Boyer, P., & Liénard, P. (2006). Why ritualized behavior? Precaution systems and
action parsing in developmental, pathological and cultural rituals. Behavioral and
Brain Sciences, 29(6), 595.
Coury, C. (1967). The basic principles of medicine in the primitive mind. Medical
History, 11(2), 111.
Darley, J. M., & Latané, B. (1968). Bystander intervention in emergencies: Diffusion of
responsibility. Journal of Personality and Social Psychology, 8(4, Pt. 1), 377.
de Figueiredo, A., Simas, C., Karafillakis, E., Paterson, P., & Larson, H. J. (2020).
Mapping global trends in vaccine confidence and investigating barriers to vaccine
uptake: A large-scale retrospective temporal modelling study. The Lancet,
396(10255), 898–908.
DiFonzo, N., & Bordia, P. (2007). Rumor psychology: Social and organizational
approaches. American Psychological Association.
Diggory, J. C. (1956). Some consequences of proximity to a disease threat. Sociometry,
19(1), 47–53.
Douglas, K. M. (2021). Are conspiracy theories harmless? The Spanish Journal of
Psychology, 24.
Durbach, N. (2000). ‘They might as well brand us’: Working-class resistance to
compulsory vaccination in Victorian England. Social History of Medicine, 13(1),
45–63.
Horowitz, D. L. (2001). The deadly ethnic riot. University of California Press.
Hortensius, R., & de Gelder, B. (2018). From empathy to apathy: The bystander effect
revisited. Current Directions in Psychological Science, 27(4), 249–256.
Kay, J. (2011). Among the Truthers: A journey through America’s growing conspiracist
underground. Harper Collins.
Kim, J. W., & Kim, E. (2019). Identifying the effect of political rumor diffusion using
variations in survey timing. Quarterly Journal of Political Science, 14(3), 293–311.
Leeson, P. T. (2012). Ordeals. The Journal of Law and Economics, 55(3), 691–714.
McKay, R. T., & Dennett, D. C. (2009). The evolution of misbelief. Behavioral and Brain
Sciences, 32(06), 493–510.
Mercier, H. (2020). Not born yesterday: The science of who we trust and what we
believe. Princeton University Press.
Mercier, H., & Boyer, P. (2020). Truth-making institutions: From divination, ordeals and
oaths to judicial torture and rules of evidence. Evolution and Human Behavior.
Miton, H., Claidière, N., & Mercier, H. (2015). Universal cognitive mechanisms explain
the cultural success of bloodletting. Evolution and Human Behavior, 36(4), 303–
312.
Miton, H., & Mercier, H. (2015). Cognitive obstacles to pro-vaccination beliefs. Trends In
Cognitive Sciences, 19(11), 633–636.
Pennycook, G., & Rand, D. G. (2021). Research note: Examining false beliefs about
voter fraud in the wake of the 2020 Presidential Election. Harvard Kennedy
School Misinformation Review. https://doi.org/10.37016/mr-2020-51
Petersen, M. B. (2020). The evolutionary psychology of mass mobilization: How
disinformation and demagogues coordinate rather than manipulate. Current
Opinion in Psychology, 35, 71–75.
Rappaport, R. A. (1999). Ritual and religion in the making of humanity (Vol. 110).
Cambridge University Press.
Regehr, C., Goldberg, G., & Hughes, J. (2002). Exposure to human tragedy, empathy,
and trauma in ambulance paramedics. American Journal of Orthopsychiatry,
72(4), 505–513.
Santoro, D., & Kumar, M. (2018). Speaking truth to power: A theory of whistleblowing
(Vol. 6). Springer.
Shibutani, T. (1966). Improvised news: A sociological study of rumor. Bobbs-Merrill
Company.
Siegler, A. J., Mbwambo, J. K., McCarty, F. A., & DiClemente, R. J. (2012). Condoms
“contain worms” and “cause HIV” in Tanzania: Negative Condom Beliefs Scale
development and implications for HIV prevention. Social Science & Medicine,
75(9), 1685–1691. https://doi.org/10.1016/j.socscimed.2012.07.010
Silverman, D., Kaltenthaler, K., & Dagher, M. (2021). Seeing is disbelieving: The depths
and limits of factual misinformation in war. International Studies Quarterly.
Smith, M. J., Ellenberg, S. S., Bell, L. M., & Rubin, D. M. (2008). Media coverage of the
measles-mumps-rubella vaccine and autism controversy and its relationship to
MMR immunization rates in the United States. Pediatrics, 121(4), e836–e843.
Sperber, D. (1996). Explaining Culture: A Naturalistic Approach. Blackwell.
Sperber, D. (1997). Intuitive and reflective beliefs. Mind and Language, 12(1), 67–83.
Szegőfi, Á. (unpublished manuscript). Blood libels as evolved coalition signals.
Torrey, N. L. (1961). Les Philosophes. The Philosophers of the Enlightenment and
Modern Democracy. Capricorn Books.
Uscinski, J. E. (2020). Conspiracy theories: A primer. Rowman & Littlefield Publishers.