Article

A hobgoblin of large minds: Troubles with consistency in belief


Abstract

Beliefs are, in many ways, central to psychology and, in turn, consistency is central to belief. Theories in philosophy and psychology assume that beliefs must be consistent with each other for people to be rational. That people fail to hold fully consistent beliefs has, therefore, been the subject of much theorizing, with numerous mechanisms proposed to explain how inconsistency is possible. Despite the widespread assumption of consistency as a default, achieving a consistent set of beliefs is computationally intractable. We review research on consistency in philosophy and psychology and argue that it is consistency, not inconsistency, that requires explanation. We discuss evidence from the attitude, belief, and persuasion literatures, which suggests that accessibility of beliefs in memory is one possible mechanism for achieving a limited, but psychologically plausible, form of consistency. Finally, we conclude by suggesting future directions for research beginning from the assumption of inconsistency as the default.

This article is categorized under:
Psychology > Reasoning and Decision Making
Psychology > Theory and Methods
Philosophy > Knowledge and Belief

Consistency among beliefs is a hallmark of rationality. However, we argue that achieving full consistency is so difficult that it cannot be accomplished by a human mind. Instead, the mind must rely on heuristics, which means that people can be only partially consistent.
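
The intractability claim can be made concrete with a minimal sketch (ours, not the authors'): deciding whether a set of propositional beliefs is even jointly satisfiable is equivalent to Boolean satisfiability, and brute-force checking requires examining up to 2^n truth assignments over n atomic propositions. The function and toy beliefs below are hypothetical illustrations.

```python
# Illustrative sketch (not from the article): brute-force consistency checking
# over n atomic propositions examines up to 2**n truth assignments. The general
# problem is equivalent to Boolean satisfiability (SAT), which is NP-complete,
# so no known algorithm scales polynomially with the size of the belief set.
from itertools import product

def consistent(beliefs, atoms):
    """Return True if some truth assignment satisfies every belief.

    `beliefs` is a list of functions mapping an assignment (a dict from atom
    name to bool) to True/False; `atoms` lists the atomic proposition names.
    """
    for values in product([True, False], repeat=len(atoms)):  # 2**n cases
        assignment = dict(zip(atoms, values))
        if all(believe(assignment) for believe in beliefs):
            return True
    return False

# Three toy beliefs over two atoms: p, p -> q, and not q.
atoms = ["p", "q"]
beliefs = [
    lambda a: a["p"],                  # believe p
    lambda a: (not a["p"]) or a["q"],  # believe p -> q
    lambda a: not a["q"],              # believe not q
]
print(consistent(beliefs, atoms))  # False: the set is jointly inconsistent
```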

Article
Full-text available
Prior research has found mixed evidence for the long-theorized link between religiosity and pro-social behavior. To help overcome this divergence, I hypothesize that pro-social behavior is linked not to religiosity per se, but rather to the salience of religion and religious norms. I report a field experiment that examined when auction participants will respond to an appeal to continue bidding for secular charitable causes. Religious individuals are more likely than non-religious individuals to respond to an appeal “for charity” only on days that they visit their place of worship; on other days of the week, religiosity has no effect. Notably, the result persists after controlling for a host of factors that may influence bidding, but disappears when the appeal “for charity” is replaced by an appeal to bid for other (i.e., competitive) reasons. Implications for the link between religion and pro-social behavior are discussed.
Article
Full-text available
A single exposure to statements is typically enough to increase their perceived truth. This Truth-by-Repetition (TBR) effect has long been assumed to occur only with statements whose truth value is unknown to participants. Contrary to this hypothesis, recent research has found that statements contradicting participants' prior knowledge (as established from a first sample of participants) show a TBR effect following their repetition (in a second, independent sample of participants). So far, however, attempts to find a TBR effect for blatantly false (i.e., highly implausible) statements have failed. Here, we reasoned that highly implausible statements such as "Elephants run faster than cheetahs" may show repetition effects, provided a sensitive truth measure is used and statements are repeated more than just once. In a preregistered experiment, participants judged on a 100-point scale the truth of highly implausible statements that were either new to them or had been presented five times before judgment. We observed an effect of repetition: repeated statements were judged more true than new ones, although all statements were still judged below the scale midpoint. Exploratory analyses additionally show that about half the participants showed no or even a reversed effect of repetition. The results provide the first empirical evidence that repetition can increase perceived truth even for highly implausible statements, although not equally so for all participants and not to the point of making the statements look true.
Article
Full-text available
Across two studies with more than 1,700 U.S. adults recruited online, we present evidence that people share false claims about COVID-19 partly because they simply fail to think sufficiently about whether or not the content is accurate when deciding what to share. In Study 1, participants were far worse at discerning between true and false content when deciding what they would share on social media relative to when they were asked directly about accuracy. Furthermore, greater cognitive reflection and science knowledge were associated with stronger discernment. In Study 2, we found that a simple accuracy reminder at the beginning of the study (i.e., judging the accuracy of a non-COVID-19-related headline) nearly tripled the level of truth discernment in participants’ subsequent sharing intentions. Our results, which mirror those found previously for political fake news, suggest that nudging people to think about accuracy is a simple way to improve choices about what to share on social media.
Article
Full-text available
In an online experiment, participants who paused to explain why a headline was true or false indicated that they were less likely to share false information compared to control participants. Their intention to share accurate news stories was unchanged. These results indicate that adding “friction” (i.e., pausing to think) before sharing can improve the quality of information shared on social media.
Article
Full-text available
Axiomatic rationality is defined in terms of conformity to abstract axioms. Savage (The foundations of statistics, Wiley, New York, 1954) limited axiomatic rationality to small worlds (S, C), that is, situations in which the exhaustive and mutually exclusive set of future states S and their consequences C are known. Others have interpreted axiomatic rationality as a categorical norm for how human beings should reason, arguing in addition that violations would lead to real costs such as money pumps. Yet a review of the literature shows little evidence that violations are actually associated with any measurable costs. Limiting axiomatic rationality to small worlds, I propose a naturalized version of rationality for situations of intractability and uncertainty (as opposed to risk), all of which are not in (S, C). In these situations, humans can achieve their goals by relying on heuristics that may violate axiomatic rationality. The study of ecological rationality requires formal models of heuristics and an analysis of the structures of environments these can exploit. It lays the foundation of a moderate naturalism in epistemology, providing statements about heuristics we should use in a given situation. Unlike axiomatic rationality, ecological rationality can explain less-is-more effects (when using less information can be expected to generate more accurate predictions), formalize when one should move from ‘is’ to ‘ought,’ and be evaluated by goals beyond coherence, such as predictive accuracy, frugality, and efficiency. Ecological rationality can be seen as a formalization of means–end instrumentalist rationality, based on Herbert Simon’s insight that rational behavior is a function of the mind and its environment.
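
The call for formal models of heuristics can be illustrated with a minimal sketch of one well-known fast-and-frugal heuristic, take-the-best (a standard example from the ecological rationality literature, not code from this paper): cues are consulted in order of validity, and the first cue that discriminates decides, ignoring all remaining information. The city example and cue ordering below are assumptions for illustration.

```python
# Minimal illustrative sketch of the take-the-best heuristic (a standard
# fast-and-frugal heuristic; not code from the paper summarized above).
# Cues are checked in order of validity; the first cue that discriminates
# between the two options decides, and all remaining cues are ignored.

def take_the_best(option_a, option_b, cues):
    """Return 'A', 'B', or 'guess'. `cues` is ordered by cue validity; each
    cue maps an option (a dict of cue values) to 1, 0, or None (unknown)."""
    for cue in cues:
        a_val, b_val = cue(option_a), cue(option_b)
        if None not in (a_val, b_val) and a_val != b_val:
            return "A" if a_val > b_val else "B"
    return "guess"  # no cue discriminates, so guess

# Toy example: which of two (hypothetical) cities is larger?
city_a = {"is_capital": 0, "has_airport": 1}
city_b = {"is_capital": 1, "has_airport": 1}
cues = [
    lambda c: c.get("is_capital"),   # assumed to be the most valid cue
    lambda c: c.get("has_airport"),
]
print(take_the_best(city_a, city_b, cues))  # 'B': settled by the first cue alone
```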
Article
Full-text available
Dispositionalism about belief has had a recent resurgence. In this paper we critically evaluate a popular dispositionalist program pursued by Eric Schwitzgebel. Then we present an alternative: a psychofunctional, representational theory of belief. This theory of belief has two main pillars: that beliefs are relations to structured mental representations, and that the relations are determined by the generalizations under which beliefs are acquired, stored, and changed. We end by describing some of the generalizations regarding belief acquisition, storage, and change.
Article
Full-text available
While religiosity is positively correlated with self-reported prosociality, observational and experimental studies on the long-hypothesized connection between religion and prosocial behavior have yielded mixed results. Recent work highlights the role of religious salience for stimulating prosocial behavior, but much of this research has involved priming Christian subjects in laboratory settings, limiting generalization to the real world. Here I present a field study conducted in the souks in the medina of Marrakesh, Morocco, which shows that religious salience can increase prosocial behavior with Muslim subjects in a natural setting. In an economic decision making task similar to a dictator game, shopkeepers demonstrated increased prosocial behavior when the Islamic call to prayer was audible compared to when it was not audible. This finding complements a growing literature on the connection between cultural cues, religious practices, and prosocial behavior, and supports the hypothesis that religious rituals play a role in galvanizing prosocial behavior.
Article
Full-text available
It was hypothesized that the extent to which individuals' attitudes guide their subsequent perceptions of and behavior toward the attitude object is a function of the accessibility of those attitudes from memory. A field investigation concerning the 1984 presidential election was conducted as a test of these hypotheses. Attitudes toward each of the two candidates, Reagan and Mondale, and the accessibility of those attitudes, as indicated by the latency of response to the attitudinal inquiry, were measured for a large sample of townspeople months before the election. Judgments of the performance of the candidates during the televised debates served as the measure of subsequent perceptions, and voting served as the measure of subsequent behavior. As predicted, both the attitude-perception and the attitude-behavior relations were moderated by attitude accessibility. The implications of these findings for theoretical models of the processes by which attitudes guide behavior, along with their practical implications for survey research, are discussed.
Chapter
Beliefs play a central role in our lives. They lie at the heart of what makes us human, they shape the organization and functioning of our minds, they define the boundaries of our culture, and they guide our motivation and behavior. Given their central importance, researchers across a number of disciplines have studied beliefs, leading to results and literatures that do not always interact. The Cognitive Science of Belief aims to integrate these disconnected lines of research to start a broader dialogue on the nature, role, and consequences of beliefs. It tackles timeless questions, as well as applications of beliefs that speak to current social issues. This multidisciplinary approach to beliefs will benefit graduate students and researchers in cognitive science, psychology, philosophy, political science, economics, and religious studies.
Book
Beginning with its first edition and through subsequent editions, Thinking and Deciding has established itself as the required text and important reference work for students and scholars of human cognition and rationality. In this fourth edition, first published in 2007, Jonathan Baron retains the comprehensive attention to the key questions addressed in the previous editions - how should we think? What, if anything, keeps us from thinking that way? How can we improve our thinking and decision making? - and expands his treatment of topics such as risk, utilitarianism, Bayes' theorem, and moral thinking. With the student in mind, the fourth edition emphasises the development of an understanding of the fundamental concepts in judgement and decision making. This book is essential reading for students and scholars in judgement and decision making and related fields, including psychology, economics, law, medicine, and business.
Chapter
This interdisciplinary work is a collection of major essays on reasoning: deductive, inductive, abductive, belief revision, defeasible (non-monotonic), cross cultural, conversational, and argumentative. They are each oriented toward contemporary empirical studies. The book focuses on foundational issues, including paradoxes, fallacies, and debates about the nature of rationality, the traditional modes of reasoning, as well as counterfactual and causal reasoning. It also includes chapters on the interface between reasoning and other forms of thought. In general, this last set of essays represents growth points in reasoning research, drawing connections to pragmatics, cross-cultural studies, emotion and evolution.
Book
This book introduces readers to the fundamentals of Bayesian epistemology. It begins by motivating and explaining the idea of a degree of belief (also known as a “credence”). It then presents Bayesians’ five core normative rules governing degrees of belief: Kolmogorov’s three probability axioms, the Ratio Formula for conditional credences, and Conditionalization for updating credences over time. After considering a few proposed additions to these norms, it applies the core rules to confirmation and decision theory. The book then details arguments for the Bayesian rules based on representation theorems, Dutch Books, and accuracy measures. Finally, it looks at objections and challenges to Bayesian epistemology. It presents problems concerning memory loss, self-location, old evidence, logical omniscience, and the subjectivity of priors. It considers the rival statistical paradigms of frequentism and likelihoodism. Then it explores alternative Bayesian-style formalisms involving comparative confidence rankings, credence ranges, and Dempster-Shafer functions.
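
For readers who want the five core rules named above in symbols, a standard rendering follows (the credence notation cr is our shorthand for this sketch, not a quotation from the book):

```latex
% Standard statements of the core Bayesian norms listed above, written with
% cr as the agent's credence function (notation assumed for this sketch).
\begin{align*}
 &\text{Non-negativity:}      && cr(A) \ge 0 \\
 &\text{Normality:}           && cr(\top) = 1 \\
 &\text{Finite additivity:}   && cr(A \lor B) = cr(A) + cr(B)
      \quad \text{when } A \text{ and } B \text{ are mutually exclusive} \\
 &\text{Ratio Formula:}       && cr(A \mid B) = \frac{cr(A \land B)}{cr(B)}
      \quad \text{provided } cr(B) > 0 \\
 &\text{Conditionalization:}  && cr_{\mathrm{new}}(A) = cr_{\mathrm{old}}(A \mid E)
      \quad \text{after learning exactly } E
\end{align*}
```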
Book
Why do people come to reject climate science or the safety and efficacy of vaccines, in defiance of the scientific consensus? A popular view explains bad beliefs like these as resulting from a range of biases that together ensure that human beings fall short of being genuinely rational animals. This book presents an alternative account. It argues that bad beliefs arise from genuinely rational processes. We’ve missed the rationality of bad beliefs because we’ve failed to recognize the ubiquity of the higher-order evidence that shapes beliefs, and the rationality of being guided by this evidence. The book argues that attention to higher-order evidence should lead us to rethink both how minds are best changed and the ethics of changing them: we should come to see that nudging—at least usually—changes belief (and behavior) by presenting rational agents with genuine evidence, and is therefore fully respectful of intellectual agency. We needn’t rethink Enlightenment ideals of intellectual autonomy and rationality, but we should reshape them to take account of our deeply social epistemic agency.
Chapter
This collection of new essays discusses the hypothesis that the mind is fragmented, or compartmentalized. This Introduction explains what this hypothesis amounts to. It begins by outlining what different approaches to fragmentation have in common and what motivates them, contrasting fragmentation with a rival model called ‘unity.’ It then discusses the relationship between fragmentation and theses about cognitive architecture, introduces two classical theories of fragmentation, and sketches recent developments of the idea in the literature. Finally, as an overview of the volume, it presents some of the open questions about and issues with fragmentation that the contributions to this volume address.
Chapter
What Beliefs Are Made From explores the nature and purpose of belief. The book describes several strange beliefs that have been shared by many members of whole communities. The intellectualistic, dispositional, feeling and eliminativist theories of belief are then examined critically. This is followed by a review of factors that can influence people in their beliefs. These include faulty use of evidence, unconscious reasoning biases, inability to withhold judgement, wishful thinking, prior beliefs, shared beliefs, personal experience, testimony, judgements about the source of testimony, personality, in-group psychology, emotions and feelings, language, symbolism, non-verbal communication, repetition, propaganda, mysticism, rumour, conspiracy theories, and illness. The book also covers beliefs of children and belief during dreaming. The regulation of inquiry by belief and disbelief is described. What Beliefs Are Made From is a useful reference for general readers interested in the philosophy of the mind, and the psychology of belief.
Article
The empirical study of belief is emerging at a rapid clip, uniting work from all corners of cognitive science. Reliance on belief in understanding and predicting behavior is widespread. Examples can be found, inter alia, in the placebo, attribution theory, theory of mind, and comparative psychological literatures. Research on belief also provides evidence for robust generalizations, including about how we fix, store, and change our beliefs. Evidence supports the existence of a Spinozan system of belief fixation: one that is automatic and independent of belief rejection. Independent research supports the existence of a system of fragmented belief storage: one that relies on large numbers of causally isolated, context-sensitive stores of belief in memory. Finally, empirical and observational data support at least two systems of belief change. One system adheres, mostly, to epistemological norms of updating; the other, the psychological immune system, functions to guard our most centrally held beliefs from potential inconsistency with newly formed beliefs. Refining our understanding of these systems can shed light on pressing real-world issues, such as how fake news, propaganda, and brainwashing exploit our psychology of belief, and how best to construct our modern informational world.

This article is categorized under:
Psychology > Reasoning and Decision Making
Philosophy > Knowledge and Belief
Philosophy > Foundations of Cognitive Science
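
The fragmented-storage idea described above can be pictured with a small data-structure sketch (our illustration, not the authors' model): beliefs live in separate, context-indexed fragments, and a query consults only the fragment activated by the current context, so two fragments can hold mutually inconsistent beliefs without the conflict ever being detected. The class and example contents below are hypothetical.

```python
# Illustrative sketch of fragmented belief storage (an assumption for
# exposition, not the authors' implementation): beliefs are kept in
# causally isolated, context-indexed fragments, and retrieval consults
# only the fragment cued by the current context.
class FragmentedBeliefStore:
    def __init__(self):
        self.fragments = {}  # context label -> set of believed sentences

    def store(self, context, sentence):
        self.fragments.setdefault(context, set()).add(sentence)

    def believes(self, context, sentence):
        # Only the context-relevant fragment is queried; other fragments,
        # possibly holding contradictory beliefs, are never consulted.
        return sentence in self.fragments.get(context, set())

store = FragmentedBeliefStore()
store.store("quiz context", "Lake A is the largest lake")    # hypothetical contents
store.store("travel context", "Lake B is the largest lake")  # contradicts the above
print(store.believes("quiz context", "Lake A is the largest lake"))    # True
print(store.believes("travel context", "Lake A is the largest lake"))  # False
```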
Article
People are more inclined to believe that information is true if they have encountered it before. Little is known about whether this illusory truth effect is influenced by individual differences in cognition. In seven studies (combined N = 2,196), using both trivia statements (Studies 1-6) and partisan news headlines (Study 7), we investigate moderation by three factors that have been shown to play a critical role in epistemic processes: cognitive ability (Studies 1, 2, 5), need for cognitive closure (Study 1), and cognitive style, that is, reliance on intuitive versus analytic thinking (Studies 1, 3-7). All studies showed a significant illusory truth effect, but there was no evidence for moderation by any of the cognitive measures across studies. These results indicate that the illusory truth effect is robust to individual differences in cognitive ability, need for cognitive closure, and cognitive style.
Article
A Bayesian mind is, at its core, a rational mind. Bayesianism is thus well‐suited to predict and explain mental processes that best exemplify our ability to be rational. However, evidence from belief acquisition and change appears to show that we do not acquire and update information in a Bayesian way. Instead, the principles of belief acquisition and updating seem grounded in maintaining a psychological immune system rather than in approximating a Bayesian processor.
Article
Repeatedly and successfully, the celebrated Harvard philosopher Robert Nozick has reached out to a broad audience beyond the confines of his discipline, addressing ethical and social problems that matter to every thoughtful person. Here Nozick continues his search for the connections between philosophy and "ordinary" experience. In the lively and accessible style that his readers have come to expect, he offers a bold theory of rationality, the one characteristic deemed to fix humanity's "specialness." What are principles for? asks Nozick. We could act simply on whim, or maximize our self-interest and recommend that others do the same. As Nozick explores rationality of decision and rationality of belief, he shows how principles actually function in our day-to-day thinking and in our efforts to live peacefully and productively with one another. In Nozick's view, misconceptions of rationality have resulted in many intractable philosophical problems. For example, the Kantian attempt to make principled behavior the sole ultimate standard of conduct extends rationality beyond its bounds. In this provocative volume, Nozick reformulates current decision theory to include the symbolic meaning of actions in areas from controlling impulses to fighting society's war against drugs. The author proposes a new rule of rational decision, "maximizing decision-value," which is a weighted sum of causal, evidential, and symbolic utility. In a particularly fascinating section of the book he traces the implications of this rule for the famous Prisoner's Dilemma and for Newcomb's Problem. Rationality of belief, according to Nozick, involves two aspects: support by reasons that make the belief credible, and generation by a process that reliably produces true beliefs. A new evolutionary account explains how some factual connections are instilled in us as seemingly self-evident, thus reversing the direction of Kant's "Copernican Revolution."
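
A schematic rendering of the decision rule described above may help; the weights and symbols here are illustrative shorthand rather than Nozick's own notation:

```latex
% Decision-value as a weighted sum of causal (CU), evidential (EU), and
% symbolic (SU) utility, as described in the summary above; the weights w
% are illustrative placeholders.
DV(A) \;=\; w_{C}\,\mathrm{CU}(A) \;+\; w_{E}\,\mathrm{EU}(A) \;+\; w_{S}\,\mathrm{SU}(A)
```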
Article
This paper studies the role of consistency as a signaling device. We propose a two-period model that highlights the informativeness of consistency as a signal of skills and allows for the analysis of consequences for behavior. In a simple principal–agent experiment, we test the basic intuition of the model. We show that consistency is indeed associated with skills. Consequently, consistency is valued by others, inducing people to act consistently. Data, as supplemental material, are available at http://dx.doi.org/10.1287/mnsc.2016.2459. This paper was accepted by Uri Gneezy, behavioral economics.
Article
This paper reviews intellectualistic, dispositional, and feeling or occurrent theories of belief. The feeling theory is favored. The purpose of belief is to guide action, not to indicate truth. Decisions about actions often have to be made quickly in the absence of evidence. Belief gives speed and economy to inquiry and counterfactual thinking. The feeling theory explains this role of belief and suggests mechanisms for overconfidence of correctness, confirmation bias, wishful believing, vacillating belief, the difficulty with multifactorial reasoning, the inability to withhold judgment, the delusions of mental illness, and the relations between belief, opinion, and knowledge. The intellectualistic theory of belief fails because it gives undue weight to evidence as the most salient or available factor concerned with belief, which leads to the mistaken conclusion that the purpose of belief is to indicate truth.
Article
Contents:
Introduction
Part I: Theoretical Foundations
1. The Elements of Epistemology
2. Skepticism
3. Knowledge
4. Justification: A Rule Framework
5. Justification and Reliability
6. Problem Solving, Power, and Speed
7. Truth and Realism
8. The Problem of Content
Part II: Assessing Our Cognitive Resources
9. Perception
10. Memory
11. Constraints on Representation
12. Internal Codes
13. Deductive Reasoning
14. Probability Judgments
15. Acceptance and Uncertainty
16. Belief Updating
17. Production Systems and Second-Order Processes
Conclusion: Primary Epistemics and Cognitive Science
Notes
Illustration Credits
Author Index
Subject Index
Article
The question, "What is Cognitive Science?" is often asked but seldom answered to anyone's satisfaction. Until now, most of the answers have come from the new breed of philosophers of mind. This book, however, is written by a distinguished psychologist and computer scientist who is well-known for his work on the conceptual foundations of cognitive science, and especially for his research on mental imagery, representation, and perception. In Computation and Cognition, Pylyshyn argues that computation must not be viewed as just a convenient metaphor for mental activity, but as a literal empirical hypothesis. Such a view must face a number of serious challenges. For example, it must address the question of "strong equivalents" of processes, and must empirically distinguish between phenomena which reveal what knowledge the organism has, phenomena which reveal properties of the biologically determined "functional architecture" of the mind. The principles and ideas Pylyshyn develops are applied to a number of contentious areas of cognitive science, including theories of vision and mental imagery. In illuminating such timely theoretical problems, he draws on insights from psychology, theoretical computer science, artificial intelligence, and psychology of mind. A Bradford Book