
Gordon Pennycook
Cornell University · Department of Psychology
B.A., M.A., PhD
About
205 Publications · 292,450 Reads
21,978 Citations (since 2017)
Additional affiliations
July 2018 - present
September 2016 - July 2018
September 2010 - August 2016
Publications (205)
One widely used approach for quantifying misinformation consumption and sharing is to evaluate the quality of the news domains a user interacts with. However, different media organizations and fact-checkers have produced different sets of news domain quality ratings, raising questions about the reliability of these quality ratings. Here, we compare...
Recent work suggests that personality moderates the relationship between political ideology and the sharing of misinformation. Specifically, Lawson and Kakkar (2022) claimed that fake news sharing was driven mostly by low conscientiousness conservatives. We reanalyzed their data and conducted five new preregistered conceptual replications to reexam...
Identifying successful approaches for reducing the belief and spread of online misinformation is of great importance. Social media companies currently rely largely on professional fact-checking as their primary mechanism for identifying falsehoods. However, professional fact-checking has notable limitations regarding coverage and speed. In this art...
Intellectual humility (IH) is commonly defined as recognizing the limits of one’s knowledge and abilities. However, most research has relied entirely on self-report measures of IH, without testing whether these instruments capture the metacognitive core of the construct. Across two studies (Ns = 898; 914), using generalized additive mixed models to...
De Neys proposes that deliberation is triggered and sustained by uncertainty. I argue that there are cases where deliberation occurs with low uncertainty - such as when problems are excessively complicated and the reasoner decides against engaging in deliberation - and that there are likely multiple factors that lead to (or undermine) deliberation....
Fake news emerged as an apparent global problem during the 2016 U.S. Presidential election. Addressing it requires a multidisciplinary effort to define the nature and extent of the problem, detect fake news in real time, and mitigate its potentially harmful effects. This will require a better understanding of how the Internet spreads content, how p...
The spread of misinformation online is a global problem that requires global solutions. To that end, we conducted an experiment in 16 countries across 6 continents (N = 34,286; 676,605 observations) to investigate predictors of susceptibility to misinformation about COVID-19, and interventions to combat the spread of this misinformation. In every c...
Recent work suggests that personality moderates the relationship between political ideology and the sharing of misinformation. Specifically, Lawson and Kakkar (2021) claimed that fake news sharing was driven mostly by low conscientiousness conservatives. We re-analyzed their data and conducted five new preregistered conceptual replications to reexa...
Why is disbelief in anthropogenic climate change common despite broad scientific consensus to the contrary? A widely held explanation involves politically motivated (system 2) reasoning: Rather than helping uncover the truth, people use their reasoning abilities to protect their partisan identities and reject beliefs that threaten those identities....
Professional fact-checking of individual news headlines is an effective way to fight misinformation, but it is not easily scalable, because it cannot keep pace with the massive speed at which news content gets posted on social media. Here we provide evidence for the effectiveness of ratings of news sources, instead of individual news articles. In a...
Content moderators regularly review problematic content for technology companies, and repeated exposure to claims could cause moderators to come to believe the very claims they are supposed to moderate. However, the nature of their jobs might buffer against this repetition-induced truth effect. We conducted a field experiment with TaskUs, a global...
There is widespread concern about misinformation circulating on social media. In particular, many argue that the context of social media itself may make people susceptible to the influence of false claims. Here, we test that claim by asking whether simply considering sharing news on social media reduces the extent to which people discriminate truth...
Many measures have been developed to index intuitive versus analytic thinking. Yet it remains an open question whether people primarily vary along a single dimension or if there are genuinely different types of thinking styles. We distinguish between four distinct types of thinking styles: Actively Open-minded Thinking, Close-Minded Thinking, Prefe...
Recent experiments have found that prompting people to think about accuracy reduces misinformation sharing intentions. The process by which this effect operates, however, remains unclear. Do accuracy prompts cause people to “stop and think,” increasing deliberation? Or do they change what people think about, drawing attention to accuracy? Since the...
Science often advances through disagreement among scientists and the studies they produce. For members of the public, however, conflicting results from scientific studies may trigger a sense of uncertainty that in turn leads to a feeling that nothing new has been learned from those studies. In several scenario studies, participants read about pairs...
Does one’s stance toward evidence evaluation and belief revision have relevance for actual beliefs? We investigate the role of endorsing an actively open-minded thinking style about evidence (AOT-E) on a wide range of beliefs, values, and opinions. Participants indicated the extent to which they think beliefs (Study 1) or opinions (Studies 2 and 3)...
There is a pressing need to understand belief in false conspiracies. Past work has focused on the needs and motivations of conspiracy believers, as well as the role of overreliance on intuition. Here, we propose an alternative driver of belief in conspiracies: overconfidence. Across eight studies with 4,181 U.S. adults, conspiracy believers not onl...
What are the underlying cognitive mechanisms that support belief in conspiracies? Common dual-process perspectives suggest that deliberation helps people make more accurate decisions and decreases belief in conspiracy theories that have been proven wrong (therefore, bringing people closer to objective accuracy). However, evidence for this stance is...
Identifying successful approaches for combating the belief in and spread of online misinformation is of great importance to researchers and policy makers. Social media companies currently rely largely on professional fact-checking as their primary mechanism for identifying falsehoods. However, professional fact-checking has notable limitations rega...
Social and behavioral science research proliferated during the COVID-19 pandemic, reflecting the substantial increase in the influence of behavioral science in public health and public policy more broadly. This review presents a comprehensive assessment of 742 scientific articles on human behavior during COVID-19. Two independent teams evaluated 19 sub...
Recent years have seen a proliferation of experiments seeking to combat misinformation. Yet there has been little consistency across studies in how the effectiveness of interventions is evaluated, which undermines the field's ability to identify efficacious strategies. We provide a framework for differentiating between common research designs on th...
Modern computational systems have an unprecedented ability to detect, leverage and influence human attention. Prior work identified user engagement and dwell time as two key metrics of attention in digital environments, but these metrics have yet to be integrated into a unified model that can advance the theory and practice of digital attention. We...
Some theoretical models assume that a primary source of contention surrounding science belief is political and that partisan disagreement drives beliefs; other models focus on basic science knowledge and cognitive sophistication, arguing that they facilitate proscientific beliefs. To test these competing models, we identified a range of controversi...
Unlike traditional media, social media typically provides quantified metrics of how many users have engaged with each piece of content. Some have argued that the presence of these cues promotes the spread of misinformation. Here we investigate the causal effect of social cues on users' engagement with social media posts. We conducted an experiment...
Conspiracy theories tend to involve doubt and skepticism, but are conspiracy believers really more deliberative? We review recent research that investigates the relative roles of intuition and reason in conspiracy belief and find that the preponderance of evidence indicates that conspiracy belief is linked to an overreliance on intuition and a lack...
Artificial Intelligence (AI) can generate text virtually indistinguishable from text written by humans. A key question, then, is whether people believe news headlines generated by AI as much as news headlines generated by humans. AI is viewed as lacking human motives and emotions, suggesting that people might view news written by AI as more accur...
Pro‐Kremlin disinformation campaigns have long targeted Ukraine. We investigate susceptibility to this pro‐Kremlin disinformation from a cognitive‐science perspective. Is greater analytic thinking associated with less belief in disinformation, as per classical theories of reasoning? Or does analytic thinking amplify motivated system 2 reasoning (or...
Interventions that shift users' attention toward the concept of accuracy represent a promising approach for reducing misinformation sharing online. We assess the replicability and generalizability of this accuracy prompt effect by meta-analyzing 20 experiments (with a total N = 26,863) completed by our group between 2017 and 2020. This internal meta...
The spread of misinformation has become a central concern in American politics. Recent studies of social media sharing suggest that Republicans are considerably more likely to share fake news than Democrats. However, such inferences are confounded by the far greater supply of right-leaning fake news—Republicans may indeed be more prone to sharing f...
A meaningful portion of online misinformation sharing is likely attributable to Internet users failing to consider accuracy when deciding what to share. As a result, simply redirecting attention to the concept of accuracy can increase sharing discernment. Here we discuss the importance of accuracy and describe a limited-attention utility model that...
The spread of misinformation online is a global problem that requires global solutions. To that end, we conducted an experiment in 16 countries across 6 continents (N = 33,480) to investigate predictors of susceptibility to misinformation and interventions to combat misinformation. In every country, participants with a more analytic cognitive style...
Recent experiments have found that prompting people to think about accuracy reduces the sharing of misinformation. The cognitive mechanism that produces this effect, however, remains unclear. Do accuracy prompts cause people to "stop and think," increasing deliberation? Or do they change what people think about, drawing attention to accuracy? Since...
Online behavioral data, such as digital traces from social media, have the potential to allow researchers an unprecedented new window into human behavior in ecologically valid everyday contexts. However, research using such data is often purely observational, which limits its usefulness for identifying causal relationships. Here we review recent in...
There is widespread concern about fake news and other misinformation circulating on social media. In particular, many argue that the context of social media itself may make people particularly susceptible to the influence of false claims. Here, we test that claim by asking whether simply considering whether to share news on social media reduces peo...
Professional fact-checking, a prominent approach to combating misinformation, does not scale easily. Furthermore, some distrust fact-checkers because of alleged liberal bias. We explore a solution to these problems: using politically balanced groups of laypeople to identify misinformation at scale. Examining 207 news articles flagged for fact-check...
Artificial Intelligence (AI) can generate text virtually indistinguishable from text written by humans. A key question, then, is whether people believe news generated by AI as much as news generated by humans. AI is viewed as lacking human motives and emotions, suggesting that people might view news written by AI as more accurate. By contrast, two...
Recent work indicates that a meaningful portion of online misinformation sharing can be attributed to users merely failing to consider accuracy when deciding what to share. As a result, simply redirecting attention to the concept of accuracy can increase sharing discernment. Here we discuss the relevance of accuracy, and outline a limited-attention...
Coincident with the global rise in concern about the spread of misinformation on social media, there has been an influx of behavioral research on so-called “fake news” (fabricated or false news headlines that are presented as if legitimate) and other forms of misinformation. These studies often present participants with news content that varies on rel...
With the rise of social media, everyone has the potential to be both a consumer and producer of online content. As a result, the role that word of mouth plays in news consumption has dramatically increased. Although one might assume that consumers share news because they believe it to be true, widespread concerns about the spread of misinforma...
Ukraine has been the target of a long-running Russian disinformation campaign. We investigate susceptibility to this pro-Kremlin disinformation from a cognitive science perspective. Is greater analytic thinking associated with less belief in disinformation, as per classical theories of reasoning? Or does analytic thinking amplify motivated reasonin...
What are the psychological consequences of the increasingly politicized nature of the COVID-19 pandemic in the United States relative to similar Western countries? In a two-wave study completed early (March) and later (December) in the pandemic, we found that polarization was greater in the United States (N = 1,339) than in Canada (N = 644) and t...
Online behavioral data, such as digital traces from social media, have the potential to allow researchers an unprecedented new window into human behavior in ecologically valid everyday contexts. However, research using such data is often purely observational, limiting its ability to identify causal relationships. Here we review recent innovations i...
Simply failing to consider accuracy when deciding what to share on social media has been shown to play an important role in the spread of online misinformation. Interventions that shift users’ attention towards the concept of accuracy – accuracy prompts or nudges – are therefore a promising approach to improve the quality of news content that users...
A major focus of current research is understanding why people fall for and share fake news on social media. While much research focuses on understanding the role of personality-level traits for those who share the news, such as partisanship and analytic thinking, characteristics of the articles themselves have not been studied. Across two pre-regis...
What are the psychological consequences of the increasingly politicized nature of the COVID-19 pandemic in the United States relative to similar Western countries? In a two-wave study completed early (March) and later (December) in the pandemic, we found that polarization was greater in the U.S. (N=1,339) than in Canada (N=644) and the U.K. (N=1,28...
In recent years, there has been a great deal of concern about the proliferation of false and misleading news on social media [1-4]. Academics and practitioners alike have asked why people share such misinformation, and sought solutions to reduce the sharing of misinformation [5-7]. Here, we attempt to address both of these questions. First, we find that...
When users on social media share content without considering its veracity, they may unwittingly be spreading misinformation. In this work, we investigate the design of lightweight interventions that nudge users to assess the accuracy of information as they share it. Such assessment may deter users from posting misinformation in the first place, and...
COVID-19 vaccine hesitancy among Black Americans threatens to further magnify racial inequities in COVID-19 related health outcomes that emerged in the earliest stages of the pandemic. Here we shed new light on attitudes towards COVID-19 vaccines by considering intragroup variation. Rather than analyzing Blacks as a homogenous group, we examine the...
A common claim is that people vary not just in what they think, but how they think. In fact, there are a large number of scales that have been developed to ostensibly measure thinking styles. These measures share a lot of conceptual overlap and, in particular, most purport to index some aspect of the disposition to think more analytically and effor...
Recent research suggests that shifting users’ attention to accuracy increases the quality of news they subsequently share online. Here we help develop this initial observation into a suite of deployable interventions for practitioners. We ask (i) how prior results generalize to other approaches for prompting users to consider accuracy, and (ii) for...
Objective: Health misinformation on social media threatens public health. One question that could lend insight into how and through whom misinformation spreads is whether certain people are susceptible to many types of health misinformation, regardless of the health topic at hand. This study provided an initial answer to this question and also test...
We synthesize a burgeoning literature investigating why people believe and share false or highly misleading news online. Contrary to a common narrative whereby politics drives susceptibility to fake news, people are ‘better’ at discerning truth from falsehood (despite greater overall belief) when evaluating politically concordant news. Instead, poo...
We investigate the relationship between individual differences in cognitive reflection and behavior on the social media platform Twitter, using a convenience sample of N = 1,901 individuals from Prolific. We find that people who score higher on the Cognitive Reflection Test—a widely used measure of reflective thinking—were more discerning in their...
Countering misinformation can reduce belief in the moment, but corrective messages quickly fade from memory. We tested whether the longer-term impact of fact-checks depends on when people receive them. In two experiments (total N = 2,683), participants read true and false headlines taken from social media. In the treatment conditions, “true” and “f...
The 2020 U.S. Presidential Election saw an unprecedented number of false claims alleging election fraud and arguing that Donald Trump was the actual winner of the election. Here we report a survey exploring belief in these false claims that was conducted three days after Biden was declared the winner. We find that a majority of Trump voters in our...
Why is disbelief in anthropogenic climate change widespread despite broad scientific consensus to the contrary? A widely-held explanation involves politically motivated (“system 2”) reasoning: Rather than helping uncover truth, people use their reasoning abilities to protect their partisan identities and reject beliefs that threaten those identitie...
Social media platforms rarely provide data to misinformation researchers. This is problematic as platforms play a major role in the diffusion and amplification of mis- and disinformation narratives. Scientists are often left working with partial or biased data and must rush to archive relevant data as soon as it appears on the platforms, before it...
We synthesize a burgeoning literature investigating why people believe and share “fake news” and other misinformation online. Surprisingly, the evidence contradicts a common narrative whereby partisanship and politically motivated reasoning explain failures to discern truth from falsehood. Instead, poor truth discernment is linked to a lack of care...
Analytic and intuitive reasoning processes have been implicated as important determinants of belief in (or skepticism of) fake news. However, the underlying cognitive mechanisms that encourage endorsement of fake news remain unclear. The present study investigated cognitive decoupling/response inhibition and the potential role of conflict processin...
Objective: Health misinformation on social media threatens public health. One question that could lend insight into how and through whom misinformation spreads is whether certain people are generally susceptible to health misinformation regardless of the health topic at hand. This study provided an initial answer to this question and also tested fo...
A surprising finding from U.S. opinion surveys is that political disagreements tend to be greatest among the most cognitively sophisticated opposing partisans. Recent experiments suggest a hypothesis that could explain this pattern: cognitive sophistication magnifies politically biased processing of new information. However, the designs of these ex...
Partisan disagreement over policy-relevant facts is a salient feature of contemporary American politics. Perhaps surprisingly, such disagreements are often the greatest among opposing partisans who are the most cognitively sophisticated. A prominent hypothesis for this phenomenon is that cognitive sophistication magnifies politically motivated reas...
What is the role of emotion in susceptibility to believing fake news? Prior work on the psychology of misinformation has focused primarily on the extent to which reason and deliberation hinder versus help the formation of accurate beliefs. Several studies have suggested that people who engage in more reasoning are less likely to fall for fake news....