Article · Literature Review

A Short Review on Susceptibility to Falling for Fake Political News


Abstract

This review discusses recent findings on individuals’ susceptibility to falling for fake news in the political context. With respect to political attitudes and analytical thinking, we find that individuals tend to overrate the accuracy of true and fake political news items that are consistent with their own political attitudes. This tendency, however, cannot be explained by motivated reasoning. This is supported by findings showing that analytical thinking is negatively related to susceptibility to falling for fake news, regardless of whether the news items are consistent or inconsistent with one’s political attitudes. We suggest that future work should aim at i) examining how, for example, news consumption habits relate to susceptibility to falling for fake news and ii) implementing other, more externally valid fake news tests.


... However, other characteristics, such as personality traits, have been found to be relevant. Still, more work is required to establish the interactions between different individual-level factors (see Sindermann et al., 2020). ...
... Decades of literature on personality and human behavior (Ajzen, 2005; Argyle & Little, 1972; Blickle, 1996), and more recent literature on misinformation engagement, suggest that personality traits may help explain differences in perceived accuracy of misinformation. In considering individuals' susceptibility to misinformation, scholars have argued that some traits like openness to experience and conscientiousness could lower the risk of believing misinformation (Sindermann et al., 2020). Extraversion and neuroticism, two traits associated with heightened emotional arousal, are positively associated with believing misinformation (Lai et al., 2020). ...
... In summary, our findings on sharing are mostly consistent with previous literature. Scholars have argued that conscientiousness would provide a buffer against misinformation (Sindermann et al., 2020). The findings also concur with scholarship suggesting that those with high cognitive ability are better at misinformation discernment and are also less likely to share misinformation on social media (Ahmed, 2021a; Ahmed, 2022). ...
Article
This study examines whether personality traits predict an individual's perceived accuracy and sharing intention of political misinformation and if cognitive ability further moderates this relationship. An analysis of survey data from the US revealed that individuals with high agreeableness and low extraversion were more likely to discern pro-conservative misinformation correctly. Individuals with high agreeableness and conscientiousness were also less likely to share partisan misinformation. Further, individuals with high cognitive ability were more likely to discern correctly and less likely to share misinformation across the partisan spectrum. Moderation analyses suggest that low cognitive individuals with higher neuroticism and openness are more prone to sharing partisan misinformation. However, high levels of agreeableness can safeguard low cognitive individuals from sharing. We discuss the implications of this work for theorizing misinformation engagement and its effects.
... During election cycles, the ability to judge the veracity of information is particularly important since the amount of misinformation and disinformation which crosses social media accounts increases (Allcott & Gentzkow, 2017). In the last few years, work has begun to diagnose the individual risk factors associated with inaccurately assessing the veracity of political information (Pennycook & Rand, 2021;Scheufele & Krause, 2019;Sindermann et al., 2020). Reduced analytical thinking is associated with difficulty discerning the accuracy of political information (Bronstein et al., 2019;Sindermann et al., 2020), but increasing one's time to reflect on political information has been shown to increase one's ability to discern (Bago et al., 2020). ...
... In the last few years, work has begun to diagnose the individual risk factors associated with inaccurately assessing the veracity of political information (Pennycook & Rand, 2021;Scheufele & Krause, 2019;Sindermann et al., 2020). Reduced analytical thinking is associated with difficulty discerning the accuracy of political information (Bronstein et al., 2019;Sindermann et al., 2020), but increasing one's time to reflect on political information has been shown to increase one's ability to discern (Bago et al., 2020). Age is associated with an increased likelihood to come in contact with and share false information (Brashier & Schacter, 2020;Grinberg et al., 2019). ...
... It is possible that these biases in information processing, memory, and attention may affect the evaluation of political information. This is particularly important given that reduced cognitive reflection and analytical thinking are already linked to poor truth discernment (Bronstein et al., 2019;Sindermann et al., 2020). The causes of reduced cognitive reflection and analytical thinking may be (at least partially) linked to the informational and attentional biases associated with poor well-being. ...
Preprint
Full-text available
In the last few years, work has begun to diagnose the individual risk factors associated with believing political misinformation. However, little is known about whether individual differences in interpersonal behaviors and well-being (broadly defined) are associated with biases in judging the veracity of political information. The goal of this work was two-fold. First, it tested whether interpersonal (e.g. prosociality and need to belong), affective (e.g. anxiety and happiness), eudaimonic well-being (e.g. autonomy and one’s sense of purpose in life), and mental health (depressive symptoms) factors were associated with the ability to judge the veracity of true and false political statements. Prosociality, high negative affect, and poor eudaimonic well-being were all highly associated with the tendency to believe that most headlines were true. However, low prosociality, low negative affect, and high eudaimonic well-being were associated with assessing news with a partisan bias. Second, given that several of the psychological factors covaried, out-of-sample validation was used to understand the combination of psychological factors which best predicted truth discernment. Political and demographic factors known to predict accuracy were considered in tandem. By including measures of a.) interpersonal behaviors, b.) hedonic/affective well-being, and c.) eudaimonic well-being in a model, more than 50% of the variance was explained for both true and false statements. The best out-of-sample validated models did not include some factors previously found to predict accuracy (such as conservative ideology). This work demonstrates the importance of mental health factors and interpersonal forces when individuals attempt to navigate the political landscape.
... Studies have identified several individual differences in the ability to discern true and false headlines (Bronstein et al., 2019). While reviewing the literature that examines an individual's susceptibility to fake political news, Sindermann, Cooper, and Montag (2020) noted a paucity of research on the relationship between personality traits and susceptibility to fake news, and suggested that researchers examine the relationship between an individual's news consumption and susceptibility to fake news. The present study follows this recommendation and examined the relationship between personality factors, news consumption, and perceived accuracy of true and false news headlines. ...
... Thus, this study seeks to build upon the literature on the relationship between personality factors and an individual's susceptibility to misinformation. Sindermann, Cooper, and Montag (2020) suggested that conscientiousness and openness to experience should be correlated with lower susceptibility to fake news. Yet, few studies have directly examined how personality relates to fake news. ...
... The present study tested Sindermann, Cooper, and Montag's (2020) predictions about the relationships between personality factors, news consumption, and news discernment. First, we predicted that extraversion, negative emotionality, and agreeableness would negatively correlate with news discernment, whereas conscientiousness and openmindedness would positively correlate with news discernment. ...
Article
Full-text available
The existence of fake news on social media is likely to influence important issues such as elections, attitudes toward public policy, and health care decisions. Studies have shown that individual differences predict participants' ability to discern real and fake news. The present study examined whether personality factors and news consumption predict an individual's political news discernment. Participants (N = 353) judged the accuracy of true and false political news headlines, completed a personality inventory, and reported how many hours they obtained political news from various sources in a typical week. Regression analyses revealed that greater levels of agreeableness, conscientiousness, open-mindedness, lower levels of extraversion, and fewer hours of news consumption were related to better news discernment. Participants also showed a bias toward headlines consistent with their self-reported political ideology, and this bias was related to consumption of ideologically biased news sources. These results extend those that have identified individual differences in news discernment, demonstrating that personality factors and news consumption are related to the ability to discern between true and false political news.
... Other studies have already reported that personality can influence judgment and decision-making in a variety of situations (Byrne et al. 2015). While Szebeni et al. (2021) have shown that a propensity for a conspiratorial mindset can make people more vulnerable to misinformation, other studies (Wolverton and Stevens 2019; Sindermann et al. 2020; Calvillo et al. 2021a) seem to agree that extraversion is related to belief in fake news. While Talwar et al. (2019) explain that sharing and engaging with fake news can be associated with the social anxiety of people feeling connected and included (known as Fear of Missing Out, FOMO), Szebeni et al. (2021) revealed that the conspiratorial mindset is a strong predictor of belief, surpassing political or ideological motivation. ...
... In addition, conservative and right-wing people have also been associated with a more intuitive cognitive style in the way they process information (Deppe et al. 2015). This cognitive style is positively correlated with the belief and dissemination of fake news Sindermann et al. 2020). ...
Article
Full-text available
Political fake news continues to be a threat to contemporary societies, negatively affecting public and democratic institutions. The literature has identified political bias as one of the main predictors of belief and spread of fake news. However, the academic debate has not been consensual regarding the effect of political identity on the discernment of fake news. This systematic literature review (2017–2021) seeks to understand whether there is consistent evidence that one political identity may be more vulnerable to fake news than others. Focusing the analysis on European and North American (United States) studies, we used Scopus and Web of Science databases to examine the literature. Our findings revealed that most studies are consistent in identifying the conservative or right-wing audience as more vulnerable to fake news. Although there seems to be a motivated political reasoning for both sides, left-wing people or liberals were not, in any analyzed study, associated with a greater propensity to believe in political fake news. Motivated reasoning seems stronger and more active among conservatives, both in the United States and Europe. Our study reinforces the need to intensify the fight against the proliferation of fake news among the most conservative, populist, and radical right audience.
... Earlier research also found that performance on the cognitive reflection test (CRT; Frederick, 2005; Toplak et al., 2014) and the pseudo-profound bullshit receptivity scale (BSR; Pennycook et al., 2015) related to people's detection of fake news (…, 2021; Sindermann et al., 2020). We expected to replicate these relationships. ...
... Cognitive reflection emerged as an important predictor of fake news detection, consistent with previous research (Sindermann et al., 2020). For further exploration, we related mindsets, gender, BSR and cheating (as indicated by the number of times participants changed browser tabs) to CRT performance. ...
... Fact-checking agencies recommend several processing steps, such as checking whether the style is manipulative and mixes facts and opinions, checking the (original) source, or whether similar reports on the topic can also be found on other platforms [Correctiv Recherchen für die Gesellschaft 2022]. When it comes to psychological processes, researchers have looked at cognitive and motivational characteristics that correlate with fake news susceptibility [Sindermann et al. 2020]. Fake news susceptibility has been interpreted as a form of motivated reasoning, i.e., believing what is consistent with one's prior beliefs or political ideology. ...
... Fake news susceptibility has been interpreted as a form of motivated reasoning, i.e., believing what is consistent with one's prior beliefs or political ideology. It has been found repeatedly that people are more likely to believe fake news in alignment with their political orientation [Allcott and Gentzkow 2017; Sindermann et al. 2020; Van Bavel and Pereira 2018]. Republicans, people who lean more towards the right, or who are more conservative also perform worse in detecting fake news [Allcott and Gentzkow 2017; Calvillo et al. 2020]. ...
... The adverse effects of online misinformation have prompted researchers to investigate the interaction between humans and technology regarding what may explain higher susceptibility to fake news (e.g., Bryanov and Vziatysheva 2021;Sindermann et al. 2020). When summarizing the findings of scholarly articles on the topic, Bryanov and Vziatysheva (2021) identify three broad categories of determinants; namely, message characteristics, individual factors, and accuracy-promoting interventions. ...
... When summarizing the findings of scholarly articles on the topic, Bryanov and Vziatysheva (2021) identify three broad categories of determinants; namely, message characteristics, individual factors, and accuracy-promoting interventions. Several researchers have examined the importance of belief consistency and confirmation bias (Kim and Dennis 2019;Sindermann et al. 2020;Calvillo et al. 2021;Bringula et al. 2021), referring to the tendency of people to be more susceptible to fake news that aligns with pre-existing values, beliefs, or political views. Second, individual factors, including cognitive modes, predispositions, and news and information literacy differences may determine individual susceptibility to fake news. ...
Article
In late 2019 about a dozen BISE chairs from the German-speaking community met around ICIS to discuss the ethical challenges arising from the current construction, deployment, and marketing of Information Systems (IS). It turned out that many were and are concerned about the negative implications of IS while at the same time being convinced that digitization also supports society for the better. The questions at hand are what the BISE community is contributing in terms of solutions to the societal challenges caused by IS, how it should handle politically and socially ambiguous developments (i.e., when teaching students), and what kind of relevant research questions should be addressed.
... When talking about social media, demographics are often taken into consideration. For instance, age and level of education may affect the extent to which one relies on social media to access news (Sindermann et al., 2020). Although more factors might play a role, we focus on age and level of education, as prior literature gives the most reason to believe these may play a role. ...
... This suggests that the expected positive relationship between the training protocol and one's ability to detect fake news might be stronger for those with lower levels of education. It has also been suggested that age negatively affects one's ability to detect fake news, suggesting the older one gets the less likely they are to believe fake news (Sindermann et al., 2020). This has been recognized in other research as well, in which news media literacy among young people seems limited (Loos & Nijenhuis, 2020). ...
Article
Full-text available
We explore whether training protocols can enhance the ability of social media users to detect fake news, by conducting an online experiment ( N = 417) to analyse the effect of such a training protocol, while considering the role of scepticism, age, and level of education. Our findings show a significant relationship between the training protocol and the ability of social media users to detect fake news, suggesting that the protocol can play a positive role in training social media users to recognize fake news. Moreover, we find a direct positive relationship between age and level of education on the one hand and ability to detect fake news on the other, which has implications for future research. We demonstrate the potential of training protocols in countering the effects of fake news, as a scalable solution that empowers users and addresses concerns about the time-consuming nature of fact-checking.
... Starting from the widespread use of the internet, and especially with the worldwide diffusion of social media, like Facebook, concerns about the effect of recommendation systems started to rise [1,2]. ...
... We repeated the evolution process with the same initial factors for R replicas, but each time with a different random sequence of users emitting messages. We then measured the distance D_{1r} between replica 1 and replica r by averaging the distance of the evolved users in replica r, U_{ik}^{(r)}, from the same users in the first replica, U_{ik}^{(1)}, and averaging over all replicas, ...
Article
Full-text available
We investigate the problem of the formation of communities of users that selectively exchange messages among them in a simulated environment. This closed community can be seen as the prototype of the bubble effect, i.e., the isolation of individuals from other communities. We develop a computational model of a society, where each individual is represented as a simple neural network (a perceptron), under the influence of a recommendation system that honestly forward messages (posts) to other individuals that in the past appreciated previous messages from the sender, i.e., that showed a certain degree of affinity. This dynamical affinity database determines the interaction network. We start from a set of individuals with random preferences (factors), so that at the beginning, there is no community structure at all. We show that the simple effect of the recommendation system is not sufficient to induce the isolation of communities, even when the database of user–user affinity is based on a small sample of initial messages, subject to small-sampling fluctuations. On the contrary, when the simulated individuals evolve their internal factors accordingly with the received messages, communities can emerge. This emergence is stronger the slower the evolution of individuals, while immediate convergence favors to the breakdown of the system in smaller communities. In any case, the final communities are strongly dependent on the sequence of messages, since one can get different final communities starting from the same initial distribution of users’ factors, changing only the order of users emitting messages. In other words, the main outcome of our investigation is that the bubble formation depends on users’ evolution and is strongly dependent on early interactions.
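The replica-distance measure D_{1r} mentioned in the citing excerpt above (distance between replica 1 and replica r, averaged over users and then over replicas) can be sketched as follows. This is a minimal NumPy sketch with hypothetical array names and sizes; the source does not publish its implementation:

```python
import numpy as np

# Hypothetical layout: U[r, i, k] holds factor k of user i after
# evolution in replica r (R replicas, N users, K factors per user).
rng = np.random.default_rng(0)
R, N, K = 5, 100, 8
U = rng.normal(size=(R, N, K))

def replica_distance(U):
    """Mean distance of each replica r > 1 from replica 1 (index 0),
    averaged first over users, then over all replicas."""
    # Euclidean distance per user between replica r and replica 1
    per_user = np.linalg.norm(U[1:] - U[0], axis=2)  # shape (R-1, N)
    D_1r = per_user.mean(axis=1)                     # average over users
    return D_1r.mean()                               # average over replicas

print(replica_distance(U))
```

A small average distance across replicas would indicate that the final community structure is robust to the order in which users emit messages; the study's point is that it is not.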
... Fake news comprises inaccurate information created in a format similar to that of traditional news media, but without the processes used to confirm accuracy and credibility, as it is not structured according to the editorial standards of traditional media. Fake news may originally have been intended to deceive, although this is not always the case (Lazer et al., 2018; Sindermann, Cooper, & Montag, 2020). ...
... In this sense, even in the face of real-world evidence, people can rely on misinformation corresponding to variables that make up their worldviews, resisting misinformation corrections, i.e., a person's preexisting attitudes can determine, at a certain level, his/her beliefs concerning misinformation (Lewandowsky et al., 2012). As such, people tend to overestimate the accuracy of false and true political news that corroborate their own political attitudes (Sindermann et al., 2020). ...
Article
Full-text available
Coping with the COVID-19 pandemic is a global challenge, and social isolation is one of the main strategies for preventing contagion. In Brazil, however, discussions concerning isolation have been inserted in the political arena of an already polarized context. In this context, this study sought to investigate the role played by morality and Fake News in the relationship between political orientation and attitude towards social isolation. A total of 147 people participated in the survey, indicating their political orientation and responding to social isolation measures, Fake News and morality. The results indicate that political orientation directly influences social isolation, regardless of belief in Fake News and morality indices. It is concluded that the country's political polarization seems to be becoming a public health problem during the pandemic.
... For example, RWA has been associated with preference for right-leaning parties (Beierlein et al., 2014;Sindermann et al., 2020b), or negative attitudes/prejudice towards immigrants in German samples (Beierlein et al., 2014). In addition, previous studies suggest that individuals tend to overrate the accuracy of news items fitting with their political opinions but underrate the accuracy of non-fitting news items (Allcott and Gentzkow, 2017;Anthony and Moulding, 2019;Bago et al., 2020;Sindermann et al., 2020a); however, this does not seem to be due to motivated reasoning (Bago et al., 2020;Pennycook and Rand, 2019;Sindermann et al., 2020a). As the focus of the present study was not on political news headlines with, for example, left-versus right-leaning content, it is not surprising that neither RWA nor SDO were strongly associated with any of the variables derived from the present Fake and True News Test. ...
Article
Full-text available
Individual differences in cognitive abilities and personality help to understand individual differences in various human behaviors. Previous work investigated individual characteristics in light of believing (i.e., misclassifying) fake news. However, only little is known about the misclassification of true news as fake, although it appears equally important to correctly identify fake and true news for unbiased belief formation. An online study with N = 530 (n = 396 men) participants was conducted to investigate performance in a Fake and True News Test in association with i) performance in fluid and crystallized intelligence tests and the Big Five Inventory, and ii) news consumption as a mediating variable between individual characteristics and performance in the Fake and True News Test. Results showed that fluid intelligence was negatively correlated with believing fake news (the association did not remain significant in a regression model); crystallized intelligence was negatively linked to misclassifying true news. Extraversion was negatively and crystallized intelligence was positively associated with fake and true news discernment. The number of different news sources consumed correlated negatively with misclassifying true news and positively with fake and true news discernment. However, no meaningful mediation effect of news consumption was observed. Only interpersonal trust was negatively related to misclassifying both fake and true news as well as positively related to news discernment. The present findings reveal that underlying factors of believing fake news and misclassifying true news are mostly different. Strategies that might help to improve the abilities to identify both fake and true news based on the present findings are discussed.
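The "discernment" scores reported in this and similar studies are commonly operationalized as the difference between accuracy judgments for true and fake items; a minimal sketch under that assumption, with hypothetical ratings data (the study's own scoring may differ in detail):

```python
import numpy as np

# Hypothetical 1-4 perceived-accuracy ratings for 10 true and 10 fake headlines
true_ratings = np.array([4, 3, 4, 4, 2, 3, 4, 3, 4, 3])
fake_ratings = np.array([2, 1, 2, 3, 1, 2, 1, 2, 2, 1])

# One common operationalization: discernment = mean(true) - mean(fake);
# higher values indicate better separation of true from fake news.
discernment = true_ratings.mean() - fake_ratings.mean()

# "Believing fake news" is then simply the mean perceived accuracy of fake items,
# and "misclassifying true news" the complement for true items.
belief_in_fake = fake_ratings.mean()

print(discernment, belief_in_fake)
```

Under this scoring, a correlate (e.g., crystallized intelligence) can relate differently to believing fake news than to misclassifying true news, which is exactly the dissociation the abstract reports.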
... Because of the strong interests and attitudes associated with the issue, it is possible that individuals were strongly motivated to seek information that aligns with their opinions in addition to the fact-based news sources (Thorson, 2016). In fact, studies show that individuals tend to overrate the accuracy of news items that fit more closely with their political beliefs and attitudes compared to non-fitting news items (Bago et al., 2020; Sindermann et al., 2020). Research on motivated reasoning consistently shows that individuals develop negative affect and render snap judgments when they encounter new information that contradicts their preexisting attitudes, as defensive processing to maintain their position. ...
... We did not directly examine whether heuristic or central processing occurred when participants were exposed to the fake news label. While our findings are consistent with previous studies that explain the intuitive judgment of information quality and credibility (Bago et al., 2020;Sindermann et al., 2020), we did not directly observe participants' time spent in making these judgments and whether careful central processing may change the discounting impact of fake news labels. As the discussion at the beginning of this study suggests, fake news often contains nuance and complexity, including politically motivated intentions, emotive language, and strategic reframing of facts -rather than their outright dismissal (Bakir & McStay, 2018;Lewandowsky et al., 2013;Southwell et al., 2018). ...
Article
Using a mixed-design online experiment, this study examined how individuals determine the quality of information they encounter online and engage in information verification and authentication processes. An online experiment tested the effects of “fake news” labels as discounting cues on individuals’ ability to correctly identify disinformation and their motivations to authenticate it with other credible sources. Results showed main effects of this “fake news” cue in online comments on participants’ accuracy in identifying fake news, need to authenticate the information, and their reliance on legacy news channels to do so.
... Even if what is reported is false, having read it repeatedly increases the likelihood of judging it to be true, an effect these researchers have termed 'illusory truth'. Cognitive Dissonance Theory further explains that people tend to classify as false, and to reject, news that runs contrary to their beliefs, as a way of reducing the discomfort generated by the conflict between the (dis)information and their values, even when the news is true (Festinger, 1957; Sindermann et al., 2020). ...
... While information manipulation theory (McCornack et al., 2014) contributes to understanding how disinformation is fabricated, the theories of echo chambers (Cardenal et al., 2019; Munson & Resnick, 2010) and filter bubbles (Pariser, 2011) speak to the tendency to share (dis)information that matches the interests and values of the users who consume it. Cognitive dissonance theory (Festinger, 1957; Sindermann et al., 2020), in turn, complements these by explaining how people decide what to share or not: either by (1) adding (seeking out) consonant information, (2) trivializing or undervaluing the consonant attitudes of the person experiencing the dissonance, or (3) changing their stance or attitude toward the (dis)information they consume. ...
Article
Introduction: Producers of disinformation and fake news find in fear, in the uncertainty of pandemic times, and in virtual social networks facilitators for their spread, making detection harder for experts and laypeople alike. Typologies designed for identifying and classifying hoaxes allow them to be analyzed from theoretical perspectives such as echo chambers, filter bubbles, information manipulation, and cognitive dissonance. Method: A content analysis was performed on 371 fake news items previously verified by fact-checkers. After an intercoder reliability test, the hoaxes were classified according to their type, intent, main topic, the networks on which they circulated, the deception technique, the country of origin, and their transnational character, among other variables. Results: The most common hoax intent was ideological, associated with topics such as false announcements by governments, organizations, or public figures, and with the false-context technique of fabrication. A quarter of the hoaxes analyzed recurred in several countries, mainly promoting false cures through fabricated content as a deception technique. Discussion and Conclusions: Disinformation is a phenomenon of manipulation and filtering based on the ideological and emotional affinity shared by those who circulate hoaxes. (Dis)information that converges with its users' interests spreads indiscriminately and travels across borders with slight modifications, without this affecting its acceptance and recirculation.
... For example, RWA has been associated with preference for right-leaning parties (Beierlein et al., 2014;Sindermann et al., 2020b), or negative attitudes/prejudice towards immigrants in German samples (Beierlein et al., 2014). In addition, previous studies suggest that individuals tend to overrate the accuracy of news items fitting with their political opinions but underrate the accuracy of non-fitting news items (Allcott and Gentzkow, 2017;Anthony and Moulding, 2019;Bago et al., 2020;Sindermann et al., 2020a); however, this does not seem to be due to motivated reasoning (Bago et al., 2020;Pennycook and Rand, 2019;Sindermann et al., 2020a). As the focus of the present study was not on political news headlines with, for example, left-versus right-leaning content, it is not surprising that neither RWA nor SDO were strongly associated with any of the variables derived from the present Fake and True News Test. ...
Preprint
Individual differences in personality help to explain various human behaviors. With regard to fake news susceptibility, previous work has investigated individual characteristics mostly with respect to believing fake news, but little is known about misclassifying true news as fake. However, identifying both fake and true news is an important prerequisite for unbiased belief formation. We conducted an online study with N = 530 participants (n = 396 men) investigating results on a Fake and True News test in light of i) fluid and crystallized intelligence tests and the Big Five Inventory, and ii) news consumption. Results show that fluid intelligence was negatively associated with believing fake news (although the association did not remain significant in a regression model); crystallized intelligence was negatively associated with misclassifying true news. Extraversion was negatively, and crystallized intelligence positively, associated with fake and true news discernment. The number of different news sources consumed was negatively associated with misclassifying true news and positively associated with fake and true news discernment. The present findings reveal differences in the factors underlying believing fake news and misclassifying true news. Protective measures that might help improve the ability to identify both fake and true news are discussed.
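The "fake and true news discernment" measure mentioned in this abstract is commonly operationalized in this literature as the difference between mean perceived-accuracy ratings for true items and for fake items; the exact scoring used in the preprint may differ. A minimal sketch with hypothetical ratings:

```python
# Hedged sketch: a common operationalization of fake/true news discernment,
# not necessarily the exact scoring used in this preprint.
# Participants rate the perceived accuracy of each headline; discernment is
# the mean rating for true items minus the mean rating for fake items.

def discernment(ratings, labels):
    """ratings: perceived-accuracy scores; labels: True = real news, False = fake."""
    true_scores = [r for r, is_true in zip(ratings, labels) if is_true]
    fake_scores = [r for r, is_true in zip(ratings, labels) if not is_true]
    return sum(true_scores) / len(true_scores) - sum(fake_scores) / len(fake_scores)

# Hypothetical participant: rates two true items 4 and 3, two fake items 2 and 1
# on a 1-4 accuracy scale.
ratings = [4, 3, 2, 1]
labels = [True, True, False, False]
print(discernment(ratings, labels))  # 3.5 - 1.5 = 2.0
```

Higher scores indicate better separation of true from fake items; a score of 0 means the participant rated both classes as equally accurate.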
... It has been found, quite consistently, that cognitive reflection, analytical thinking, and basic knowledge about the respective topic (e.g., COVID-19, political news, or climate) help to differentiate fake from real news (Amazeen and Bucy 2019; Bago et al. 2020; Calvillo et al. 2020; Pennycook and Rand 2019; Van Der Linden 2022; Vegetti and Mancosu 2020). Notably, whether people tend to believe news that is in accordance with their own political ideology is still a matter of debate (see Kahan 2012; Pennycook and Rand 2019; Pennycook and Rand 2021; Sindermann et al. 2020). Some studies suggest that it is not the match between political beliefs and the content of news articles that modulates the ability to make correct judgments, but rather the political attitude itself. ...
Article
Full-text available
Xenophobic and right-wing attitudes have become a major issue in Western societies. The present study investigated how such attitudes and stereotypes influence media perception in terms of identifying manipulated news articles. In a fake news paradigm, N = 326 participants provided self-report measures of xenophobia and conservatism, and were presented with real news media articles describing crimes that were committed either by putative German (i.e., in-group) or putative immigrant (i.e., out-group) perpetrators. Half of the articles were manipulated, and the participants were asked to rate the articles with respect to the perceived veracity of the article and the reprehensibility of the described criminal offences. Xenophobia, but not conservatism, was associated with poorer news discernment and higher perceived veracity in the immigrant offender condition, but not in the native German offender condition. Reprehensibility was not differentially associated with xenophobia in the two origin-of-offender conditions. The fake news paradigm revealed an out-group bias with respect to the perceived veracity of media news, and this result offers an alternative to measure stereotypes about immigrants more subtly than by explicit self-report. Xenophobia seems to make people less sensitive to hints that could inform them about the falsehood of information.
... Research intending to distinguish between the two approaches has, so far, been less supportive of the motivated reasoning account (i.e., biased processing due to a preference for reaching a certain conclusion) and more supportive of the classical approach, showing that deception occurs due to a lack of analytical reasoning (Pennycook & Rand, 2019; Sindermann et al., 2020). Moreover, one problem with the existing evidence for motivated reasoning was outlined by Mandel (2014), applying a Bayesian framework. ...
Article
The present research examines the processes involved in how people believe and share news posts on social media. We tested whether the relation between individuals' previous political beliefs and judging the accuracy of, and willingness to share, fake and true news is mediated by epistemic emotional response (surprise and interest) and perceived credibility (trustworthiness, rigor, impartiality). In a within-subjects experiment, we presented ten publications (5 true, 5 fake) with political content, extracted from Facebook, to 259 Portuguese participants. The results showed that fake and true news were processed in a similar way. Emotional response and perceived credibility depended not only on the content but also on previous beliefs. Negative beliefs about the political system increased emotional response to true and false news, which in turn increased perceptions of credibility, leading to higher accuracy attributions and willingness to share news (true or false). The most distinctive difference between participants' interactions with fake and true news was that participants' willingness to share fake news was not entirely explained by emotional response and credibility perceptions. We conclude that people seem to rely on emotional cues, appraised with regard to previous beliefs, and on emotionally biased credibility indicators to guess whether news is true or worth sharing.
... The dark side of the data business model behind social media services has become apparent over time. For instance, design elements such as personalized news feeds might not only lead to addictive use of social media services, but might also create political filter bubbles among some users [14], thus putatively narrowing the world view especially of those users who get news exclusively via social media [42]. This business model also pays little regard to widely treasured notions of privacy and abuses it as a matter of course [38,43]. ...
Article
Full-text available
Social media has captured a large share of the public sphere at a pace far quicker than any other means of communication did in the past but the initial techno-optimism that marked this ascent has recently started giving way to critical assessments of its wide-ranging effects. In this article, we argue that just as there is a need to assess and highlight its many ills, there is also an urgent need to foster and expand discussion on what a healthier version of social media could be. We examine social media from the perspective of its three constituent parts, namely social networks, communication within these networks and the platforms that enable them. Subsequently, we argue that social media as an idea should be reimagined independently of the limited group of platforms that currently monopolize it. To that end, we discuss alternative models such as federated, blockchain-based and public-service social media platforms, and the measures required to ensure a level playing field for their emergence.
... In addition to environmental factors, people's vulnerability to fake news is also influenced by individual characteristics, including cognitive factors (Sindermann et al., 2020). Analytical thinking and an open-minded cognitive style are associated with lower belief in fake news (Bronstein et al., 2021). ...
Article
Full-text available
During the Coronavirus Disease 2019 (COVID-19) pandemic, fake news spread massively on social media. To avoid negative impacts, people are expected to exhibit investigative behavior towards the news they encounter, including conducting fact checks. Previous research has shown that intellectual humility can influence investigative behavior when exposed to fake news about COVID-19. This study aims to examine intellectual humility's ability to predict investigative behavior when dealing with news about the COVID-19 vaccine. The study involved 227 students (157 female and 70 male, M = 21, SD = 1.19) as respondents, selected by convenience sampling. The instruments used to measure the two variables were the General Intellectual Humility Scale, headlines of news articles about the COVID-19 vaccine, and a scale of investigative behavior tendencies towards news about the COVID-19 vaccine. The results showed that intellectual humility predicted investigative behavior towards both fake (B = 0.89; 95% CI [0.62, 1.15], p < 0.001) and factual news headlines (B = 0.87; 95% CI [0.60, 1.15], p < 0.001) about the COVID-19 vaccine. This finding implies that higher intellectual humility is predicted to increase investigative behavior towards news about the COVID-19 vaccine.
... Individual schemas of cognitive processes, along with emotional and behavioral patterns, constitute a more general concept of personality (10). Various personality traits have been postulated to be involved in the way we process information (11), yet there have been very few attempts to explain the role of personality differences in the susceptibility to misinformation (12). It is rather puzzling given that the Big Five personality traits, extraversion, conscientiousness, agreeableness, openness to experience, and neuroticism [the Five-Factor Model (13)], as well as anxiety, understood as a stable personality characteristic (14,15), have the potential to shape humans' perception of truthfulness. ...
Article
Full-text available
Misinformation on social media poses a serious threat to democracy, sociopolitical stability, and mental health. Thus, it is crucial to investigate the nature of cognitive mechanisms and personality traits that contribute to the assessment of news items' veracity, failures in the discernment of their truthfulness, and behavioral engagement with the news, especially if one wants to devise any intervention to stop the spread of misinformation in social media. The current research aimed to develop and test a 4-fold taxonomy classifying people into four distinct phenotypes of susceptibility to (mis)information. In doing so, it aimed to establish differences in cognitive and psychological profiles between these phenotypes. The investigated cognitive processes included sensitivity to feedback, belief updating, and cognitive judgment bias. Psychological traits of interest included the Big Five model, grandiose narcissism, anxiety, and dispositional optimism. The participants completed online surveys that consisted of a new scale designed to classify people into one of four phenotypes of susceptibility to (mis)information, advanced cognitive tests, and reliable psychological instruments. The four identified phenotypes, Doubters, Knowers, Duffers, and Consumers, showed that believing in misinformation does not imply denying the truth. In contrast, the numerically largest phenotypes encompassed individuals who were either susceptible (Consumers) or resistant (Doubters), in terms of veracity judgment and behavioral engagement, to any news, regardless of its truthfulness. Significantly less frequent were the phenotypes characterized by excellent and poor discernment of the news' truthfulness (the Knowers and the Duffers, respectively). The phenotypes significantly differed in sensitivity to positive and negative feedback, cognitive judgment bias, extraversion, conscientiousness, agreeableness, emotional stability, grandiose narcissism, anxiety, and dispositional optimism. 
The obtained results constitute a basis for a new and holistic approach in understanding susceptibility to (mis)information as a psycho-cognitive phenotype.
... Until now, much of the work has considered the effect of cognitive factors (Martel et al., 2020). Even without explicitly considering the cognitive factors that have been shown to relate to accuracy in truth assessment (Bronstein et al., 2019, 2021; Sindermann et al., 2020), over 50% of the variance could be explained. ...
Article
Full-text available
More work needs to be done to understand how mental well‐being and interpersonal factors are associated with biases in judging the veracity of true and false political information. Three days before the 2020 U.S. presidential election, 477 participants guessed the veracity of true and false political statements. Interpersonal factors (e.g. high prosociality and a need to belong) and mental health risk factors (e.g. high depressive symptoms and low eudaimonic well‐being) were highly associated with believing false information. Further, positive well‐being was associated with assessing news with a partisan bias. Next, hierarchical regression was used to better understand the combination of factors which best predict accurate judgements. To reduce the chances of overfitting, out‐of‐sample validation was used. 40% of the variance for believing false information was explained by high prosociality and low well‐being. In addition, well‐being mediated the effects of political ideology when assessing the veracity of political information.
... Lastly, we explore the moderating effect of news recipients' pre-existing issue attitudes on the proposed influences of deepfakes and media literacy education. Previous literature has suggested that people often fall for fake news that is consistent with their worldviews (Sindermann, Cooper, and Montag 2020), and such worldview-confirming fake news is harder to correct with post-hoc debunking efforts (Walter et al. 2020). The current study examines whether similar patterns occur for deepfakes and for correction efforts based on signaling theory. ...
Article
With rapid technical advancements, deepfakes (i.e., hyper-realistic fake videos using face swaps) have become more widespread and easier to create, challenging the old notion of "seeing is believing." Despite raised concerns over the potential impacts of deepfakes on people's credibility judgments toward audiovisual evidence in journalism, systematic investigation of the topic has been lacking. This study conducted an experiment (N = 230) that tested (1) how a news article using a deepfake video (vs. a real video) affects news credibility and viral behavioral intentions and (2) whether, based on signaling theory, obtaining knowledge about the low cost of producing deepfakes reduces the impact of deepfake news. Results show that people whose pre-existing attitudes toward controversial issues (abortion, marijuana legalization) are congruent with the advocated position of a news article are as likely to believe and be willing to share deepfake news as real video news. In addition, educating participants about the low cost of producing deepfakes was effective in reducing the credibility and viral behavioral intention of deepfake news for those who have congruent issue attitudes. This study provides evidence for differing levels of susceptibility to deepfake news and the importance of media literacy education regarding deepfakes that would prevent biased reasoning.
... The BLINCS approach may also provide a relevant framework for examining important contemporary matters such as how we tell "fake news" from "real news." This topic is a growing issue of concern, particularly in political and public health contexts (e.g., Coscia & Rossi, 2020;Lazer et al., 2018;Redelmeier & Shafir, 2020;Sindermann, Cooper, & Montag, 2020) and is indicative of the influence of factors such as attitudes, beliefs, lack of knowledge, attribution biases, and fears in our estimation of the reality-fiction distinction. The BLINCS model allows the formulation of specific hypotheses, such as whether the lighter the degree of inference evoked in a particular piece of news, the more likely it is to be relegated to the "fiction" category. ...
Article
The human ability to tell apart reality from fiction is intriguing. Through a range of media, such as novels and movies, we are able to readily engage in fictional worlds and experience alternative realities. Yet even when we are completely immersed and emotionally engaged within these worlds, we have little difficulty in leaving the fictional landscapes and getting back to the day-to-day of our own world. How are we able to do this? How do we acquire our understanding of our real world? How is this similar to and different from the development of our knowledge of fictional worlds? In exploring these questions, this article makes the case for a novel multilevel explanation (called BLINCS) of our implicit understanding of the reality–fiction distinction, namely that it is derived from the fact that the worlds of fiction, relative to reality, are bounded, inference-light, curated, and sparse.
... HOFN) (Williams et al., 2017). In fact, people who perceive themselves to be more in control are more likely to utilize analytic thinking for online news (Sindermann et al., 2020; Zhou et al., 2012), which is negatively associated with trust in the news because of its low perceived accuracy. Conversely, the perception of a loss of control often drives people to make efforts to increase their control over a situation (Alonso-Ferres et al., 2020; Liu et al., 2021). ...
Article
Purpose Health-related online fake news (HOFN) has become a major social problem. HOFN can lead to the spread of ineffective and even harmful remedies. The study aims to understand Internet users' responses to HOFN during the coronavirus (COVID-19) pandemic using the protective action decision model (PADM). Design/methodology/approach The authors collected pandemic severity data (regional number of confirmed cases) from government websites of the USA and China (Studies 1 and 2), search behavior from Google and Baidu search engines (Studies 1 and 2) and data regarding trust in two online fake news stories from two national surveys (Studies 2 and 3). All data were analyzed using a multi-level linear model. Findings The research detected negative time-lagged relationships between pandemic severity and regional HOFN search behavior by three actual fake news stories from the USA and China (Study 1). Importantly, trust in HOFN served as a mediator in the time-lagged relationship between pandemic severity and search behavior (Study 2). Additionally, the relationship between pandemic severity and trust in HOFN varied according to individuals' perceived control (Study 3). Originality/value The authors' results underscore the important role of PADM in understanding Internet users' trust in and search for HOFN. When people trust HOFN, they may seek more information to implement further protective actions. Importantly, it appears that trust in HOFN varies with environmental cues (regional pandemic severity) and with individuals' perceived control, providing insight into developing coping strategies during a pandemic.
... Openness, in contrast, refers to the urge for experiences as well as the tendency toward cognitive exploration (Kaufman et al., 2016), and is associated with more effortful information seeking, while those low in openness have been found to prefer the confirmation of familiar information (Heinström, 2003). Sindermann et al. (2020) in a recent review suggest that openness should act as a buffer against fake news belief, and some research seems to support this, as higher openness has been found to be associated with being better at discerning fake from real news (Heinström, 2003;Calvillo et al., 2021) and lower susceptibility to misinformation (Doughty et al., 2017). However, Wolverton and Stevens (2019) found the exact opposite, that participants who scored low on openness were better at identifying false information than those who scored high, while Sindermann et al. (2021) found no major role of openness explaining any tendencies of fake news discernment. ...
Article
Full-text available
Accessing information online is now easier than ever. However, false information is also circulated in increasing quantities. We sought to identify social psychological factors that could explain why some people are more susceptible to false information. Specifically, we investigated whether psychological predispositions (social dominance orientation, right-wing authoritarianism, system justification beliefs (SJB), openness, need for closure, conspiracy mentality), competencies (scientific and political knowledge, interest in politics), or motivated reasoning based on social identity (political orientation) could help explain who believes fake news. Hungarian participants (N = 295) judged political (anti- and pro-government) and non-political news. The Hungarian context, characterized by low trust in media, populist communication by the government, and increasing polarization, should be fertile ground for the proliferation of fake news. What makes this case particularly interesting is that the major political fault line in Hungary runs between pro- and anti-government supporter groups and not, for instance, between conservative and liberal ideology or partisanship. We found clear support for the motivated reasoning explanation, as political orientation consistently predicted belief in both fake and real political news when their contents aligned with one's political identity. Belief in pro-government news was also associated with higher SJB among pro-government supporters. Those interested in politics showed a better capacity to distinguish real political news from fake ones. Most importantly, the only psychological predisposition that consistently explained belief in all types of fake news was a conspiracy mentality.
This supports the notion of ideological symmetry in fake news belief—where a conspiracy mentality can be found across the political spectrum, and it can make people susceptible to disinformation regardless of group-memberships and other individual differences.
... Such non-scientific sources often use scientific-looking claims to encourage readers to accept the information given (Health Feedback, 2021a, 2021b; Spencer, 2020). The articles rely on emotive language and on the audience's natural tendency to believe information from sources outside the mainstream (Garrett & Weeks, 2017; O'Brien et al., 2018; Sindermann et al., 2020). ...
Article
Full-text available
This paper seeks to understand the phenomenon of fake news and its particular application in the area of medical misinformation. A review of the literature shows that there is an increase in the amount of fake news and medical misinformation available, especially through social media channels. The paper examines the rise of fake news and how a particular section of this, medical misinformation, has increased over the past several years. The research shows that the increase in medical misinformation has led to an increase in patient harm and has even led to death. By completing a comprehensive review of the literature on medical misinformation, the authors show that such misinformation is spread quickly and at little cost to the producer/disseminator. The motivations behind the production and dissemination of medical misinformation are examined, as they are not based on financial factors alone. The authors also examine how medical misinformation has changed public perception in a number of cases, leading to large-scale outbreaks of infectious disease, poisonings, and other harmful outcomes. At a time when massive vaccination attempts are being made due to the current COVID-19 pandemic, the dissemination of medical misinformation could hamper public health efforts. Although, at the current time (2021), a large amount of the reporting and research on medical misinformation is based around COVID-19, the authors have attempted to address medical misinformation more generally. The impact of social media on the spread of medical misinformation is also examined. The authors show that social media is one of the main channels for the dissemination of medical misinformation, in no small part due to the very low cost and the speed with which information can be shared with millions.
Finally, the paper suggests that policymakers, social media leaders, journalists, researchers, scientists, and medical providers must all work quickly to challenge this and to put in place robust solutions to reduce the spread of medical misinformation.
... In recent years, great attention has been paid to studying the (a)symmetries between the motivated social cognition of conservatives and liberals regarding politically salient information (Harper & Baguley, n.d.; Jost et al., 2003; Sindermann et al., 2020). While original work highlighted conservatism as a driver of motivated social cognition (Jost et al., 2003) and evidence pointed to conservatives being more influenced by source similarity than liberals (Jost et al., 2018), recent evidence suggests that both sides of the political spectrum have a tendency to regard politically discordant news as illegitimate and fake (Harper & Baguley, n.d.), potentially due to a general tendency of individuals to associate politically incongruent media sources themselves with the term "fake news" (van der Linden et al., 2020), thereby delegitimizing any news they may present. ...
Article
The viral spread of misinformation poses a threat to societies around the world. Recently, researchers have started to study how motivated reasoning about news content influences misinformation susceptibility. However, because the importance of source credibility in the persuasion process is well documented, and given that source similarity contributes to credibility evaluations, this raises the question of whether individuals are more susceptible to misinformation from ideologically congruent news sources because they find them to be more credible. In a large between-subjects pilot (N = 656) and a pre-registered online mixed-subjects experiment with a US sample (N = 150) using simulated social media posts, we find clear evidence that both liberals and conservatives judge misinformation to be more accurate when the source is politically congruent, and that this effect is mediated by perceived source credibility. We show that source effects play a greater role in veracity judgements for liberals than conservatives, but that individuals from both sides of the spectrum judge politically congruent sources as less slanted and more credible. These findings add to our current understanding of source effects in online news environments and provide evidence for the influential effect of perceived source similarity and perceived credibility in misinformation susceptibility.
... However, reflecting upon this question reveals that the correct response is 5 cents. Better performance on the CRT predicts correct responses to many thinking and reasoning tasks (Toplak et al., 2011), lower levels of epistemically suspect beliefs (Pennycook et al., 2015a), and less belief in and sharing of misinformation (Bronstein et al., 2019; Pennycook & Rand, 2019a, 2019b; Ross et al., 2021; Sindermann et al., 2020). ...
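The question referenced above is the classic bat-and-ball item from the CRT: a bat and a ball cost $1.10 in total, and the bat costs $1.00 more than the ball. The intuitive answer of 10 cents violates the stated constraint (the bat would then cost $1.10, for a total of $1.20); solving for the ball's price b:

\begin{aligned}
b + (b + 1.00) &= 1.10 \\
2b &= 0.10 \\
b &= 0.05
\end{aligned}

Reaching the correct 5-cent answer requires overriding the intuitive response, which is exactly what the CRT is designed to measure.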
Article
The classical account of reasoning posits that analytic thinking weakens belief in COVID-19 misinformation. We tested this account in a demographically representative sample of 742 Australians. Participants completed a performance-based measure of analytic thinking (the Cognitive Reflection Test) and were randomized to groups in which they either rated the perceived accuracy of claims about COVID-19 or indicated whether they would be willing to share these claims. Half of these claims were previously debunked misinformation, and half were statements endorsed by public health agencies. We found that participants with higher analytic thinking levels were less likely to rate COVID-19 misinformation as accurate and were less likely to be willing to share COVID-19 misinformation. These results support the classical account of reasoning for the topic of COVID-19 misinformation and extend it to the Australian context.
... Our primary prediction was that giving people a warning before false information would be the most effective in promoting disbelief in information initially. If the pre-warning prompts people to think more deliberately or analytically about the headline as they read it (Sindermann et al., 2020), then they may also show less difference between fake news that supports or opposes their political allegiances (a reduced political congruency effect). Our core research question was how these warnings would fare over time. ...
Article
Full-text available
Politically oriented “fake news”—false stories or headlines created to support or attack a political position or person—is increasingly being shared and believed on social media. Many online platforms have taken steps to address this by adding a warning label to articles identified as false, but past research has shown mixed evidence for the effectiveness of such labels, and many prior studies have looked only at either short-term impacts or non-political information. This study tested three versions of fake news labels with 541 online participants in a two-wave study. A warning that came before a false headline was initially very effective in both discouraging belief in false headlines generally and eliminating a partisan congruency effect (the tendency to believe politically congenial information more readily than politically uncongenial information). In the follow-up survey two weeks later, however, we found both high levels of belief in the articles and the re-emergence of a partisan congruency effect in all warning conditions, even though participants had known just two weeks ago the items were false. The new pre-warning before the headline showed some small improvements over other types, but did not stop people from believing the article once seen again without a warning. This finding suggests that warnings do have an important immediate impact and may work well in the short term, though the durability of that protection is limited.
... The apparently vast amount and heterogeneity of recent empirical research output addressing the antecedents to people's belief in fake news calls for integrative work summarizing and mapping the newly generated findings. We are aware of a single review article published to date synthesizing empirical findings on the factors of individuals' susceptibility to believing fake news in political contexts, a narrative summary of a subset of relevant evidence [21]. In order to systematically survey the available literature in a way that permits both transparency and sufficient conceptual breadth, we employ a scoping review methodology, most commonly used in medical and public health research. ...
Article
Full-text available
Background Proliferation of misinformation in digital news environments can harm society in a number of ways, but its dangers are most acute when citizens believe that false news is factually accurate. A recent wave of empirical research focuses on factors that explain why people fall for the so-called fake news. In this scoping review, we summarize the results of experimental studies that test different predictors of individuals’ belief in misinformation. Methods The review is based on a synthetic analysis of 26 scholarly articles. The authors developed and applied a search protocol to two academic databases, Scopus and Web of Science. The sample included experimental studies that test factors influencing users’ ability to recognize fake news, their likelihood to trust it or intention to engage with such content. Relying on scoping review methodology, the authors then collated and summarized the available evidence. Results The study identifies three broad groups of factors contributing to individuals’ belief in fake news. Firstly, message characteristics—such as belief consistency and presentation cues—can drive people’s belief in misinformation. Secondly, susceptibility to fake news can be determined by individual factors including people’s cognitive styles, predispositions, and differences in news and information literacy. Finally, accuracy-promoting interventions such as warnings or nudges priming individuals to think about information veracity can impact judgements about fake news credibility. Evidence suggests that inoculation-type interventions can be both scalable and effective. We note that study results could be partly driven by design choices such as selection of stimuli and outcome measurement. Conclusions We call for expanding the scope and diversifying designs of empirical investigations of people’s susceptibility to false information online. 
We recommend examining digital platforms beyond Facebook, using more diverse formats of stimulus material and adding a comparative angle to fake news research.
... Fake news is spreading across all spheres: science [32][33][34]; art [35,36]; finance [37][38][39]; marketing [40][41][42]; politics [43][44][45]; security [46][47][48]; defense [49,50]; civil protection [51][52][53]; and health [54][55][56][57], including recent sensitive topics such as vaccines [58,59] and COVID-19 [59][60][61][62]. "Fake news" was even named the word of the decade for 2011-2020 by the Macquarie Dictionary. The spread of the fake news phenomenon has a serious negative impact on individuals and society [12], as it can break the authenticity balance of the news ecosystem, intentionally persuade consumers to accept biased or false beliefs, and change the way people interpret and respond to real news. ...
Article
Full-text available
In recent years, we have witnessed a rise in fake news, i.e., provably false pieces of information created with the intention of deception. The dissemination of this type of news poses a serious threat to cohesion and social well-being, since it fosters political polarization and the distrust of people with respect to their leaders. The huge amount of news that is disseminated through social media makes manual verification unfeasible, which has promoted the design and implementation of automatic systems for fake news detection. The creators of fake news use various stylistic tricks to promote the success of their creations, with one of them being to excite the sentiments of the recipients. This has led to sentiment analysis, the part of text analytics in charge of determining the polarity and strength of sentiments expressed in a text, to be used in fake news detection approaches, either as a basis of the system or as a complementary element. In this article, we study the different uses of sentiment analysis in the detection of fake news, with a discussion of the most relevant elements and shortcomings, and the requirements that should be met in the near future, such as multilingualism, explainability, mitigation of biases, or treatment of multimedia elements.
... There is no label that can be applied to such conversations happening after the point of initial delivery. Consumers with stronger analytical and critical thinking skills are observed to be more likely to perform the necessary research to verify the accuracy of misinformation even when it conforms with their previously held beliefs [76]. Given this, the proposed labeling system's effectiveness and impact will vary, potentially significantly, based on the types of content that it is applied to and the other interactions (and level of interactions) between members of the public outside of online news consumption. ...
Article
Full-text available
So-called ‘fake news’—deceptive online content that attempts to manipulate readers—is a growing problem. A tool of intelligence agencies, scammers and marketers alike, it has been blamed for election interference, public confusion and other issues in the United States and beyond. This problem is made particularly pronounced as younger generations choose social media sources over journalistic sources for their information. This paper considers the prospective solution of providing consumers with ‘nutrition facts’-style information for online content. To this end, it reviews prior work in product labeling and considers several possible approaches and the arguments for and against such labels. Based on this analysis, a case is made for the need for a nutrition facts-based labeling scheme for online content.
... Indeed, social media repeatedly propose content that users already expect to read and that confirms their pre-existing beliefs. As a matter of fact, some psychological studies (Sindermann et al. 2020) have shown that, especially in the political sphere, fake news is often consistent with users' pre-existing attitudes. In addition to emotional appeal, the misleading use of images is a characteristic feature of fake news. ...
Conference Paper
Purpose. Fake news is not a new phenomenon, and its spread has increased with the advent of social media, but its actual impact remains an open question. From the management literature, it emerges that the consequences of fake news dissemination are crucial for brands, as in this context brands can lose control over their marketing communication strategy. However, literature reviews on fake news are few, and far too little attention has been paid to summarizing the dimensions of fake news across the various research areas. The purpose of this paper is to evaluate existing studies and to provide an appropriate systematic review of the fake news phenomenon. Design/methodology/approach. To achieve this research objective, the systematic review method was adopted. The work was divided into four phases. Phase 1 focused on defining the research questions and the scope of the study. Phase 2 concerned the definition of search criteria and the bibliographic research. Phase 3 was dedicated to data extraction and organization. Finally, phase 4 analyzed and synthesized the contributions with the aim of providing a fake news framework. Findings. This review revealed that the fake news topic has been addressed in different research areas. By analyzing 209 journal articles, we identify definitions, dimensions, and theories of fake news, which we present in the form of three clusters of dimensions, i.e., content, appearance, and purpose. Moreover, two main motivations for producing fake news emerged: (1) economic and (2) ideological. Finally, the effects of fake news are presented for both users and companies. Originality. The uniqueness of this paper lies in the fact that no systematic review existed on the analyzed topic. In conclusion, in light of the emerging framework, this research offers new prospects for empirically investigating the fake news phenomenon.
... Here, the classic Social Impact Theory (SIT) by Latané (37) seeks to understand how best to measure the impact of people on a single individual or on several individuals. This theory, originating in the pre-social-media age, gained a lot of visibility with the rise of social media services because, particularly in the age of filter bubbles, fake news, and misinformation campaigns (38,39), it is interesting to understand how individual users on social media are socially influenced by others, for instance, in the area of their (political) attitudes. The SIT postulates three highly relevant factors, called strength, immediacy, and number (of sources), to predict such a social impact. ...
Article
Full-text available
TikTok (in Chinese: DouYin; formerly known as musical.ly) currently represents one of the most successful Chinese social media applications in the world. Since its founding in September 2016, TikTok has seen widespread distribution, in particular, attracting young users to engage in viewing, creating, and commenting on “LipSync-Videos” on the app. Despite its success in terms of user numbers, psychological studies aiming at an understanding of TikTok use are scarce. This narrative review provides a comprehensive overview on the small empirical literature available thus far. In particular, insights from uses and gratification theory in the realm of TikTok are highlighted, and we also discuss aspects of the TikTok platform design. Given the many unexplored research questions related to TikTok use, it is high time to strengthen research efforts to better understand TikTok use and whether certain aspects of its use result in detrimental behavioral effects. In light of user characteristics of the TikTok platform, this research is highly relevant because TikTok users are often adolescents and therefore from a group of potentially vulnerable individuals.
... Shin and Lee (2022) examine how news articles containing a real or Deepfaked video influenced the credibility of, and intentions to share, that story. When there was a match between pre-existing attitudes and the content of the videos, people believed Deepfakes just as much as authentic videos and had higher intentions to share Deepfaked content (a finding consistent with work elsewhere on fake news; e.g., Sindermann et al., 2020). ...
... Our results showed that problem-solving accuracy (as measured by both the CRT and Rebus puzzles) correlates positively with discerning fake from real news, indicating that an individual's willingness to engage in analytic and reflective thinking is associated with a reduced belief in fake news. In line with other studies, we found that individuals who perform better on the CRT (Bronstein et al., 2019; Pennycook and Rand, 2019) and on visual-semantic puzzles (Sindermann et al., 2020) are better able to discern fake from real news. Tackling complicated problems requires continuously reframing and changing the initial representation of a problem to see it in a new light. ...
Article
Full-text available
In times of uncertainty, people often seek out information to help alleviate fear, possibly leaving them vulnerable to false information. During the COVID-19 pandemic, we witnessed a viral spread of incorrect and misleading information that compromised collective actions and public health measures to contain the spread of the disease. We investigated the influence of fear of COVID-19 on social and cognitive factors including believing in fake news, bullshit receptivity, overclaiming, and problem-solving, within two of the populations that have been severely hit by COVID-19: Italy and the United States of America. To gain a better understanding of the role of misinformation during the early height of the COVID-19 pandemic, we also investigated whether problem-solving ability and socio-cognitive polarization were associated with believing in fake news. Results showed that fear of COVID-19 is related to seeking out information about the virus and avoiding infection in the Italian and American samples, as well as a willingness to share real news (COVID and non-COVID-related) headlines in the American sample. However, fear positively correlated with bullshit receptivity, suggesting that the pandemic might have contributed to creating a situation where people were pushed toward pseudo-profound existential beliefs. Furthermore, problem-solving ability was associated with correctly discerning real or fake news, whereas socio-cognitive polarization was the strongest predictor of believing in fake news in both samples. From these results, we concluded that a construct reflecting cognitive rigidity, neglecting alternative information, and black-and-white thinking negatively predicts the ability to discern fake from real news. Such a construct extends also to reasoning processes based on thinking outside the box and considering alternative information such as problem-solving.
... Despite, for example, the GRU's probable emphasis on using machine-translations to support digital psychological operations, the fact that linguistic mistakes have been frequently used to detect and identify their operations indicates technology has fallen short of ambition. For instance, an early 2019 poll conducted by Gallup revealed that US President Donald Trump's job approval rating that year reflected more entrenched political polarisation within the US than previously recorded (Jones, 2019). At the same time, academic research has demonstrated a positive correlation between polarisation and receptivity to 'fake news', such as individuals' propensity to overrate the accuracy of news consistent with their political views (Sindermann, Cooper and Montag, 2020). While Russian influence actors have recently demonstrated the ability to use 'deep fake' technology to create false social media profiles, such as the Internet Research Agency's (IRA) effort to support a covert website through a handful of inauthentic profiles (Macaulay, 2020), cyber security firms were able to quickly identify them. ...
Chapter
Full-text available
While Moscow's willingness to launch cyber operations depends in no small part on how the Russian leadership interprets geopolitics, resources and personnel determine the ability to conduct them. Russia has demonstrated a capacity to craft sophisticated malware to support operations that range from espionage to disrupting critical infrastructure, to interfering in states' internal affairs through cyber-enabled influence campaigns, but the government still faces difficulties recruiting and retaining the needed technological talent to keep pace with its rivals. While some of the factors inhibiting the growth of Moscow's cyber programme are internal to the organisations tasked with executing them, such as a culture-clash between specialist recruits and the bureaucracy, the most significant impediments are exoge-nous to them and include brain-drain and the health of Russia's economy. Moscow's litany of perceived adversaries in cyberspace ensures continuous efforts by the state to prevent the emigration of computer science and IT specialists and expand the ranks of those serving Russia's offensive and defensive cyber capabilities. As evolving technologies like artificial intelligence and quantum computing carry implications for future cyber operations, Moscow's ability to marshal its resources to remain competitive in a furtive digital arms race similarly depends on many of these factors. This chapter aims to address key questions arising from the probable gap that separates Russian cyber personnel and capabilities, especially technological innovation, from its ambitions and what effect this disparity might have on future state-backed cyber campaigns. It starts by accounting for different factors that affect the ability of Russia's military and security services to successfully expand recruiting and support technological innovation related to cyber operations. 
This is followed by an examination of various initiatives and strategies that Russian agencies have introduced to address Russia's cyber limitations and cultivate technological innovation. Finally, it discusses how Russia's current official policies and informal practices are likely to affect the nature of its cyber operations in the future and to what extent NATO and its members can leverage these limitations to achieve desired effects in the Alliance's cyber security efforts.
... A cognitive style is an individual's preferred approach for perceiving, processing, and remembering information (Zhang and Sternberg 2006). Evidence suggests a reflexive (or 'Type 1'; Evans and Stanovich 2013; Kahneman 2011; Ross et al. 2016), rather than a reflective ('Type 2'), cognitive style is associated with the formation and maintenance of various implausible beliefs (Bronstein et al. 2019; Greene and Murphy, this issue; Pennycook et al. 2015a; Pennycook et al. 2015b; Pennycook and Rand 2020; Sindermann et al. 2020). A reflexively open-minded cognitive style describes a 'lazy' approach to decision-making, whereby a broad range of claims are uncritically accepted, irrespective of their epistemic value (Pennycook and Rand 2020). ...
Article
Full-text available
Past research suggests that an uncritical or ‘lazy’ style of evaluating evidence may play a role in the development and maintenance of implausible beliefs. We examine this possibility by using a quasi-experimental design to compare how low- and high-quality evidence is evaluated by those who do and do not endorse implausible claims. Seven studies conducted during 2019–2020 provided the data for this analysis ( N = 746). Each of the seven primary studies presented participants with high- and/or low-quality evidence and measured implausible claim endorsement and evaluations of evidence persuasiveness (via credibility, value, and/or weight). A linear mixed-effect model was used to predict persuasiveness from the interaction between implausible claim endorsement and evidence quality. Our results showed that endorsers were significantly more persuaded by the evidence than non-endorsers, but both groups were significantly more persuaded by high-quality than low-quality evidence. The interaction between endorsement and evidence quality was not significant. These results suggest that the formation and maintenance of implausible beliefs by endorsers may result from less critical evidence evaluations rather than a failure to analyse. This is consistent with a limited rather than a lazy approach and suggests that interventions to develop analytical skill may be useful for minimising the effects of implausible claims.
... Such research is important, given prior unproven myths regarding online socialization replacing offline social contact [discussed in Ref. 13]. Other authors, however, review research on adverse consequences of computer technology use, including excessive internet and gaming use [14][15][16], adverse effects of interruptive notifications [17], cyberbullying [18], fake news and filter bubbles [19], and challenges to and concern with our online privacy and security [20,21]. These reviews should remind us that new technology is not inherently good or bad, but it is how we use such technology that determines whether it will have positive or adverse consequences. ...
Article
Given the complexity of countering the spread of fake news and conspiracy theories, past research has started investigating novel pre-emptive strategies, such as inoculation and prebunking. In the present research, we tested whether counterfactual thinking can be employed as a prebunking strategy to prompt critical consideration of fake news spread online. In two experiments, we asked participants to read or generate counterfactuals on the research and development of COVID-19 treatments, and then to evaluate the veridicality and plausibility of a fake news headline related to the topic. Participants' conspiracy mentality was also measured. Among participants with higher levels of conspiracy mentality, those exposed to counterfactual prebunking rated the fake news headline as less plausible than those in the control condition (Study 1) and those exposed to another type of prebunking, namely forewarning of the existence of misinformation (Study 2). The counterfactual prebunking strategy also induced less reactance than the alternative one. Discussion focuses on the development of new strategies to prevent the spread of misinformation, and the conditions under which these strategies may be successful.
Article
Increasing misinformation spread poses a threat to older adults but there is little research on older adults within the fake news literature. Embedded in the Changes in Integration for Social Decisions in Aging (CISDA) model, this study examined the role of (a) analytical reasoning; (b) affect; (c) news consumption frequency, and their interplay with (d) news content on news veracity detection in aging. Conducted during the early phase of the COVID-19 pandemic, the present study asked participants to view and evaluate COVID or non-COVID (i.e., everyday) news articles, followed by measures of analytical reasoning, affect, and news consumption frequency. News veracity detection was comparable between young and older adults. Additionally, fake news detection for non-COVID news was predicted by individual differences in analytic reasoning for both age groups. However, chronological age effects in fake news detection emerged within the older adult sample and interacted with the CISDA-derived components of analytical reasoning, affect, and news consumption frequency by news content. Collectively, these findings suggest that age-related vulnerabilities to deceptive news are only apparent in very old age. Our findings advance understanding of psychological mechanisms in news veracity detection in aging.
Article
Purpose Coronavirus disease 2019-related fake news consistently appears on social media. This study uses appraisal theory to analyze the impact of such rumors on individuals' emotions, motivations, and intentions to share fake news. Furthermore, the concept of psychological distance and construal level theory are used in combination with appraisal theory to compare toilet paper shortages and celebrity scandal rumors. Design/methodology/approach Data collected from 299 Taiwanese respondents to 150 toilet paper shortage-related and 149 celebrity gossip-related questionnaires were processed using partial least squares regression and multigroup analysis. Findings In both cases, surprise is felt most intensely. However, unlike in the celebrity fake news scenario, worry plays a prominent role in driving the altruistic sharing motivation related to the toilet paper shortage rumor. Furthermore, while emotional attributes (basic or self-conscious, concrete, or abstract) serve as a guide for how emotions change with psychological distance, the degree to which an emotion is relevant to the fake news context is key to its manifestation. Originality/value This study examines the impact of individuals' emotions on their motivations and intention to share fake news, applying the appraisal theory and the psychological distance concept in a single study to fake news sharing intention. It evaluates the relationship between psychological distance and emotions, revealing that it is not absolute and need not necessarily shift according to psychological distance change; rather, the relationship is context-sensitive.
Preprint
Trust is crucial for successful social interaction across the lifespan. Perceiver age, facial age and facial emotion have been shown to influence trustworthiness perception, but the complex interplay between these perceiver and facial characteristics has not been examined. Adopting an adult lifespan developmental approach, 199 adults (aged 22-78 years) rated the trustworthiness of faces that systematically varied in age (young, middle-aged, older) and emotion (neutral, happy, sad, fearful, angry, disgusted) from the FACES Lifespan Database. The study yielded three key results. First, on an aggregated level, facial trustworthiness perception did not differ by perceiver age. Second, all perceivers rated young faces as most trustworthy; and middle-aged and older (but not young) perceivers rated older faces as least trustworthy. Third, facial emotions signaling threat (fear, anger, disgust) relative to neutral, happy, and sad expressions, moderated age effects on facial trustworthiness perception. Findings from this study highlight the impact of perceiver and facial characteristics on facial trustworthiness perception in adulthood and aging and have potential to inform first impression formation, with effects on trait attributions as well as behavior. This publication also provides normative data on perceived facial trustworthiness for the FACES Lifespan Database.
Article
Full-text available
Objective This study aimed to explore the British public’s healthcare-seeking beliefs concerning eye symptoms, and assess how the first COVID-19 lockdown influenced these. Methods and analysis An anonymous web-based survey was disseminated through mailing lists and social media between June and August 2020. The survey sought participants’ views on the severity and urgency of the need for medical review for four ophthalmic and two general medical scenarios on a five-point scale. Participants were asked to answer questions twice: once ignoring the COVID-19 pandemic, and once taking this into account, with additional questions asked to identify factors influencing the decision to seek medical attention and ward admission. Results A total of 402 participants completed the survey (mean age 61.6 years, 63.1% female and 87.7% of white ethnicity). Scores for symptom severity and urgency of medical review increased significantly with the severity of the clinical scenario (both p<0.001). However, participants gave significantly lower scores for the urgency of medical attention when accounting for the COVID-19 pandemic (compared with no pandemic) for all scenarios (all p<0.001). Younger age, greater deprivation and non-white ethnicity were correlated with a lower perception of seriousness and urgency of medical attention. Conclusions During the first UK lockdown of the COVID-19 pandemic, reduced urgency of medical review for ocular and systemic pathologies was reported in response to the pandemic, which represents a barrier to healthcare-seeking behaviour. This has the potential to critically delay medical review and timely management, negatively impacting patient outcomes.
Chapter
In the information overload era, we need to be conscious of the dissemination of incoherent and misleading content in both traditional and social media. It is a problem that has worsened recently and has drawn the attention of governments worldwide. So-called fake news has gained notoriety due to the popularization and rapid consumption of online news. The democratization of internet access has brought about an increase in the independent production and consumption of a variety of unverified information content, which is also spread on a large scale. Because this production capacity far exceeds that of fact-checking agencies, support from systems for the automatic detection of this type of content becomes necessary. Therefore, in this article, we propose a linguistic-structure analysis approach with named-entity recognition to identify fake news. By applying our approach, we can identify linguistic structures that distinguish articles produced and verified by professional news agencies from false and sensationalist information. In this regard, we present a linguistic analysis system with 90% average identification accuracy, surpassing the state of the art for this type of content on literature datasets.
Article
By March 2021, the SARS-CoV-2 virus has been responsible for over 115 million cases of COVID-19 worldwide, resulting in over 2.5 million deaths. As the virus grew exponentially, so did its media coverage, resulting in a proliferation of conflicting information on social media platforms - a so-called "infodemic." In this mixed scoping review, we survey past literature investigating the role of automated accounts, or "bots," in spreading such misinformation, drawing connections to the COVID-19 pandemic. We also review strategies used by bots to spread (mis)information and examine potential origins of bots. We conclude by conducting and presenting a secondary analysis of known bot datasets in which we find that up to 66% of bots are discussing COVID-19. The proliferation of COVID-19 (mis)information by bots, coupled with human susceptibility to believing and sharing misinformation, may well impact the course of the pandemic.
Article
Full-text available
This study investigated potential effects of demographics, personality, and ideological attitudes on the number of news sources consumed. The number of news sources consumed, in turn, was seen as an inverse proxy for the susceptibility to be caught in "filter bubbles" and/or "echo chambers" (online), which are hotly discussed topics in politics as well. A sample of 1,681 participants (n = 557 males) provided data on demographics, the Big Five, and Right-Wing Authoritarianism (RWA), alongside the number of different news sources consumed and current voting preferences. Results showed that age (positively), gender (higher in males), Openness (positively), and RWA (negatively) predicted the number of different news sources consumed. The group of participants consuming news exclusively offline showed the highest scores in Conscientiousness and the lowest scores in Neuroticism compared to the "news feeds only" and the "news feeds and online" groups. However, less than 5% of the participants exclusively consumed news via news feeds of social networking sites. Participants who stated that they would not vote reported the lowest number of different news sources consumed. These findings reveal first insights into predisposing factors for the susceptibility to be caught in "filter bubbles" and/or "echo chambers" online and how this might be associated with voting preferences.
Article
Full-text available
In this paper, we address the question of whether disinforming news spread online possesses the power to change the prevailing political circumstances during an election campaign. We highlight factors for believing disinformation that until now have received little attention, namely trust in news media and trust in politics. A panel survey in the context of the 2017 German parliamentary election (N = 989) shows that believing disinforming news had a specific impact on vote choice by alienating voters from the main governing party (i.e., the CDU/CSU), and driving them into the arms of right-wing populists (i.e., the AfD). Furthermore, we demonstrate that the less one trusts in news media and politics, the more one believes in online disinformation. Hence, we provide empirical evidence for Bennett and Livingston’s notion of a disinformation order, which forms in opposition to the established information system to disrupt democracy.
Article
Full-text available
We examined the relationships among smartphone addiction, social-emotional distress (e.g., anxiety, depression, sleep quality, and loneliness), and personality traits among 150 undergraduate college students. Participants completed the Smartphone Addiction Scale, the Outcome Questionnaire-45.2, the Pittsburgh Sleep Quality Index, the UCLA Loneliness Scale-3, and the Neuroticism-Extraversion-Openness Five-Factor Inventory-3. Results showed that the more students were addicted to their smartphone, the higher their reported social-emotional distress was. Additionally, logistic analyses supported the predictive nature of smartphone addiction on specific domains of social-emotional distress. Personality did not moderate the relationship between smartphone addiction and social-emotional distress. However, neuroticism had a positive relationship with smartphone addiction, while extraversion, openness, agreeableness, and conscientiousness all had a negative relationship with smartphone addiction. Overall, these findings can inform assessment and interventions targeted at reducing smartphone use and improving mental health of college students. Research implications are also provided considering the infancy of studying the effects of smartphone use on psychological well-being.
Article
Full-text available
The spread of online misinformation poses serious challenges to societies worldwide. In a novel attempt to address this issue, we designed a psychological intervention in the form of an online browser game. In the game, players take on the role of a fake news producer and learn to master six documented techniques commonly used in the production of misinformation: polarisation, invoking emotions, spreading conspiracy theories, trolling people online, deflecting blame, and impersonating fake accounts. The game draws on an inoculation metaphor, where preemptively exposing, warning, and familiarising people with the strategies used in the production of fake news helps confer cognitive immunity when exposed to real misinformation. We conducted a large-scale evaluation of the game with N = 15,000 participants in a pre-post gameplay design. We provide initial evidence that people's ability to spot and resist misinformation improves after gameplay, irrespective of education, age, political ideology, and cognitive style.
Article
Full-text available
Based on an extensive literature review, we suggest that ‘fake news’ alludes to two dimensions of political communication: the fake news genre (i.e. the deliberate creation of pseudojournalistic disinformation) and the fake news label (i.e. the instrumentalization of the term to delegitimize news media). While public worries about the use of the label by politicians are increasing, scholarly interest is heavily focused on the genre aspect of fake news. We connect the existing literature on fake news to related concepts from political communication and journalism research, present a theoretical framework to study fake news, and formulate a research agenda. Thus, we bring clarity to the discourse about fake news and suggest shifting scholarly attention to the neglected fake news label.
Chapter
Full-text available
This entry elaborates on style and style guides, the sets of house rules issued by news organizations to manage the use of news style by their journalists and reporters. News media can use these style guides to align themselves with their own journalists and to profile themselves externally towards the audience and other news media. By doing so, style and style guides can be studied as an institutional voice that goes beyond the simple use of words. With continuing changes in society and the ongoing updates to style guides, the question of further research into style guides remains relevant. This entry argues that more research is desirable in order to gain greater insight into this important and continually evolving journalistic instrument.
Article
Full-text available
Objective Fake news represents a particularly egregious and direct avenue by which inaccurate beliefs have been propagated via social media. We investigate the psychological profile of individuals who fall prey to fake news. Method We recruited 1,606 participants from Amazon's Mechanical Turk for three online surveys. Results The tendency to ascribe profundity to randomly generated sentences – pseudo‐profound bullshit receptivity – correlates positively with perceptions of fake news accuracy, and negatively with the ability to differentiate between fake and real news (media truth discernment). Relatedly, individuals who overclaim their level of knowledge also judge fake news to be more accurate. We also extend previous research indicating that analytic thinking correlates negatively with perceived accuracy by showing that this relationship is not moderated by the presence/absence of the headline's source (which has no effect on accuracy), or by familiarity with the headlines (which correlates positively with perceived accuracy of fake and real news). Conclusion Our results suggest that belief in fake news may be driven, to some extent, by a general tendency to be overly accepting of weak claims. This tendency, which we refer to as reflexive open‐mindedness, may be partly responsible for the prevalence of epistemically suspect beliefs writ large.
Article
Full-text available
Objective Politically‐slanted fake news (FN)—manufactured disinformation, hoaxes, and satire appearing to present true information about events—is currently receiving extensive attention in the mainstream media. However, it is currently unclear what factors may influence an individual's likelihood to believe in FN, outside of political identity. As FN is often conspiratorial in nature and usually negative, it was theorised that conspiracist belief, and factors that have been found to relate to a conspiratorial worldview (i.e., dangerous worldview and schizotypy), may also relate to political FN. Method A correlational design (N = 125, M = 27.6, SD = 4.26) was used to examine predictors of FN. Results Political viewpoint was a consistent predictor of FN endorsement. Conspiratorial worldview and schizotypal personality also predicted FN belief, with weaker or less consistent prediction by other variables including dangerous worldview, normlessness, and randomness beliefs. Partial correlation analysis suggested that most variables related to FN through their association with conspiracist ideation and political identity beliefs. Conclusion Prior political beliefs and the tendency for conspiracist ideation appear particularly important for individuals' endorsement of FN, regardless of prior exposure to the specific news presented. As such, conspiracy theory (CT) belief and its underlying mechanisms appear a useful starting point in identifying some of the underlying individual difference variables involved in conspiratorial and non‐conspiratorial FN belief. Implications and limitations are discussed.
Conference Paper
Full-text available
Warning messages are being discussed as a possible mechanism to contain the circulation of false information on social media. Their effectiveness for this purpose, however, is unclear. This article describes a survey experiment carried out to test two designs of warning messages: a simple one identical to the one used by Facebook, and a more complex one informed by recent research. We find no evidence that either design is clearly superior to not showing a warning message. This result has serious implications for brands and politicians, who might find false information about them spreading uncontrollably, as well as for managers of social media platforms, who are struggling to find effective means of controlling the diffusion of misinformation.
Article
Full-text available
Delusion-prone individuals may be more likely to accept even delusion-irrelevant implausible ideas because of their tendency to engage in less analytic and less actively open-minded thinking. Consistent with this suggestion, two online studies with over 900 participants demonstrated that although delusion-prone individuals were no more likely to believe true news headlines, they displayed an increased belief in “fake news” headlines, which often feature implausible content. Mediation analyses suggest that analytic cognitive style may partially explain these individuals’ increased willingness to believe fake news. Exploratory analyses showed that dogmatic individuals and religious fundamentalists were also more likely to believe false (but not true) news, and that these relationships may be fully explained by analytic cognitive style. Our findings suggest that existing interventions that increase analytic and actively open-minded thinking might be leveraged to help reduce belief in fake news.
Article
Full-text available
The 2016 U.S. presidential election brought considerable attention to the phenomenon of “fake news”: entirely fabricated and often partisan content that is presented as factual. Here we demonstrate one mechanism that contributes to the believability of fake news: fluency via prior exposure. Using actual fake-news headlines presented as they were seen on Facebook, we show that even a single exposure increases subsequent perceptions of accuracy, both within the same session and after a week. Moreover, this “illusory truth effect” for fake-news headlines occurs despite a low level of overall believability and even when the stories are labeled as contested by fact checkers or are inconsistent with the reader’s political ideology. These results suggest that social media platforms help to incubate belief in blatantly false news stories and that tagging such stories as disputed is not an effective solution to this problem. It is interesting, however, that we also found that prior exposure does not impact entirely implausible statements (e.g., “The earth is a perfect square”). These observations indicate that although extreme implausibility is a boundary condition of the illusory truth effect, only a small degree of potential plausibility is sufficient for repetition to increase perceived accuracy. As a consequence, the scope and impact of repetition on beliefs is greater than has been previously assumed.
Article
Full-text available
The dynamics and influence of fake news on Twitter during the 2016 US presidential election remains to be clarified. Here, we use a dataset of 171 million tweets in the five months preceding the election day to identify 30 million tweets, from 2.2 million users, which contain a link to news outlets. Based on a classification of news outlets curated by www.opensources.co, we find that 25% of these tweets spread either fake or extremely biased news. We characterize the networks of information flow to find the most influential spreaders of fake and traditional news and use causal modeling to uncover how fake news influenced the presidential election. We find that, while top influencers spreading traditional center and left leaning news largely influence the activity of Clinton supporters, this causality is reversed for the fake news: the activity of Trump supporters influences the dynamics of the top fake news spreaders.
Article
Full-text available
Addressing fake news requires a multidisciplinary effort
Article
Full-text available
Much research in cognitive psychology has focused on the tendency to conserve limited cognitive resources. The CRT is the predominant measure of such miserly information processing, and also predicts a number of frequently studied decision-making traits (such as belief bias and need for cognition). However, many subjects from common subject populations have already been exposed to the questions, which might add considerable noise to data. Moreover, the CRT has been shown to be confounded with numeracy. To increase the pool of available questions and to try to address numeracy confounds, we developed and tested the CRT-2. CRT-2 questions appear to rely less on numeracy than the original CRT but appear to measure closely related constructs in other respects. Crucially, substantially fewer subjects from Amazon's Mechanical Turk have been previously exposed to CRT-2 questions. Though our primary purpose was investigating the CRT-2, we also found that belief bias questions appear suitable as an additional source of new items. Implications and remaining measurement challenges are discussed.
Article
Full-text available
Erroneous beliefs are difficult to correct. Worse, popular correction strategies may backfire and further increase the spread and acceptance of misinformation. People evaluate the truth of a statement by assessing its compatibility with other things they believe, its internal consistency, the amount of supporting evidence, its acceptance by others, and the credibility of the source. To do so, they can draw on relevant details (an effortful analytic strategy) or attend to the subjective experience of processing fluency (a less effortful intuitive strategy). Throughout, fluent processing facilitates acceptance of the statement – when thoughts flow smoothly, people nod along. Correction strategies that make false information more fluent (e.g., through repetition or pictures) can therefore increase its later acceptance. We review recent research and offer recommendations for more effective correction strategies.
Article
Full-text available
The paper explores the use of concepts from cognitive psychology to evaluate the spread of misinformation, disinformation, and propaganda in online social networks. Analysing online social networks to identify metrics that infer cues of deception enables us to measure the diffusion of misinformation. The cognitive process involved in the decision to spread information involves answering four main questions: the consistency of the message, the coherency of the message, the credibility of the source, and the general acceptability of the message. We have used the cues of deception to analyse these questions and obtain solutions for preventing the spread of misinformation. We have proposed an algorithm to effectively detect the deliberate spread of false information, which would enable users to make informed decisions while spreading information in social networks. The computationally efficient algorithm uses the collaborative filtering property of social networks to measure the credibility of sources of information as well as the quality of news items. The proposed methodology has been validated on the online social network 'Twitter'.
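The collaborative-filtering idea described in this abstract can be illustrated with a small sketch in which user trust and item quality are estimated jointly, each refined from the other until they stabilise. This is not the paper's actual algorithm; the fixed-point scheme, function names, and toy voting data below are hypothetical.

```python
# Hypothetical sketch of collaborative-filtering-style credibility scoring.
# ratings maps (user, item) -> 1 (endorsed as true) or 0 (flagged as false).
# Item quality is a trust-weighted mean of votes; a user's trust is the
# average agreement of their votes with current item quality; iterate.

def credibility_scores(ratings, n_iter=20):
    users = {u for (u, i) in ratings}
    items = {i for (u, i) in ratings}
    trust = {u: 1.0 for u in users}    # start by trusting everyone equally
    quality = {i: 0.5 for i in items}  # neutral prior on each news item

    for _ in range(n_iter):
        # Item quality: trust-weighted average of the votes it received.
        for i in items:
            num = sum(trust[u] * v for (u, it), v in ratings.items() if it == i)
            den = sum(trust[u] for (u, it), _ in ratings.items() if it == i)
            quality[i] = num / den if den else 0.5
        # User trust: mean agreement between votes and item quality.
        for u in users:
            agree = [1.0 - abs(v - quality[i])
                     for (uu, i), v in ratings.items() if uu == u]
            trust[u] = sum(agree) / len(agree)
    return quality, trust
```

Under this scheme, a user who consistently disagrees with the trust-weighted consensus loses influence over subsequent quality estimates, which is one plausible reading of "measuring the credibility of sources" via collaborative filtering.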
Article
Full-text available
Citizens are frequently misinformed about political issues and candidates but the circumstances under which inaccurate beliefs emerge are not fully understood. This experimental study demonstrates that the independent experience of two emotions, anger and anxiety, in part determines whether citizens consider misinformation in a partisan or open‐minded fashion. Anger encourages partisan, motivated evaluation of uncorrected misinformation that results in beliefs consistent with the supported political party, while anxiety at times promotes initial beliefs based less on partisanship and more on the information environment. However, exposure to corrections improves belief accuracy, regardless of emotion or partisanship. The results indicate that the unique experience of anger and anxiety can affect the accuracy of political beliefs by strengthening or attenuating the influence of partisanship.
Article
Full-text available
This paper introduces a three-item "Cognitive Reflection Test" (CRT) as a simple measure of one type of cognitive ability--the ability or disposition to reflect on a question and resist reporting the first response that comes to mind. The author will show that CRT scores are predictive of the types of choices that feature prominently in tests of decision-making theories, like expected utility theory and prospect theory. Indeed, the relation is sometimes so strong that the preferences themselves effectively function as expressions of cognitive ability--an empirical fact begging for a theoretical explanation. The author examines the relation between CRT scores and two important decision-making characteristics: time preference and risk preference. The CRT scores are then compared with other measures of cognitive ability or cognitive "style." The CRT scores exhibit considerable difference between men and women and the article explores how this relates to sex differences in time and risk preferences. The final section addresses the interpretation of correlations between cognitive abilities and decision-making characteristics.
Article
What role does deliberation play in susceptibility to political misinformation and "fake news"? The Motivated System 2 Reasoning (MS2R) account posits that deliberation causes people to fall for fake news, because reasoning facilitates identity-protective cognition and is therefore used to rationalize content that is consistent with one's political ideology. The classical account of reasoning instead posits that people ineffectively discern between true and false news headlines when they fail to deliberate (and instead rely on intuition). To distinguish between these competing accounts, we investigated the causal effect of reasoning on media truth discernment using a 2-response paradigm. Participants (N = 1,635 Mechanical Turkers) were presented with a series of headlines. For each, they were first asked to give an initial, intuitive response under time pressure and concurrent working memory load. They were then given an opportunity to rethink their response with no constraints, thereby permitting more deliberation. We also compared these responses to a (deliberative) 1-response baseline condition where participants made a single choice with no constraints. Consistent with the classical account, we found that deliberation corrected intuitive mistakes: Participants believed false headlines (but not true headlines) more in initial responses than in either final responses or the unconstrained 1-response baseline. In contrast-and inconsistent with the Motivated System 2 Reasoning account-we found that political polarization was equivalent across responses. Our data suggest that, in the context of fake news, deliberation facilitates accurate belief formation and not partisan bias.
Preprint
What is the role of emotion in susceptibility to believing fake news? Prior work on the psychology of misinformation has focused primarily on the extent to which reason and deliberation hinder versus help the formation of accurate beliefs. Several studies have suggested that people who engage in more reasoning are less likely to fall for fake news. However, the role of reliance on emotion in belief in fake news remains unclear. To shed light on this issue, we explored the relationship between specific emotions and belief in fake news (Study 1; N = 409). We found that across a wide range of specific emotions, heightened emotionality was predictive of increased belief in fake (but not real) news. Then, in Study 2, we measured and manipulated reliance on emotion versus reason across four experiments (total N = 3884). We found both correlational and causal evidence that reliance on emotion increases belief in fake news: Self-reported use of emotion was positively associated with belief in fake (but not real) news, and inducing reliance on emotion resulted in greater belief in fake (but not real) news stories compared to a control or to inducing reliance on reason. These results shed light on the unique role that emotional processing may play in susceptibility to fake news.
Article
Why do people believe blatantly inaccurate news headlines ("fake news")? Do we use our reasoning abilities to convince ourselves that statements that align with our ideology are true, or does reasoning allow us to effectively differentiate fake from real regardless of political ideology? Here we test these competing accounts in two studies (total N = 3446 Mechanical Turk workers) by using the Cognitive Reflection Test (CRT) as a measure of the propensity to engage in analytical reasoning. We find that CRT performance is negatively correlated with the perceived accuracy of fake news, and positively correlated with the ability to discern fake news from real news - even for headlines that align with individuals' political ideology. Moreover, overall discernment was actually better for ideologically aligned headlines than for misaligned headlines. Finally, a headline-level analysis finds that CRT is negatively correlated with perceived accuracy of relatively implausible (primarily fake) headlines, and positively correlated with perceived accuracy of relatively plausible (primarily real) headlines. In contrast, the correlation between CRT and perceived accuracy is unrelated to how closely the headline aligns with the participant's ideology. Thus, we conclude that analytic thinking is used to assess the plausibility of headlines, regardless of whether the stories are consistent or inconsistent with one's political ideology. Our findings therefore suggest that susceptibility to fake news is driven more by lazy thinking than it is by partisan bias per se - a finding that opens potential avenues for fighting fake news.
Article
Democracies assume accurate knowledge by the populace, but the human attraction to fake and untrustworthy news poses a serious problem for healthy democratic functioning. We articulate why and how identification with political parties – known as partisanship – can bias information processing in the human brain. There is extensive evidence that people engage in motivated political reasoning, but recent research suggests that partisanship can alter memory, implicit evaluation, and even perceptual judgments. We propose an identity-based model of belief for understanding the influence of partisanship on these cognitive processes. This framework helps to explain why people place party loyalty over policy, and even over truth. Finally, we discuss strategies for de-biasing information processing to help to create a shared reality across partisan divides.
Article
Following the 2016 US presidential election, many have expressed concern about the effects of false stories ("fake news"), circulated largely through social media. We discuss the economics of fake news and present new data on its consumption prior to the election. Drawing on web browsing data, archives of fact-checking websites, and results from a new online survey, we find: 1) social media was an important but not dominant source of election news, with 14 percent of Americans calling social media their "most important" source; 2) of the known false news stories that appeared in the three months before the election, those favoring Trump were shared a total of 30 million times on Facebook, while those favoring Clinton were shared 8 million times; 3) the average American adult saw on the order of one or perhaps several fake news stories in the months around the election, with just over half of those who recalled seeing them believing them; and 4) people are much more likely to believe stories that favor their preferred candidate, especially if they have ideologically segregated social media networks.
Article
A fake news detection system aims to assist users in detecting and filtering out varieties of potentially deceptive news. The prediction of the chances that a particular news item is intentionally deceptive is based on the analysis of previously seen truthful and deceptive news. A scarcity of deceptive news, available as corpora for predictive modeling, is a major stumbling block in this field of natural language processing (NLP) and deception detection. This paper discusses three types of fake news, each in contrast to genuine serious reporting, and weighs their pros and cons as a corpus for text analytics and predictive modeling. Filtering, vetting, and verifying online information continues to be essential in library and information science (LIS), as the lines between traditional news and online information are blurring.
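As a toy illustration of the predictive-modeling approach this abstract describes, one might train a bag-of-words Naive Bayes classifier on previously seen truthful and deceptive headlines. This is a minimal sketch rather than the paper's system; the tiny corpus, whitespace tokenisation, and function names are hypothetical stand-ins.

```python
# Minimal sketch: learn word frequencies per class from labelled headlines,
# then score new text with smoothed log-probabilities.
from collections import Counter
import math

def train(corpus):
    # corpus: list of (text, label) pairs with label in {"real", "fake"}
    counts = {"real": Counter(), "fake": Counter()}
    docs = Counter()
    for text, label in corpus:
        docs[label] += 1
        counts[label].update(text.lower().split())
    return counts, docs

def predict(text, counts, docs):
    vocab = set(counts["real"]) | set(counts["fake"])
    scores = {}
    for label in ("real", "fake"):
        total = sum(counts[label].values())
        score = math.log(docs[label] / sum(docs.values()))  # class prior
        for w in text.lower().split():
            # Laplace smoothing so unseen words do not zero out the score.
            score += math.log((counts[label][w] + 1) / (total + len(vocab)))
        scores[label] = score
    return max(scores, key=scores.get)
```

The corpus-scarcity problem the paper highlights is visible even here: with only a handful of labelled deceptive examples, the per-class word counts are sparse and the classifier leans heavily on smoothing.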
Article
This article has 2 goals: to provide additional evidence that exposure to ideological online news media contributes to political misperceptions, and to test 3 forms this media-effect might take. Analyses are based on representative survey data collected during the 2012 U.S. presidential election (N = 1,004). Panel data offer persuasive evidence that biased news site use promotes inaccurate beliefs, while cross-sectional data provide insight into the nature of these effects. There is no evidence that exposure to ideological media reduces awareness of politically unfavorable evidence, though in some circumstances biased media do promote misunderstandings of it. The strongest and most consistent influence of ideological media exposure is to encourage inaccurate beliefs regardless of what consumers know of the evidence.
Book
This book explores the idea that we have two minds - one being automatic, unconscious, and fast, the other controlled, conscious, and slow. In recent years, there has been great interest in so-called dual-process theories of reasoning and rationality. According to such theories, there are two distinct systems underlying human reasoning: an evolutionarily old system that is associative, automatic, unconscious, parallel, and fast; and a more recent, distinctively human system which is rule-based, controlled, conscious, serial, and slow. Within the former, processes are held to be innate and to use heuristics that evolved to solve specific adaptive problems. In the latter, processes are taken to be learned, flexible, and responsive to rational norms. Despite the attention these theories are attracting, there is still poor communication between dual-process theorists themselves, and the substantial bodies of work on dual processes in cognitive psychology and social psychology remain isolated from each other. The book brings together researchers on dual processes to summarize the latest research, highlight key issues, present different perspectives, explore implications, and provide a stimulus to further work. It includes new ideas about the human mind both by contemporary philosophers interested in broad theoretical questions about mental architecture, and by psychologists specializing in traditionally distinct and isolated fields.
Article
Can partisan media (in particular, partisan TV news) polarize viewers? I outline a set of hypotheses to explain the conditions under which partisan media will increase attitudinal polarization. I use original experiments to test this theory, and find that like-minded messages do have a strong polarizing effect on viewers' attitudes. I also show that cross-cutting messages have, on average, little effect on attitudes, but that they can have strongly polarizing or moderating effects for voters with particular traits. I provide evidence supporting one of the primary hypothesized mechanisms, and also show that these effects endure outside of the lab. I draw on experimental techniques from biomedical studies to show how viewers' preferences for watching partisan media shape these effects. I conclude by discussing the implications of these findings for both theories of media effects and political behavior more broadly.
Article
Source: Democracy Now! JUAN GONZALEZ: When you follow your friends on Facebook or run a search on Google, what information comes up, and what gets left out? That's the subject of a new book by Eli Pariser called The Filter Bubble: What the Internet Is Hiding from You. According to Pariser, the internet is increasingly becoming an echo chamber in which websites tailor information according to the preferences they detect in each viewer. Yahoo! News tracks which articles we read. Zappos registers the type of shoes we prefer. And Netflix stores data on each movie we select. AMY GOODMAN: The top 50 websites collect an average of 64 bits of personal information each time we visit and then custom-design their sites to conform to our perceived preferences. While these websites profit from tailoring their advertisements to specific visitors, users pay a big price for living in an information bubble outside of their control. Instead of gaining wide exposure to diverse information, we're subjected to narrow online filters. Eli Pariser is the author of The Filter Bubble: What the Internet Is Hiding from You. He is also the board president and former executive director of the group MoveOn.org. Eli joins us in the New York studio right now after a whirlwind tour through the United States.
Article
The recent increase in partisan media has generated interest in whether such outlets polarize viewers. I draw on theories of motivated reasoning to explain why partisan media polarize viewers, why these programs affect some viewers much more strongly than others, and how long these effects endure. Using a series of original experiments, I find strong support for my theoretical expectations, including the argument that these effects can still be detected several days postexposure. My results demonstrate that partisan media polarize the electorate by taking relatively extreme citizens and making them even more extreme. Though only a narrow segment of the public watches partisan media programs, partisan media's effects extend much more broadly throughout the political arena.
News sharing on UK social media: misinformation, disinformation, and correction
  • A Chadwick
  • C Vaccari
Chadwick A, Vaccari C: News sharing on UK social media: misinformation, disinformation, and correction. Loughborough University, 2019.
Fake news did have a significant impact on the vote in the 2016 election: Original full-length version with methodological appendix
  • R Gunther
  • P A Beck
  • E C Nisbet
Gunther R, Beck PA, Nisbet EC: Fake news did have a significant impact on the vote in the 2016 election: Original full-length version with methodological appendix. Unpublished manuscript. Columbus, OH: The Ohio State University; 2018.
The spread of true and false news online
  • S Vosoughi
  • D Roy
  • S Aral
Vosoughi S, Roy D, Aral S: The spread of true and false news online. Science 2018, 359:1146-1151 http://dx.doi.org/10.1126/science.aap9559.