Figure 1 - uploaded by Sander van der Linden
Disinformation and control messages used for the online experiment.


Source publication
Article
Full-text available
Background Vaccination coverage needs to reach more than 80% to resolve the COVID-19 pandemic, but vaccine hesitancy, fuelled by misinformation, may jeopardize this goal. Unvaccinated older adults are not only at risk of COVID-19 complications but may also be misled by false information. Prebunking, based on inoculation theory, involves ‘forewarnin...

Contexts in source publication

Context 1
... specific messages were pretested, and the two most influential messages that decreased vaccine intention were used for the online experiment. Message A targeted mRNA vaccine safety, while Message B concerned the fast approval of a COVID-19 vaccine by a federal agency (Figure 1). The third message, which was not used, alleged that health care professionals and older people are guinea pigs for the COVID-19 vaccine. ...
Context 2
... decided to dichotomize the outcome to avoid sparsity: because most responses favor vaccination (more than 80% of individuals are somewhat to very likely to get the vaccine), we have enough information per cell. Dichotomizing the outcome was favored over a mixed ANOVA because the distribution of the outcome variable is heavily skewed to the left (see Figure 1 in the supplementary file), which potentially violates the ANOVA normality assumption. Nonetheless, as a robustness check, we also conducted ANOVA tests of difference-in-difference scores (the pre-post score in the treatment condition minus the pre-post score in the control condition). ...
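The dichotomization and difference-in-difference scoring described in this context can be sketched as follows. The data, the Likert cutoff of 5 ("somewhat likely" and above), and the function names are illustrative assumptions, not the study's actual materials:

```python
# Hypothetical sketch of dichotomizing a skewed Likert outcome and computing
# a difference-in-difference score; all values below are invented.
import statistics

# Vaccine-intention scores on a 1-7 Likert scale, pre- and post-intervention.
pre = {"treatment": [6, 7, 5, 2, 7, 6], "control": [6, 6, 7, 3, 5, 7]}
post = {"treatment": [7, 7, 6, 3, 7, 7], "control": [6, 5, 7, 2, 5, 6]}

def dichotomize(scores, cutoff=5):
    """Collapse Likert responses into likely (1) vs. unlikely (0) to vaccinate."""
    return [1 if s >= cutoff else 0 for s in scores]

def diff_in_diff(pre, post):
    """Mean pre-post change in the treatment group minus that of the control group."""
    delta = lambda g: statistics.mean(b - a for a, b in zip(pre[g], post[g]))
    return delta("treatment") - delta("control")

print(dichotomize(pre["treatment"]))          # [1, 1, 1, 0, 1, 1]
print(round(diff_in_diff(pre, post), 3))
```

A positive difference-in-difference score indicates the treatment group's intention rose more (or fell less) than the control group's, which is the quantity the robustness-check ANOVA would test.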

Citations

... Several studies have found pre-bunking effectively lessens misinformation's effects in experimental settings (Hameleers, 2024). For instance, researchers found that pre-bunking successfully countered COVID-19 vaccine misinformation among Canadians over the age of 50 (Vivion et al., 2022). One potential drawback is that the healthy skepticism honed through pre-bunking messages may lead to a deeper skepticism of credible information (Hameleers, 2024). ...
... Strategies to fight disinformation that were effective, with certain limitations, include fact-checking and debunking (Arcos et al., 2022; Chan et al., 2017), inoculation (Lewandowsky & Linden, 2021; Vivion et al., 2022), and forewarning, which aim to expose and disprove misleading content (Arcos et al., 2022). Given the constantly evolving social media landscape, previous studies have highlighted the role of educational actions in countering the disinformation phenomenon (Nygren & Guath, 2022). ...
Article
There is an ongoing debate among scholars on how to tackle disinformation. Media education initiatives to increase literacy are effective ways to counter disinformation. Hence, the European Commission (2022) published Guidelines for Teachers and Educators on Tackling Disinformation and Promoting Digital Literacy Through Education and Training. The present research looked at the role of social media literacy in increasing awareness of the role of social media in spreading disinformation. We developed an educational intervention based on the European Commission guidelines. We investigated its impact on perceived social media literacy, the intention to share fake news on social media, and general conspiracy beliefs. We conducted a within-subject experiment (two measurements: before the educational intervention and one week after) with N = 127 young adults (aged 18 to 23). After filling in an initial survey, the experimental group received a 15-minute educational intervention on the role of social media in disinformation dissemination in complex digital information environments. One week later, all participants completed the second survey to assess perceived social media literacy and general conspiracy beliefs. In both surveys, participants saw three Instagram posts from a fictitious media outlet and expressed their potential intention to share them on social media. Among the three posts, two showed false information. Findings showed that the educational intervention produced a significant increase in perceived social media literacy and a decrease in general conspiracy beliefs. Intellectual humility moderated the impact of the educational intervention on algorithmic awareness.
... Both prebunking and debunking interventions have been found to be effective in reducing the threat of misinformation [11,13,14,17,21-26]. This paper addresses four main gaps in the literature, with four corresponding research questions. ...
Article
Misinformation surrounding crises poses a significant challenge for public institutions. Understanding the relative effectiveness of different types of interventions to counter misinformation, and which segments of the population are most and least receptive to them, is crucial. We conducted a preregistered online experiment involving 5228 participants from Germany, Greece, Ireland, and Poland. Participants were exposed to misinformation on climate change or COVID-19. In addition, they were pre-emptively exposed to a prebunk, warning them of commonly used misleading strategies, before encountering the misinformation, or were exposed to a debunking intervention afterwards. The source of the intervention (i.e. the European Commission) was either revealed or not. The findings show that both interventions change four variables reflecting vulnerability to misinformation in the expected direction in almost all cases, with debunks being slightly more effective than prebunks. Revealing the source of the interventions did not significantly impact their overall effectiveness. One case of undesirable effect heterogeneity was observed: debunks with revealed sources were less effective in decreasing the credibility of misinformation for people with low levels of trust in the European Union (as elicited in a post-experimental questionnaire). While our results mostly suggest that the European Commission, and possibly other public institutions, can confidently debunk and prebunk misinformation regardless of the trust level of the recipients, further evidence on this is needed.
... Persuasive information can spread quickly from person to person, replicating and evolving to 'infect' as many people as possible [11]. The natural solution to a rapidly spreading virus is a vaccine, and as such inoculation theory has been used to successfully build resistance against climate change misinformation [12,13], anti-vaccine misinformation [14] and political misinformation [15]. ...
Article
Inoculation theory research offers a promising psychological ‘vaccination’ against misinformation. But are people willing to take it? Expanding on the inoculation metaphor, we introduce the concept of ‘inoculation hesitancy’ as a framework for exploring reluctance to engage with misinformation interventions. Study 1 investigated whether individuals feel a need for misinformation inoculations. In a comparative self-evaluation, participants assessed their own experiences with misinformation and expectations of inoculation and compared them to those of the average person. Results exposed a better-than-average effect. While participants were concerned over the problem of misinformation, they estimated that they were less likely to be exposed to it and more skilful at detecting it than the average person. Their self-described likelihood of engaging with inoculation was moderate, and they believed other people would benefit more from being inoculated. In Study 2, participants evaluated their inclination to watch inoculation videos from sources varying in trustworthiness and political affiliation. Results suggest that participants are significantly less willing to accept inoculations from low-trust sources and less likely to accept inoculations from partisan sources that are antithetical to their own political beliefs. Overall, this research identifies motivational obstacles in reaching herd immunity with inoculation theory, guiding future development of inoculation interventions.
... Other researchers have found that critical appraisal is often absent when vaccine-hesitant individuals share "scientific evidence" on the web, which often includes citations that blur the line between legitimate scientific publications and fraudulent studies [98]. However, there is little evidence of communication across networks, despite repeated calls from public health communication experts to prebunk and debunk vaccine misinformation on the web [99,100]. Notably, both communities share a retracted paper, but their framing of the paper varies. ...
Article
Background Attitudes toward the human papillomavirus (HPV) vaccine and accuracy of information shared about this topic in web-based settings vary widely. As real-time, global exposure to web-based discourse about HPV immunization shapes the attitudes of people toward vaccination, the spread of misinformation and misrepresentation of scientific knowledge contribute to vaccine hesitancy. Objective In this study, we aimed to better understand the type and quality of scientific research shared on Twitter (recently rebranded as X) by vaccine-hesitant and vaccine-confident communities. Methods To analyze the use of scientific research on social media, we collected tweets and retweets using a list of keywords associated with HPV and HPV vaccines using the Academic Research Product Track application programming interface from January 2019 to May 2021. From this data set, we identified tweets referring to or sharing scientific literature through a Boolean search for any tweets with embedded links, hashtags, or keywords associated with scientific papers. First, we used social network analysis to build a retweet or reply network to identify the clusters of users belonging to either the vaccine-confident or vaccine-hesitant communities. Second, we thematically assessed all shared papers based on typology of evidence. Finally, we compared the quality of research evidence and bibliometrics between the shared papers in the vaccine-confident and vaccine-hesitant communities. Results We extracted 250 unique scientific papers (including peer-reviewed papers, preprints, and gray literature) from approximately 1 million English-language tweets. Social network maps were generated for the vaccine-confident and vaccine-hesitant communities sharing scientific research on Twitter. 
Vaccine-hesitant communities share fewer scientific papers; yet, these are more broadly disseminated despite being published in less prestigious journals compared to those shared by the vaccine-confident community. Conclusions Vaccine-hesitant communities have adopted communication tools traditionally wielded by health promotion communities. Vaccine-confident communities would benefit from a more cohesive communication strategy to communicate their messages more widely and effectively.
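The Boolean screening step described in the Methods above (keeping tweets with embedded links, hashtags, or keywords associated with scientific papers) might look roughly like this. The marker list and sample tweets are invented for illustration and are not the study's actual criteria:

```python
# Hypothetical sketch of screening tweets for references to scientific papers.
# PAPER_MARKERS is an assumed keyword list, not the study's real search terms.
import re

PAPER_MARKERS = ("doi.org", "pubmed", "preprint", "peer-reviewed", "#study")

def mentions_scientific_paper(tweet: str) -> bool:
    """Keep a tweet if it embeds a link or mentions a paper-related marker."""
    text = tweet.lower()
    has_link = re.search(r"https?://\S+", text) is not None
    return has_link or any(marker in text for marker in PAPER_MARKERS)

tweets = [
    "HPV vaccine is safe, see https://doi.org/10.1000/example",
    "Not getting my kid vaccinated, period.",
    "New preprint claims otherwise #study",
]
shared_papers = [t for t in tweets if mentions_scientific_paper(t)]
```

In the study itself, the retained tweets would then feed the retweet/reply network construction and the typology-of-evidence coding.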
... Much information about COVID-19 has also been posted as fake news, which are "false stories that appear to be news, spread on the internet or using other media, usually created to influence political views or as a joke" and without proper medical evidence to support the claims they make [19]. The best way to deal with misinformation and fake news is to use pre-bunking and debunking techniques: the former to warn people about what they might be getting as information, and the latter to reassure people that they have responded correctly to a false claim or fact [20,21]. It is not an easy task to debunk fake news or misinformation once it has been established, as Klein (2023) discusses using the MMR-autism relationship as an example: "The persistent appeal of the vaccine-autism myth, no matter how many times and in how many ways it has been debunked, is that it gives parents who see difference as tragedy something external to blame." [16, p. 200] ...
... Zoran Milosavljević ... as information, and the latter to reassure people that they have responded correctly to a false claim or fact [20,21]. It is not an easy task to debunk fake news or misinformation once they have taken firm root, as Klein (2023) discusses through the example of the MMR-autism link: ...
Article
This article problematises similarities in social responses to two different types of prevention: antiretrovirals in the form of pre-exposure prophylaxis (PrEP) against HIV transmission, and the emerging COVID-19 vaccines against the SARS-CoV-2 virus. For the purpose of this article, I have revisited the work of Mary Douglas, the British social anthropologist (1921-2007), on risk and social responses to risk. In the late 1980s, Mary Douglas described patterns and modalities of social response to risk in emerging epidemics. The same pattern of social dynamics and response can be followed in relation to two pandemics of the 21st century: first, the HIV pandemic, which started in the early 1980s and in which a prevention breakthrough occurred in 2012 with the introduction of pre-exposure prophylaxis (PrEP); and second, the COVID-19 pandemic, which started in early 2020, with newly developed vaccines in 2021 as the public health response to it.
... [66,69] An example of argument-based prebunking is delivering messages that caution people about anticipated false claims regarding the safety of mRNA vaccines or the rapid approval of COVID-19 vaccines, which are based on actual instances of misinformation found on social media [79]. Tertiary prevention: acute responses to minimise negative effects of infodemics. Tertiary prevention in public health and preventive medicine involves preventing disease progression or alleviating symptoms. Similarly, in our proposed framework, tertiary prevention aims to reduce the negative effect of infodemics on public health. ...
Article
The COVID-19 pandemic has highlighted how infodemics (defined as an overabundance of information, including misinformation and disinformation) pose a threat to public health and could hinder individuals from making informed health decisions. Although public health authorities and other stakeholders have implemented measures for managing infodemics, existing frameworks for infodemic management have been primarily focused on responding to acute health emergencies rather than integrated in routine service delivery. We review the evidence and propose a framework for infodemic management that encompasses upstream strategies and provides guidance on identifying different interventions, informed by the four levels of prevention in public health: primary, secondary, tertiary, and primordial prevention. On the basis of a narrative review of 54 documents (peer-reviewed and grey literature published from 1961 to 2023), we present examples of interventions that belong to each level of prevention. Adopting this framework requires proactive prevention and response through managing information ecosystems, beyond reacting to misinformation or disinformation.
... Both prebunking and debunking interventions have been found to be effective in reducing the threat of misinformation [11,13,14,17,21-26]. This paper addresses four main gaps in the literature, with four corresponding research questions. ...
Preprint
Misinformation surrounding crises poses a significant challenge for public institutions. Understanding the relative effectiveness of different types of interventions to counter misinformation, and which segments of the population are most or least receptive to them, is crucial. We conducted a preregistered online experiment involving 5,228 participants from Germany, Greece, Ireland, and Poland. Participants were exposed to misinformation on climate change or COVID-19. In addition, they were pre-emptively exposed to a prebunk, warning them of commonly used misleading strategies, before encountering the misinformation, or were exposed to a debunking intervention afterward. The source of the intervention (i.e. the European Commission) was either revealed or not. Findings show that both interventions effectively changed the four outcome variables in the desired direction in almost all cases, with debunks sometimes being more effective than prebunks. Moreover, revealing the source of the interventions did not significantly impact their overall effectiveness. Although one case of undesirable effect heterogeneity was observed (debunks with a revealed source were less effective in decreasing the credibility of misinformation for people with low trust in the European Union), the results mostly suggest that the European Commission, and possibly other institutions, can confidently debunk and prebunk misinformation regardless of the trust level of its recipients.
... Since the World Health Organization declared COVID-19 a global pandemic on March 11, 2020 [1], inconsistent public health messaging has enhanced the appeal of antivaccine and vaccine-hesitancy movements on social media [2]. There is already a substantial body of research in the health and social sciences documenting how organized misinformation and disinformation can influence the public's perception of scientific knowledge and public policy [3-6]. It is well established that disinformation led to the overhyped promotion of chloroquine and hydroxychloroquine (drugs used to treat lupus and malaria) as treatments for COVID-19, resulting in market shortages [5]. ...
Article
Background: The popularity of the magnetic vaccine conspiracy theory and other conspiracy theories of a similar nature creates challenges to promoting vaccines and disseminating accurate health information. Objective: Health conspiracy theories are gaining in popularity. This study's objective was to evaluate the Twitter social media network related to the magnetic vaccine conspiracy theory and apply social capital theory to analyze the unique social structures of influential users. As a strategy for web-based public health surveillance, we conducted a social network analysis to identify the important opinion leaders sharing the conspiracy, the key websites, and the narratives. Methods: A total of 18,706 tweets were retrieved and analyzed by using social network analysis. Data were retrieved from June 1 to June 13, 2021, using the keyword vaccine magnetic. Tweets were retrieved via a dedicated Twitter application programming interface. More specifically, the Academic Track application programming interface was used, and the data were analyzed by using NodeXL Pro (Social Media Research Foundation) and Gephi. Results: There were a total of 22,762 connections between Twitter users within the data set. This study found that the most influential user within the network consisted of a news account that was reporting on the magnetic vaccine conspiracy. There were also several other users that became influential, such as an epidemiologist, a health economist, and a retired sports athlete who exerted their social capital within the network. Conclusions: Our study found that influential users were effective broadcasters against the conspiracy, and their reach extended beyond their own networks of Twitter followers. We emphasize the need for trust in influential users with regard to health information, particularly in the context of the widespread social uncertainty resulting from the COVID-19 pandemic, when public sentiment on social media may be unpredictable. 
This study highlights the potential of influential users to disrupt information flows of conspiracy theories via their unique social capital.
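A minimal sketch of the kind of degree-based influence measure used in such a social network analysis: here, in-degree on a retweet network as a simple proxy for an influential user. The edge list is invented for illustration, and real analyses (e.g. in NodeXL Pro or Gephi) use richer centrality metrics:

```python
# Hypothetical sketch: find the most-retweeted account in a retweet network.
# An edge (a, b) means "user a retweeted user b"; all accounts are made up.
from collections import Counter

retweets = [
    ("u1", "news_acct"), ("u2", "news_acct"), ("u3", "news_acct"),
    ("u1", "epidemiologist"), ("u4", "epidemiologist"),
    ("u5", "athlete"),
]

# In-degree = number of times a user was retweeted; a simple influence proxy.
in_degree = Counter(target for _, target in retweets)
most_influential, count = in_degree.most_common(1)[0]
print(most_influential, count)  # news_acct 3
```

On the study's real data, the top in-degree node was the news account reporting on the magnetic vaccine conspiracy, with other high-degree users (an epidemiologist, a health economist, a retired athlete) exerting their social capital within the network.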
... The "backfire effect" (i.e., a factual counterargument entrenches false beliefs) was found when using debunking messages to address misinformation [117,118]. Therefore, pre-debunking messages (inoculation messages) delivered before public message communication should be piloted [73,119]. ...
... One experiment further showed that partisans exposed to ingroup media (media that held the same political preference as the participants) perceived debunking messages as more credible and showed higher engagement [125]. For inoculation messages, two experiments found that a simple inoculation message or video could protect people from misinformation [119,127], while the remaining one found no significant effect on its own, though the message was useful when combined with viewing or writing comments on it [128]. In terms of warning tags and covers, interstitial warnings or cover warnings, which require individuals to click through to continue, were found to be more effective in helping participants identify misinformation, while warning tags showed no effect [104]. ...
Article
Background: During the COVID-19 pandemic, the infodemic spread even more rapidly than the pandemic itself. COVID-19 vaccine hesitancy has been prevalent worldwide and has hindered strategies for exiting the pandemic. Misinformation around COVID-19 vaccines is a vital contributor to vaccine hesitancy. However, no study has systematically summarized COVID-19 vaccine misinformation. Objective: To synthesize the global evidence on misinformation related to COVID-19 vaccines, including its prevalence, features, influencing factors, impacts, and solutions for combating misinformation. Methods: We performed a systematic review by searching five peer-reviewed databases (PubMed, EMBASE, Web of Science, Scopus, and EBSCO). We included original articles that investigated misinformation related to COVID-19 vaccines and were published in English from January 1, 2020, to August 18, 2022. We excluded publications that did not cover or focus on COVID-19 vaccine misinformation. The Appraisal Tool for Cross-Sectional Studies, the Cochrane RoB 2.0 tool, and the Critical Appraisal Skills Programme checklist were used to assess study quality. The review was guided by the Preferred Reporting Items for Systematic Reviews and Meta-Analyses and registered with PROSPERO (CRD42021288929). Results: Of 8864 studies identified, 91 observational studies and 11 interventional studies met the inclusion criteria. Misinformation around COVID-19 vaccines covered conspiracy, concerns about vaccine safety and efficacy, no need for a vaccine, morality, liberty, and humor. Conspiracy and safety concerns were the most prevalent types of misinformation. Misinformation prevalence varied greatly: 2.5%-55.4% in the general population and 6.0%-96.7% in antivaccine or vaccine-hesitant groups in survey-based studies, and 0.1%-41.3% in general online data and 0.5%-56% in antivaccine or vaccine-hesitant data in internet-based studies. Younger age, lower education and economic status, right-wing and conservative ideology, and having psychological problems enhanced beliefs in misinformation. The content, format, and source of misinformation influenced its spread. A five-step framework was proposed to address vaccine-related misinformation: identifying misinformation, regulating producers and distributors, cutting production and distribution, supporting target audiences, and disseminating trustworthy information. Debunking messages and videos were found to be effective in several experimental studies. Conclusions: Our review provides comprehensive and up-to-date evidence on COVID-19 vaccine misinformation and can help responses to vaccine infodemics in future pandemics.