The disconcerting potential of Russia’s trolls: Persuasive effects of astroturfing
comments and three strategies for inoculation against them
Thomas Zerback (corresponding author)
University of Zurich, Andreasstrasse 15, 8050 Zürich,
t.zerback@ikmz.uzh.ch
Florian Töpfl
Free University of Berlin, Garystraße 55, 14195 Berlin,
f.toepfl@fu-berlin.de
Maria Knöpfle
Ludwig-Maximilians-University Munich, Oettingenstraße 67, 81538 Munich,
M.Knoepfle@campus.lmu.de
Author Note
Thomas Zerback, Ph.D. is Assistant Professor of Political Communication at the
Department of Communication and Media Research at the University of Zurich, Switzerland.
Florian Töpfl, Ph.D. is an Emmy Noether Junior Research Group leader at the Institute
for Media and Communication Studies at the Free University of Berlin, Germany.
Maria Knöpfle is a student assistant at the Department of Media and
Communication at the Ludwig-Maximilians-University Munich, Germany.
Abstract
This study is the first to scrutinize the psychological effects of online astroturfing in the
context of Russia’s digitally-enabled foreign propaganda. Online astroturfing is a
communicative strategy that uses websites, “sock puppets,” or social bots to create the false
impression that a particular opinion has widespread public support. We exposed N = 2,353
subjects to pro-Russian astroturfing comments and tested: (1) the comments’ effects on
political opinions and opinion certainty, and (2) the effectiveness of three inoculation
strategies to prevent these effects. All effects were investigated across three issues as well as
from a short- and long-term perspective. Results show that astroturfing comments can indeed
alter recipients’ opinions and increase uncertainty, even when recipients are inoculated before
exposure. We found only one inoculation strategy (refutational-same) to be effective.
Consequences for future inoculation research and practical applications are discussed.
Keywords: disinformation, misinformation, Russia, state propaganda, online astroturfing,
opinion certainty, uncertainty, countermeasures, inoculation
The disconcerting potential of Russia’s trolls: Persuasive effects of astroturfing
comments and three strategies for inoculation against them
Particularly in the aftermath of the 2016 US presidential election, disinformation and its
consequences for democratic societies have been subject to extensive political (European
Commission, 2018) and scholarly debate (e.g., Bennett and Livingston, 2018). At the most
abstract level, disinformation can be understood as “[i]naccurate or manipulated information /
content that is spread intentionally. This can include false news, or it can involve more subtle
methods such as false flag operations, feeding inaccurate quotes or stories to innocent
intermediaries, or knowingly amplifying biased or misleading information” (Weedon et al.,
2017: 5). Disinformation is thus also a form of persuasive communication (Zhang et al., 2013). In
this paper we deal with an important and widespread subtype of disinformation, known as
“astroturfing” (Kovic et al., 2018; Zhang et al., 2013). Astroturfing can be defined as the
“manipulative use of media and other political techniques to create the perception of a
grassroots community organization where none exists for the purpose of political gain”
(McNutt and Boland, 2007: 169). Although the phenomenon itself is not new (e.g., Lyon and
Maxwell, 2004), the Internet and especially social media have paved the way for new forms,
often referred to as digital or online astroturfing (Kovic et al., 2018; Zhang et al., 2013).
A central strategic instrument of online astroturfing is the manufacturing of user
comments designed to appear as authentic citizen voices on highly visible news or social
networking sites (SNS). We focus here on this specific form of online astroturfing because it
has been one of the most widely debated in the context of recent national elections across the
Western world (Ferrara, 2017; Kovic et al., 2018; Zelenkauskaite and Balduccini, 2017).
Examples of targeted campaigns include the 2016 presidential election in the US
(Bessi and Ferrara, 2016; Woolley and Guilbeault, 2017), the 2017 presidential election in
France (Ferrara, 2017), and the 2012 presidential election in South Korea (Keller et al.,
2017). Academic studies, investigative news articles, and think-tank reports have pointed to
Russia’s ruling elites as a key sponsor of these astroturfing activities (see, for example,
Bugorkova, 2015; Zelenkauskaite and Balduccini, 2017). These elites are closely tied to an
organization operating under the name Internet Research Agency (IRA). In 2013 this entity,
also referred to as Russia’s “troll factory”, employed approximately 600 people and had an
estimated annual budget of US$ 10 million (Bugorkova, 2015). Among other activities, the
so-called Russian trolls targeted foreign audiences by setting up fake SNS accounts (known
as “sock puppets”) mimicking grassroots support for Russian policies on a range of news and social-
media platforms (Kovic et al., 2018).
Among Western political leaders, these digitally enabled propaganda efforts have
sparked not only concern but explicit indignation (European Commission, 2018). In the
academic realm, they have stimulated a fast-growing body of research on the phenomenon.
So far, however, this research has focused almost exclusively on the detection of
manufactured comments—that is, the question of how to identify sock-puppet accounts or
automated social bots (Keller et al., 2017; King et al., 2017). By contrast, we still know very
little about the psychological effects that such manufactured user commenting has on media
audiences, and even less about possible ways of forestalling these effects. Against this
background, our study advances existing research in three ways:
(1) We examine whether online astroturfing comments affect the political opinions and
opinion-certainty of those exposed to them.
(2) We investigate whether these persuasive effects can be mitigated, or even prevented, by
the use of inoculation messages designed to educate the audience about the manipulative
intent and argumentative tactics of the astroturfing actors.
(3) We analyze the duration of the inoculation’s immunizing effects.
Our study is based on a three-wave experiment carried out over the course of four weeks.
2,353 participants were exposed to typical Russian online astroturfing comments posted
beneath social media news items in order to determine their persuasive effects. In addition,
we tested the effectiveness of three different inoculation treatments in countering these effects
both in the short run and in the long run. All stimulus messages were administered in the
context of three different issues prone to Russian astroturfing activities: the poisoning of
former Russian intelligence officer Sergei Skripal, the manipulation of the 2016 US
presidential election, and the use of toxic gas by a close Russian ally, the Syrian government.
The effects of astroturfing comments on an audience
Online astroturfing comments imitate ordinary citizens’ voices in order to create the
impression that a certain opinion has widespread public support, while the real agent behind
the message conceals their identity (Zhang et al., 2013). Astroturfing comments are almost
impossible to distinguish from authentic user comments; hence audiences are either
completely unaware that a comment might be sponsored by a principal, or they may suspect
such an influence but cannot be entirely sure about it. Given the comments’ authentic
appearance and this lack of knowledge, and/or uncertainty, on
the part of audiences, astroturfing comments carry the potential to influence the opinions of
those who read them.
An answer to the question of how astroturfing comments can alter personal opinions is
provided by exemplification research, which has investigated the effects of ordinary citizen
depictions in the media (also known as “exemplars”) (Zillmann, 1999). Exemplars possess
several characteristics contributing to their persuasive potential: firstly, as personalized
information they attract the audience’s attention, making persuasive effects more likely in the
first place (Taylor and Thompson, 1982). Secondly, the opinion voiced by an exemplar
becomes cognitively available and more accessible in the recipients’ memories (Zillmann,
1999), and highly accessible information has a greater chance of influencing subsequent
judgments (Domke et al., 1998). Finally, fellow citizens are often considered to be more
trustworthy and more similar to ourselves by comparison with other actors present in the
media, such as, for example, politicians (Lefevere et al., 2012). Trustworthiness and
similarity have both been shown to be strong facilitators of persuasive effects (Hovland et al.,
1953).
Although, from a theoretical point of view, depictions of citizens hold a great
potential to influence the opinions of those confronted with them, empirical evidence on their
persuasive potential is rather mixed. Whereas some researchers have observed opinion
changes resulting from exemplar exposure both in traditional (e.g., Daschmann, 2000) and in
online media (e.g., Sikorski, 2018), others could not find such effects (e.g., Zerback and
Peter, 2018). This leads to the question of why online astroturfing comments, in particular,
should exert a persuasive effect. The answer lies in the way they are composed: in many
cases, astroturfing comments do not merely consist of an opinion, but also include arguments
that support the position advocated. An analysis by the EU vs. Disinformation project (2019)
found that, particularly in the case of Russian propaganda, the most common strategy
employed was to offer alternative explanations for negative events of which Russia was
being publicly accused. These pro-Russian astroturfing messages deny Russian
responsibility, present other potential culprits, or portray Russia as the victim of widespread
and unfounded Russophobia or public persecution (see also Nimmo, 2015). Persuasion
research has repeatedly shown that arguments included in a message increase its persuasive
impact (Petty and Cacioppo, 1984), which should also be the case for astroturfing comments.
So far, only two studies have provided insights into the effects of astroturfing
activities on audience attitudes. In both cases, however, the researchers used not online
comments but other types of astroturfing information. In an experiment, Cho, Martens, Kim,
and Rodrigue (2011) showed that people who were exposed to astroturf websites became
more uncertain, as compared with those who saw real grassroots websites, about the causes of
global warming and humans’ role in the phenomenon. Interestingly, these effects occurred
despite the fact that participants had (correctly) perceived the information from the
astroturfing websites to be less credible and the organization less trustworthy. In another
study, Pfau, Haigh, Sims, and Wigley (2007) investigated the effects of corporate front-group
stealth campaigns. Very similarly to astroturfing activities, these groups disseminate
persuasive messages while masking their true identity and interests. After they were
confronted with the disguised corporate messages, the opinions of those initially favoring
restrictive federal efforts on different issues were significantly eroded. Given the theoretical
and empirical evidence, we assume that pro-Russian online comments will influence the
opinions of those who read them.
H1 Exposing individuals to pro-Russian astroturfing comments will change their opinions
in the direction of the comments.
The effects of astroturfing comments on opinion certainty
Whereas an attitude or opinion represents a person’s evaluation of an object, situation, or
person, attitude or opinion certainty refers to the conviction about the attitude or the extent to
which one is confident in it (Gross et al., 1995). Certainty is an important dimension of an
attitude or opinion, because it influences its stability, durability, and behavioral impact. There
are several theoretical reasons why astroturfing comments can be expected to influence
opinion certainty. Firstly, research has shown that opinion certainty can be altered by
messages contradicting an existing opinion, because these decrease the structural consistency
of the underlying beliefs or knowledge. Hence, information with contradictory evaluative
implications (e.g., messages that contradict the overall evaluation of an object) should
decrease opinion certainty (Smith et al., 2008). Secondly, opinion certainty is influenced by
the subjective ease with which opinion-relevant information comes into an individual’s mind.
If information supporting the opinion is easily cognitively retrieved (e.g., because the
individual has recently been exposed to it), the information is deemed more valid and thus
fosters opinion certainty (Tormala et al., 2002). Conversely, easily retrieved counter-
attitudinal information—as provided by astroturfing comments—should decrease opinion
certainty. Finally, and especially important for the case of astroturfing, is the fact that people
hold opinions with greater certainty when they perceive social consensus for them (e.g.,
Visser and Mirabile, 2004). As other studies have shown, online user comments can serve as
indicators of such a consensus (Zerback and Fawzi, 2017).
Although creating uncertainty among people in democratic societies is considered a
central goal of political astroturfing in the context of elections (Zhang et al., 2013), only the
previously mentioned study by Cho and colleagues (2011) and one further study by Kang and
colleagues (2016), which replicated the former’s examination of uncertainty, have
investigated such effects. Both show that individuals who were exposed to astroturfing
websites on global warming became more uncertain regarding the causes of climate change
and the role played by humans in this context. On the basis of the theoretical work and
empirical studies described, we assume that counter-attitudinal astroturfing comments will
decrease individual opinion certainty.
H2 Exposing individuals to pro-Russian astroturfing comments will decrease opinion
certainty.
Inoculation as a countermeasure to the effects of astroturfing comments
Given the supposed effects of astroturfing comments, the question arises as to what can be
done to neutralize these. One effective way of inhibiting or even preventing the impact of
persuasive attacks is to inoculate people against them (see Compton and Pfau, 2005).
Inoculation theory explains this process by reference to a biological analogy (McGuire,
1964): resistance to future persuasive messages can be increased by administering a
weakened version of the “virus” to the individual—in this case, the impending persuasive
message. An effective inoculation procedure consists of two core elements: threat and
refutational preemption (see Compton, 2012 for an overview). Threat means that the
individual receives a warning about a pending persuasive attack that will challenge their
existing attitudes. Following this warning, the person is provided with information intended
to strengthen the existing individual attitude in the face of the attack. This second element is
termed “refutational preemption,” and exists in two common variants: refutational-same
preemptions raise and refute exactly the same arguments as used in the subsequent attack
message, whereas refutational-different preemptions include arguments that are not part of
the subsequent attack. Empirical studies have shown that both preemption types can increase
resistance to attack messages (Banas and Rains, 2010; McGuire, 1964).
Despite the promising potential of the inoculation approach, to our knowledge no
study to date has investigated the effectiveness of inoculation treatments in the context of
astroturfing campaigns, although leading scholars in the field have emphasized its benefits
and suitability as a countermeasure to contemporary forms of disinformation (van der Linden
et al., 2017). While some researchers have tested the effectiveness of inoculation strategies in
the context of mis- or disinformation, their studies do not deal with astroturfing campaigns or
state-induced propaganda in general, but rather with conspiracy theories (Banas and Miller,
2013), media reports on climate change (Cook et al., 2017), and front-group stealth
campaigns (Pfau et al., 2007). Nevertheless, all these studies confirm the effectiveness of
preemptive inoculation measures in hampering the effects of persuasive messages on
personal opinions.
Whereas the works described above investigated inoculation to prevent opinion
change, Tormala and Petty (2002) offer an additional perspective that also allows us to derive
theoretical assumptions with regard to opinion certainty. They argue that the mere subjective
experience of resisting a persuasive attack can increase certainty, but only when the attack is
perceived to be strong. Although the authors clearly point out the differences between the
original inoculation approach and their theoretical conception, they state: “As long as
resistance does occur, the stronger the attack is perceived to be, the stronger the predicted
effects [on certainty] will be” (p. 1300). Because an inoculation message empowers people to
resist a subsequent persuasive attack, we expect a higher level of opinion certainty in those
who receive an inoculation treatment as compared with those who do not. This assumption
has also been confirmed by empirical studies showing that attitude certainty increased after
participants were inoculated against persuasive messages (Compton and Pfau, 2004; Pfau et
al., 2004). Therefore, we assume the following:
H3 Administering an inoculation treatment prior to an astroturfing comment will inhibit
the assumed persuasive effects on opinion change (H3a) and opinion certainty (H3b).
Durability of inoculation effects
One of the most challenging questions in the context of inoculation is how long it provides
protection from subsequent attack messages. McGuire (1964) assumes that some time must
pass between the inoculation treatment and the persuasive attack in order to strengthen
resistance. However, due to a declining motivation over time to defend one’s opinion, wear-
out effects could also occur, decreasing resistance in the long run (Insko, 1967). The co-
occurrence of both processes has led researchers to assume that the effectiveness of an
inoculation treatment follows an inversely U-shaped curve, which brings up the question of
the ideal time interval between inoculation and attack (Compton and Pfau, 2005). Empirical
studies have used varying time intervals, ranging from attack messages immediately
following the inoculation treatment to intervals of several months. In their extensive meta-
analysis of inoculation studies, Banas and Rains (2010) found some support for a declining
immunizing effect when they compared short (immediate attack message), moderate (attack
message after 1–13 days), and long (attack message after 14 days or later) intervals. However,
the decline was not significant. In his literature review, Compton (2012) found some
indication of a drop in resistance after a two-week period. Hence, we propose the following
research question:
RQ1 Will inoculation effects on opinion change (H3a) and opinion certainty (H3b) still
exist after a two-week delay between inoculation and the astroturfing comments?
Method
The following analyses are based on a three-wave online experiment employing a 3 (issue) x
5 (inoculation) x 2 (delay between inoculation and attack message) between-subjects design.
Participants were recruited via a commercial online access panel (Consumer Fieldwork
GmbH) in September 2018 and randomly assigned to one of the experimental conditions.
A total of 2,353 subjects took part in all three waves of the experiment.1 Their mean age was
48.8 years (SD = 15.2); 44.4% held the highest German high-school degree (Abitur), and
49.9% were female.
Stimulus and procedure
Because online astroturfing comments often occur in the context of professional journalistic
content (Kovic et al., 2018), our experimental stimulus consisted of a short, fictitious
Facebook news teaser ostensibly from the largest German television newscast, Tagesschau
(see Online Supplementary File for example). To assess the generalizability of the results,
three identical teasers were produced, differing only in the issue they dealt with. Two of these
issues (the murder attempt on Sergei Skripal and the manipulation of the 2016 US
presidential election) related to direct Russian involvement, accusing the Russian government
of being responsible for the events concerned. The third issue (the use of toxic gas in Syria)
involved the Syrian government—a close ally of Russia—as a responsible actor. Each teaser
consisted of a picture illustrating the issue, a short headline and a caption, both depicting
either the Russian (issues one and two) or the Syrian government (issue three) as responsible
for the event.
Furthermore, each teaser was accompanied by two user comments representing
typical astroturfing attack messages. In constructing the astroturfing messages, we closely
followed the analysis offered by the EU vs. Disinformation initiative, which identified the
most prevalent argumentative figures used by Russian propagandists (EU vs. Disinformation,
2019). More specifically, the comments presented to the subjects all expressed doubt
regarding a Russian/Syrian involvement in the event by bringing up arguments supporting
this position and offering alternative explanations (e.g., “So the guy [Skripal] was a proven
double agent and had connections to the mafia. There were a lot of other people who wanted
to kill him”). To make sure that the strength of the arguments did not differ between the three
issues, because such differences would jeopardize the interpretation of potential effects, all
comments were pre-tested by N = 20 subjects who were not part of the final study. The pretest results
indicated that all arguments offered in the astroturfing comments were perceived as
moderately strong, with no significant differences between the issue conditions (see Table 1
in Online Supplementary File).
[FIGURE 1]
The experiment was carried out in three waves, covering a period of four weeks
(Figure 1). In wave one, we measured participants’ prior opinions and opinion certainty for
all three issues and collected socio-demographic information. To avoid raising suspicion
regarding the true goal of these questions, the first wave took place two weeks before the
actual stimulus presentation. In addition, all issue-specific questions were embedded in larger
item sets also encompassing other issues. Two weeks later, in wave two, participants received
a second questionnaire including the inoculation treatments. In line with our theoretical
outline, three different inoculation messages were administered. The “threat only” inoculation
condition (IC1) included only a warning about commenters paid by the Russian government
(so-called trolls), who attempt to sway citizens’ opinions regarding the respective issue. In
the “refutational-different” condition (IC2), subjects received exactly the same warning, but
were additionally informed about the general persuasive strategies employed by these trolls,
namely, that they would try to offer alternative explanations for events in order to take
Russia/Syria out of the line of fire. Subjects were also told that these alternative explanations
contradicted independent official investigations of the events. Similarly, in the “refutational-
same” condition (IC3), subjects were warned about the possible persuasive attempts and
informed about the strategy; however, this time by telling them the exact arguments that the
trolls would put forward.
In order to determine the persuasive effects of the astroturfing comments on subjects’
opinions and opinion certainty (H1 and H2), the inoculation factor also included two
additional control conditions, in which subjects did not receive an inoculation treatment. In
control condition 1 (CC1) participants were only exposed to the news teaser, without the
astroturfing comments; in CC2 they saw the teaser including the comments. Consequently,
differences between the two control groups indicate the astroturfing comments’ effects.
In order to assess the durability of the three inoculation treatments (RQ1), all subjects
received their respective treatment in wave two; however, the procedure differed with respect
to the point in time at which they were exposed to the subsequent news teaser with the
astroturfing comments. Half of the subjects received the teaser including the comments
immediately after the inoculation; the other half received it two weeks later (wave three).
Measures
Because all astroturfing comments were intended to raise doubt about Russian/Syrian
involvement in the events presented, we asked our participants specifically for their opinion
on the national government’s responsibility for the event, and how certain they were of this
opinion. Subjects’ opinions were measured using a five-point Likert scale indicating
agreement with the statement that Russia/Syria was responsible for the event described in the
news teaser (1 “Do not agree” to 5 “Fully agree”). The measure for opinion certainty was
adopted from Tormala and Petty (2002), asking how certain the subject was of the opinion he
or she had indicated above (1 “Not certain at all” to 5 “Extremely certain”). By subtracting
participants’ post-stimulus answers from their pre-stimulus answers, two scores were
calculated, reflecting changes in opinion and opinion certainty before and after stimulus
presentation (opinion change: Syria M = 0.24, SD = 1.04; Skripal M = 0.33, SD = 1.07; US
election M = 0.30, SD = 1.00; change in opinion certainty: Syria M = 0.27, SD = 1.27; Skripal
M = 0.31, SD = 1.31; US election M = 0.07, SD = 1.21). Positive values on the opinion-change
measure indicate that respondents held Russia/Syria less responsible for the events after they
were confronted with the stimulus. Positive values on the opinion-certainty change measure
indicate higher uncertainty as compared with the initial certainty assessment.
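For readers who wish to reproduce this scoring, the change measures are simple pre-minus-post difference scores. The following minimal sketch illustrates the computation in Python with pandas; the column names and values are hypothetical, as the original dataset is not included here:

```python
import pandas as pd

# Hypothetical wave-1 (pre) and post-stimulus ratings on the 1-5 scales
# described above; the real data would contain one row per participant.
df = pd.DataFrame({
    "opinion_pre":    [5, 4, 3],  # agreement that Russia/Syria is responsible
    "opinion_post":   [4, 4, 2],
    "certainty_pre":  [4, 3, 5],  # certainty of that opinion
    "certainty_post": [3, 3, 3],
})

# Pre minus post: positive opinion-change values mean Russia/Syria is held
# *less* responsible after stimulus exposure.
df["opinion_change"] = df["opinion_pre"] - df["opinion_post"]

# Positive certainty-change values indicate higher uncertainty than at the
# initial assessment.
df["certainty_change"] = df["certainty_pre"] - df["certainty_post"]

print(df[["opinion_change", "certainty_change"]])
```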
Results
Manipulation Checks
Manipulation checks were performed regarding the perception of the inoculation message,
yielding satisfactory results. Most subjects in the inoculation conditions correctly recalled
that they had received an inoculation message (88.5%). Similarly, those who had not been
inoculated correctly remembered that they had not seen such a message (87.9%), χ²(2, N =
2221) = 1281.99, p < .001. Likewise, most of the participants who were exposed to
astroturfing comments correctly remembered having seen comments beneath the news teaser
(77.1%), as did those in the non-comment condition, where 73.8% stated that they had not
seen any comments, χ²(2, N = 2233) = 540.66, p < .001.
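A manipulation check of this kind amounts to a chi-square test of independence between the actual condition and recalled exposure. The sketch below uses scipy with invented counts; the 2 x 3 layout (including a “don’t know” option) is an assumption that would yield the two degrees of freedom reported above:

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical 2 x 3 contingency table (all counts invented): rows are the
# actual conditions (inoculated vs. not), columns the recall answers
# ("saw an inoculation message", "did not", "don't know").
table = np.array([
    [980,  75,  52],   # inoculated
    [ 95, 890, 129],   # not inoculated
])

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2({dof}, N = {table.sum()}) = {chi2:.2f}, p = {p:.3f}")
```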
Effects of online astroturfing comments on opinions and opinion certainty
To test whether the online astroturfing comments affected participants’ opinions, we first
focus on the non-inoculated subjects in the two control conditions by comparing participants
who only saw the news teaser (CC1) to those exposed to the news teaser including the
astroturfing comments (CC2). Figure 2 depicts opinion changes in both groups, with positive
scores indicating changes in a pro-Russian/pro-Syrian direction, that is towards holding them
less responsible.2 Firstly, it is interesting to see that, over the course of the two weeks
between the pre- and post-stimulus measurements, subjects in all issue conditions became
more supportive of the Russian/Syrian position. However, while this effect was only marginal
in the news-teaser-only condition (CC1) (M = 0.12, SD = 0.96), it was clearly pronounced for
those who had been exposed both to the news teaser and to the online astroturfing comments
(M = 0.42, SD = 1.08). Put differently, those who found pro-Russian/pro-Syrian astroturfing
comments beneath the news teaser ascribed significantly less responsibility to Russia/Syria
for the event depicted, b = 0.30, p < .001.3 From a cross-issue perspective, H1 can thus be
confirmed. However, a closer inspection of the issue-specific patterns shows that the
astroturfing comments’ persuasive effect can mainly be traced back to the Skripal case, b =
0.54, p < .001, and to some extent to the Syria issue, b = 0.21, p = .09. At the issue level, H1
thus finds clear support only in the Skripal case.
[FIGURES 2 and 3]
We further assumed that online astroturfing comments would increase uncertainty in those
who initially thought that Russia/Syria was responsible for the negative events (H2).
Therefore, unlike in the previous analysis, we confine our examination to subjects who had
initially seen the two states as culprits (indicated by values of pre-stimulus opinions of 4 or 5;
N = 995). Figure 3 shows that, among these participants, astroturfing comments affected
opinion certainty in the expected direction across all issue conditions.4 Again, when
comparing the two control groups CC1 (M = 0.34, SD = 1.19) and CC2 (M = 0.64, SD =
1.11), participants who saw counter-attitudinal online astroturfing comments became
significantly more uncertain of their initial view that Russia/Syria was to blame for the
depicted events, b = 0.30, p = .009, as compared with those who did not see the comments.
From a cross-issue perspective, H2 can thus be confirmed. Again, an issue-specific
examination shows that the effect was only significant in the Skripal scenario, b = 0.58, p =
.005. Therefore, H2 can only be confirmed in this case.
Effects of inoculation treatments
In a next step, we examine whether the three inoculation strategies were able to prevent the
persuasive effects of the astroturfing comments. In order to do so, we compare the three
groups who received the astroturfing comments after being inoculated (IC1, IC2, and IC3) to
the group who had seen the same comments without prior inoculation (CC2). An effective
inoculation treatment should have prevented opinion change, ideally reducing it to the level
of those who had only seen the news teaser without any astroturfing comments (CC1). A
visual inspection of Figure 2 supports this notion, at least for the refutational-same
inoculation treatment (IC3), b = -0.20, p = .007: participants who were educated in advance
about Russia’s persuasive goals and exact arguments were less influenced by the astroturfing
comments (M = 0.22, SD = 1.05) as compared with non-inoculated subjects (M = 0.42, SD =
1.08). In contrast, the remaining two inoculation strategies (threat only: b = -0.04, p = .561;
refutational-different: b = -0.06, p = .391) did not prevent opinion change in the direction of
the astroturfing comments. A further issue-specific examination of the data shows that the
overall effect of the refutational-same preemption was largely rooted in the Skripal and Syria
cases. Multiple group comparisons indicate that the refutational-same strategy reduced
opinion change in both issue conditions to a sufficient level, with a significant difference
from non-inoculated participants receiving comments (CC2) (Syria: b = -0.24, p = .06;
Skripal: b = -0.36, p = .01) and a non-significant difference from those who had only seen the
news teaser (CC1) (Syria: b = -0.03, p = .84; Skripal: b = 0.18, p = .15). Hence, H3a finds support in these two
cases (see Table 2 in the Online Supplementary File for complete documentation of means and statistical
tests).
Following the previous logic, we finally examined the effectiveness of inoculation in
relation to opinion-certainty changes (H3b). Again, the visual patterns in Figure 3 seem to
support the effectiveness of the refutational-same treatment, which hampered the increase in
uncertainty (M = 0.43, SD = 1.19) as compared with non-inoculated subjects in CC1 (M =
0.64, SD = 1.11), although not to a highly significant extent, b = -0.21, p = .07. As Table 3
(Online Appendix) shows, none of the three inoculation strategies was able to prevent
changes in opinion certainty within the single-issue conditions significantly.
Duration of inoculation effects
In a final step, we examined how long the observed immunization effect persisted (RQ1). The
two lines in Figure 4 represent the different delay conditions implemented in our experiment
design (immediate and delayed astroturfing attack). It is important to recall that delay
represents a between-subjects factor, so for each delay condition, we collected data across all
inoculation groups.
[FIGURE 4]
As can be seen, the two lines mostly parallel each other, with only minor and non-significant
differences when comparing the short- and long-term conditions (see Table 4 in Online
Supplementary File for means and statistical tests). However, there is one noteworthy
exception, which manifests itself in a nearly significant interaction effect between inoculation
and delay, F(4, 2054) = 2.23, p = .06: the refutational-same treatment, which we have
identified as the most potent in reducing opinion changes, was only effective when
administered immediately prior to the astroturfing comments (short delay: M = 0.09, SD =
1.03), whereas its effect largely diminished after two weeks (long delay: M = 0.36, SD =
1.01), t(385) = -2.64, p = .01. When we look at the issue-specific short- and long-term effects,
we find exactly the same pattern, but, again, only in the Skripal case, indicating a significant
decrease over time in the immunizing effect of the refutational-same treatment (short delay:
M = -0.13, SD = 1.10; long delay: M = 0.60, SD = 1.12), t(135) = -3.18, p = .002. No
corresponding mean differences were observed in the Syria condition, t(118) = -0.23, p =
.816, or in the US election condition, t(137) = -0.91, p = .365.
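These delay contrasts are independent-samples t-tests on opinion-change scores within a single inoculation group. A minimal sketch with scipy, using scores simulated from the Skripal refutational-same means and SDs above (the cell sizes of 69 and 68 are assumptions chosen only to reproduce the 135 degrees of freedom):

```python
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(42)

# Simulated opinion-change scores for the refutational-same group (IC3) in
# the Skripal condition; real data would be the observed per-subject scores.
short_delay = rng.normal(-0.13, 1.10, size=69)  # attack right after inoculation
long_delay = rng.normal(0.60, 1.12, size=68)    # attack two weeks later

t, p = ttest_ind(short_delay, long_delay)
df = len(short_delay) + len(long_delay) - 2
print(f"t({df}) = {t:.2f}, p = {p:.3f}")
```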
With regard to changes in opinion certainty, we found no significant three-way
interaction between issue, inoculation strategy and delay, F(8, 965) = 0.55, p = .820. Short-
and long-term inoculation effects on opinion certainty did not differ significantly across the
three issue conditions.
Discussion
In this paper, we have examined the persuasive effects of astroturfing comments posted
beneath news items on Facebook in the context of three Russia-related issues: the poisoning
of former Russian intelligence officer Sergei Skripal, the manipulation of the 2016 US
presidential election, and the use of toxic gas by a close Russian ally, the Syrian government.
We define as astroturfing comments those that imitate ordinary citizens’ voices to create the
impression that a certain opinion has widespread public support, while the real agent behind
the commenter’s message is concealed. Drawing upon extant in-depth analysis of Russia’s
online propaganda (EU vs. Disinformation, 2019; Nimmo, 2015), we designed our
astroturfing stimuli comments so that these would support a pro-Russian position by offering
alternative explanations for the three events, and by denying Russia’s responsibility for them.
In a subsequent step, we tested the effectiveness of three different inoculation strategies in
preventing the persuasive effects of astroturfing comments: (1) threat-only, (2) refutational-
different, and (3) refutational-same treatments.
The disconcerting potential of Russia’s trolls: The impact of astroturfing comments
The results of our study show that astroturfing comments can indeed change audiences’
political opinions and increase uncertainty. However, these effects did not occur at equal
strength across all three issues. While we could clearly observe effects in the Skripal case and
to some extent in the Syria scenario, we did not find equally strong evidence in the context of
the manipulations of the 2016 US presidential election. Against this backdrop, a key task for
further research appears to be to specify the reasons for such issue-specific differences. For
instance, it is possible that participants’ opinions in the Skripal and the Syria conditions were
more uncertain in the first place and were therefore easier to influence through astroturfing
attacks. However, our data does not support this interpretation. A comparison of pre-stimulus
opinion-certainty scores shows that subjects’ issue-specific uncertainty levels did not differ
significantly (MSyria = 3.21, SDSyria = 1.09; MSkripal = 3.30, SDSkripal = 1.04; MUS election = 3.28,
SDUS election = 1.08), F(2, 2252) = 1.31, p = .270. Another possible reason might be that
respondents’ opinions about the US presidential election and Syria were more difficult to
influence through astroturfing comments because these are more abstract scenarios and
therefore more difficult to process and understand, especially when someone offers
alternative explanations for them. The Skripal case, by contrast, is a more narrowly defined
and concrete event, which makes possible explanations easier to understand and accept.
Unfortunately, our data did not enable us to test this assumption.
Immunizing citizens against astroturfing campaigns
This study advances research on inoculation effects because it is the first to transfer this
approach to the realm of online astroturfing comments, that is to one of the currently most
widely debated forms of disinformation in the context of democratic elections (Ferrara, 2017;
Kovic et al., 2018; Zelenkauskaite and Balduccini, 2017). As extant inoculation research
conducted in other contexts has shown, inoculation messages can help to confer on
individuals cognitive resistance to “a range of falsehoods in diverse domains such as climate
change, public health, and emerging technologies” (van der Linden et al., 2017: 1141).
Contrary to these expectations, only one strategy proved to be effective in mitigating the
persuasive effects of astroturfing comments: changes in opinions and opinion certainty could
be prevented only when subjects were educated in advance about the exact arguments
deployed by the Russian trolls (refutational-same). This result adds to a rather disconcerting
overall picture. In essence, it means that, in order to neutralize the effects of astroturfing
campaigns sponsored by foreign governments or other powerful actors, citizens will have to
learn the very specific lines of argument that these astroturfing actors use. Without any doubt,
designing and disseminating such highly tailored inoculation messages in a timely manner
will require enormous resources, as well as highly professionalized counter-campaigning.
However, it is important to recall that in our study participants were inoculated only once,
which probably limited the immunizing potential of the treatments. There is empirical
evidence supporting the notion that booster sessions used to refresh the initial inoculation
message could enhance its power and durability. However, the overall results on the
effectiveness of booster sessions are mixed and probably also depend on the timing of
the repeated exposure (Compton and Pfau, 2005).
A third disconcerting finding of our study was that even the immunizing effect of the
refutational-same treatment was only short-lived. It vanished almost completely after a two-
week delay. This finding is in line with other inoculation studies in the context of political
issues (Pfau and Burgoon, 1988).
The potentially negative effects of immunizing citizens against astroturfing comments
With regard to transferring inoculation research to the realm of astroturfing comments,
perhaps the most difficult problem to solve relates to the fact that such comments typically
cannot be distinguished from genuine citizens’ voices (the defining element of
astroturfing). This poses a dilemma because, while inoculation messages might mitigate the
harmful effects of astroturfing messages (positive consequence), they might also undermine
the credibility of citizen commenting in public online spaces, and of online deliberation in
general (negative consequence). This potential “side-effect” of inoculation campaigns
(Compton, 2012: 15) could only be prevented if astroturfing comments were unambiguously
identifiable and distinguishable from authentic citizen comments—which will almost never
be the case. Those who initiate counter campaigns will thus have to make difficult decisions
as to whether, and how, citizens can and should be inoculated against political astroturfing
campaigns. Rather abstract “threat-only” treatments, for instance, can be disseminated with
relatively limited costs and efforts. Yet these have the disadvantage that they undermine the
credibility of online citizen debate around entire political issues. Moreover, they are,
according to our findings, relatively ineffective. Highly specific refutational-same treatments,
by contrast, can be very effective in mitigating the persuasive effects of astroturfing
comments, as the findings of this study indicate. They also have the advantage that they
discredit only those user comments that actually convey very narrowly defined pieces of
misleading and inaccurate information. The downside of refutational-same inoculation
treatments is, however, that they require extensive resources to tailor and administer such
highly issue- and argument-specific counter messages. Finally, refutational-different
treatments can be seen as taking up a position between “threat-only” and “refutational-same”
treatments, combining their benefits and drawbacks.
Practical applications and promising paths for future research
Our results also have implications for how news organizations and professional journalists,
who have to moderate comment sections during online astroturfing campaigns, can integrate
the dissemination of inoculation messages into their work routines. Firstly, as this study
indicates, the effectiveness of threat-only and refutational-different preemptions is very
limited. Designing relatively abstract inoculation messages based on these strategies thus
appears not to be a suitable tool for preventing the impact of political astroturfing campaigns
on news audiences. Since immunizing effects appear only to derive from refutational-same
treatments, messages that present the exact arguments used in subsequent astroturfing attacks
will have to be first designed and then disseminated among a news audience. Secondly, the
short-term nature of the effects (even of refutational-same treatments) detected in our study
further implies that inoculation messages will have to be presented to the audience shortly
before they receive astroturfing comments. In reality, this means that journalists or
moderators of public social-media accounts have to inoculate their audiences “just in time”.
On the basis of the findings of this study, we have to conclude that banners or warnings
presented in the immediate vicinity of commenting fields, administering highly issue-specific
refutational-same treatments, appear to be the only promising strategy for immunizing news
audiences against astroturfing content posted by paid political trolls or spread by similar
means.
Of course, our study also has limitations. Although we increased external validity by
including three different issues in our design and by investigating the short- and long-term
effects of astroturfing and inoculation messages, we still relied on results gathered in an
experimental setting. Participants were purposely exposed to stimuli that they otherwise
might not have encountered, for example because they did not use social media or did not
read the comments beneath news articles. In this sense, the effects of comments that we
found probably overestimate the effect on society as a whole. On the other hand, participants
in our experiment were only exposed once to the astroturfing comments and to the
inoculation messages. In a real-world environment, people probably encounter comments
repeatedly, which may enhance the astroturfing comments’ persuasive power. The same is true of
inoculation messages: simply because a one-time inoculation proves to be inefficient or loses
its effect after a while, this does not mean that inoculation as a countermeasure to astroturfing
is an ineffective strategy. It seems plausible that multiple inoculation treatments would
sustain the immunization or might even increase it by aggregating the effects of the single
treatments. The question of how repeated exposure influences the persuasive effects of
astroturfing comments, and also those of inoculation messages, represents a further promising
avenue for future research. By following up on these and related paths of scrutiny, future
research should theorize and investigate in significantly more depth the psychological
mechanisms that facilitate the disconcerting persuasive potential of political trolling. On the
basis of this knowledge, researchers then need to identify and further specify the most promising
strategies for minimizing the harm that this type of political disinformation can cause to
democratic public life.
References
Banas JA and Miller G (2013) Inducing resistance to conspiracy theory propaganda: Testing
inoculation and metainoculation strategies. Human Communication Research 39(2): 184–
207. DOI: 10.1111/hcre.12000.
Banas JA and Rains SA (2010) A meta-analysis of research on inoculation theory.
Communication Monographs 77(3): 281–311. DOI: 10.1080/03637751003758193.
Bennett WL and Livingston S (2018) The disinformation order: Disruptive communication
and the decline of democratic institutions. European Journal of Communication 33(2):
122–139. DOI: 10.1177/0267323118760317.
Bessi A and Ferrara E (2016) Social bots distort the 2016 US presidential election online
discussion. First Monday 21(11).
Bugorkova O (2015) Ukraine conflict: Inside Russia's 'Kremlin troll army'. BBC News.
Available at: https://www.bbc.com/news/world-europe-31962644.
Cho CH, Martens ML, Kim H, et al. (2011) Astroturfing global warming: It isn’t always
greener on the other side of the fence. Journal of Business Ethics 104(4): 571–587. DOI:
10.1007/s10551-011-0950-6.
Compton J (2012) Inoculation theory. In: Dillard J and Shen L (eds) The SAGE Handbook of
Persuasion: Developments in Theory and Practice. Thousand Oaks, CA: Sage
Publications, pp. 1–20.
Compton JA and Pfau M (2004) Use of inoculation to foster resistance to credit card
marketing targeting college students. Journal of Applied Communication Research 32(4):
343–364. DOI: 10.1080/0090988042000276014.
Compton JA and Pfau M (2005) Inoculation theory of resistance to influence at maturity:
Recent progress in theory development and application and suggestions for future
research. Annals of the International Communication Association 29(1): 97–146. DOI:
10.1080/23808985.2005.11679045.
Cook J, Lewandowsky S and Ecker UKH (2017) Neutralizing misinformation through
inoculation: Exposing misleading argumentation techniques reduces their influence. PloS
one 12(5): 1-21. DOI: 10.1371/journal.pone.0175799.
Daschmann G (2000) Vox pop & vox polls: The impact of poll results and voter statements in
the media on the perception of a climate of opinion. International Journal of Public
Opinion Research 12(2): 160–181. DOI: 10.1093/ijpor/12.2.160.
Domke D, Shah DV and Wackman DB (1998) Media priming effects: Accessibility,
association, and activation. International Journal of Public Opinion Research 10(1): 51–
74. DOI: 10.1093/ijpor/10.1.51.
EU vs. Disinformation (2019) Conspiracy mania marks one-year anniversary of the Skripal
poisoning. Available at: https://euvsdisinfo.eu/conspiracy-mania-marks-one-year-
anniversary-of-the-skripal-poisoning/.
European Commission (2018) A multi-dimensional approach to disinformation: Report of the
independent high level group on fake news and online disinformation. Luxembourg:
Publications Office of the European Union.
Ferrara E (2017) Disinformation and social bot operations in the run up to the 2017 French
presidential election. First Monday 22(8): 1-30. DOI: 10.2139/ssrn.2995809.
Gross SR, Holtz R and Miller N (1995) Attitude certainty. In: Petty RE and Krosnick JA
(eds) Attitude strength: Antecedents and consequences. Mahwah, NJ: Lawrence Erlbaum
Associates, pp. 215–245.
Hayes AF (2005) Statistical methods for communication science. Mahwah, NJ: Lawrence
Erlbaum Associates.
Hovland CI, Janis I and Kelley HH (1953) Communication and persuasion: Psychological
studies of opinion change. New Haven, CT; London: Yale University Press.
Insko CA (1967) Theories of attitude change. New York: Appleton-Century-Crofts.
Kang J, Kim H, Chu H, et al. (2016) In distrust of merits: The negative effects of astroturfs
on people's prosocial behaviors. International Journal of Advertising 35(1): 135–148.
DOI: 10.1080/02650487.2015.1094858.
Keller FB, Schoch D, Stier S, et al. (2017) How to manipulate social media: Analyzing
political astroturfing using ground truth data from South Korea.
King G, Pan J and Roberts ME (2017) How the Chinese government fabricates social media
posts for strategic distraction, not engaged argument. American Political Science Review
111(03): 484–501. DOI: 10.1017/S0003055417000144.
Kovic M, Rauchfleisch A, Sele M, et al. (2018) Digital astroturfing in politics: Definition,
typology, and countermeasures. Studies in Communication Sciences 18(1): 69–85.
Lefevere J, Swert K de and Walgrave S (2012) Effects of popular exemplars in television
news. Communication Research 39(1): 103–119. DOI: 10.1177/0093650210387124.
Lyon TP and Maxwell JW (2004) Astroturf: Interest group lobbying and corporate
strategy. Journal of Economics & Management Strategy 13(4): 561–597. DOI:
10.1111/j.1430-9134.2004.00023.x.
McGuire WJ (1964) Inducing resistance to persuasion: Some contemporary approaches. In:
Berkowitz L (ed.) Advances in experimental social psychology. New York: Academic
Press, pp. 191–229.
McNutt J and Boland K (2007) Astroturf, technology and the future of community
mobilization: Implications for nonprofit theory. The Journal of Sociology & Social
Welfare 34(3): 165–178.
Nimmo B (2015) Anatomy of an info-war: How Russia’s propaganda machine works, and
how to counter it. Available at: https://www.stopfake.org/en/anatomy-of-an-info-war-
how-russia-s-propaganda-machine-works-and-how-to-counter-it/.
Petty RE and Cacioppo JT (1984) The effects of involvement on responses to argument
quantity and quality: Central and peripheral routes to persuasion. Journal of Personality
and Social Psychology 46(1): 69–81. DOI: 10.1037/0022-3514.46.1.69.
Pfau M and Burgoon M (1988) Inoculation in political campaign communication. Human
Communication Research 15(1): 91–111. DOI: 10.1111/j.1468-2958.1988.tb00172.x.
Pfau M, Compton J, Parker KA, et al. (2004) The traditional explanation for resistance versus
attitude accessibility. Human Communication Research 30(3): 329–360. DOI:
10.1111/j.1468-2958.2004.tb00735.x.
Pfau M, Haigh MM, Sims J, et al. (2007) The influence of corporate front-group stealth
campaigns. Communication Research 34(1): 73–99. DOI: 10.1177/0093650206296083.
Sikorski C von (2018) The effects of reader comments on the perception of personalized
scandals: Exploring the roles of comment valence and commenters’ social status.
International Journal of Communication 10: 4480–4501.
Smith SM, Fabrigar LR, MacDougall BL, et al. (2008) The role of amount, cognitive
elaboration, and structural consistency of attitude-relevant knowledge in the formation of
attitude certainty. European Journal of Social Psychology 38(2): 280–295. DOI:
10.1002/ejsp.447.
Taylor SE and Thompson SC (1982) Stalking the elusive "vividness" effect. Psychological
Review 89(2): 155–181. DOI: 10.1037//0033-295X.89.2.155.
Tormala ZL and Petty RE (2002) What doesn't kill me makes me stronger: The effects of
resisting persuasion on attitude certainty. Journal of Personality and Social Psychology
83(6): 1298–1313. DOI: 10.1037/0022-3514.83.6.1298.
Tormala ZL, Petty RE and Briñol P (2002) Ease of retrieval effects in persuasion: A self-
validation analysis. Personality and Social Psychology Bulletin 28(12): 1700–1712. DOI:
10.1177/014616702237651.
van der Linden S, Maibach E, Cook J, et al. (2017) Inoculating against misinformation.
Science 358(6367): 1141–1142. DOI: 10.1126/science.aar4533.
Visser PS and Mirabile RR (2004) Attitudes in the social context: The impact of social
network composition on individual-level attitude strength. Journal of Personality and
Social Psychology 87(6): 779–795. DOI: 10.1037/0022-3514.87.6.779.
Weedon J, Nuland W and Stamos A (2017) Information operations on Facebook.
Woolley SC and Guilbeault DR (2017) Computational propaganda in the United States of
America: Manufacturing consensus online. Working Paper No. 2017.5. Oxford:
University of Oxford.
Zelenkauskaite A and Balduccini M (2017) “Information warfare” and online news
commenting: Analyzing forces of social influence through location-based commenting
user typology. Social Media + Society 3(3): 1-13. DOI: 10.1177/2056305117718468.
Zerback T and Fawzi N (2017) Can online exemplars trigger a spiral of silence? Examining
the effects of exemplar opinions on perceptions of public opinion and speaking out. New
Media & Society 19(7): 1034–1051. DOI: 10.1177/1461444815625942.
Zerback T and Peter C (2018) Exemplar effects on public opinion perception and attitudes:
The moderating role of exemplar involvement. Human Communication Research 14(2):
125. DOI: 10.1093/hcr/hqx007.
Zhang J, Carpenter D and Ko M (2013) Online astroturfing: A theoretical perspective. In:
Proceedings of the Nineteenth Americas Conference on Information Systems, Chicago,
Illinois, 15–17 August 2013.
Zillmann D (1999) Exemplification theory: Judging the whole by some of its parts. Media
Psychology 1(1): 69–94. DOI: 10.1207/s1532785xmep0101_5.
1 A statistical power analysis showed that, in order to find small interaction effects between
all three experimental factors, a minimum sample size of N = 2,283 is necessary.
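As an illustration, a comparable a priori power calculation can be sketched in Python with statsmodels. The note does not report the exact input parameters, so the effect size (Cohen's f = .10), alpha (.05), and power (.80) below are assumptions, and the 3 x 5 x 2 design is approximated as a 30-cell one-way ANOVA; the result therefore need not match the N = 2,283 reported here:

```python
from statsmodels.stats.power import FTestAnovaPower

# Solve for the total sample size (nobs is left unspecified). All inputs are
# assumed values; dedicated tools such as G*Power can model the interaction
# degrees of freedom of a factorial design more precisely.
n_total = FTestAnovaPower().solve_power(
    effect_size=0.10,    # small effect, Cohen's f
    alpha=0.05,
    power=0.80,
    k_groups=3 * 5 * 2,  # issue x inoculation x delay cells
)
print(f"required total N ~ {n_total:.0f}")
```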
2 See also Table 2 in Online Supplementary File for complete documentation of means,
standard deviations, and statistical tests for group comparisons.
3 To test for group differences, we followed Hayes (2005), dummy-coded the inoculation
factor, and inserted k-1 dummy variables as independent variables in a linear regression
model. The respective comparison group served as the reference category. Besides testing for
significant mean differences, the unstandardized regression coefficient b also indicates the
direction and magnitude of the mean difference between the two groups.
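A minimal sketch of this dummy-coding approach in Python with statsmodels, using simulated stand-in data (all variable names and values are hypothetical):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)

# Simulated stand-in for the real data: a five-level inoculation factor and
# one opinion-change score per subject.
df = pd.DataFrame({
    "condition": rng.choice(["CC1", "CC2", "IC1", "IC2", "IC3"], size=500),
    "opinion_change": rng.normal(0.3, 1.0, size=500),
})

# Treatment coding creates k-1 dummies; the reference level (here CC2) is
# the comparison group, so each coefficient b is the mean difference from
# CC2, with its sign giving the direction, as described in the note.
model = smf.ols(
    "opinion_change ~ C(condition, Treatment(reference='CC2'))", data=df
).fit()
print(model.summary())
```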
4 See also Table 3 in the Online Supplementary File for complete documentation of means,
standard deviations, and statistical tests for group comparisons.
Figure 1
Experimental design
[Flowchart: Wave 1: pre-screening of opinions, public opinion perceptions, and
randomization. Two weeks later, wave 2: inoculation pre-treatment (no inoculation message;
threat only, IC1; refutational different, IC2; refutational same, IC3), followed either
immediately by the attack message (news teaser only, CC1; news teaser with comments,
CC2/IC1/IC2/IC3) to capture short-term inoculation effects, or, after a further two-week
delay in wave 3, by the delayed attack message (same five conditions) to capture long-term
inoculation effects. Afterwards: post-measurement of public opinion perceptions.]
The design was replicated for all three issue conditions.
Figure 2
Effects of astroturfing comments and inoculation treatments on opinion change
[Bar chart: mean opinion change (y-axis, 0 to 1) in the five conditions (news teaser only,
CC1; news teaser with comments, CC2; inoculation: threat only, IC1; inoculation:
refutational different, IC2; inoculation: refutational same, IC3), shown separately for the
Syria, Skripal, and US election issues.]
N = 2,064
Figure 3
Effects of astroturfing comments and inoculation treatments on opinion-certainty change
[Bar chart: mean opinion-certainty change (y-axis, 0 to 1) in the five conditions (CC1, CC2,
IC1, IC2, IC3), shown separately for the Syria, Skripal, and US election issues.]
N = 995 participants initially indicating that Russia/Syria was responsible for the event
depicted (values of pre-stimulus opinion 4 or 5).
Figure 4
Short- and long-term inoculation effects on opinion change
[Line chart: mean opinion change (y-axis, 0 to 1) across the five conditions (CC1, CC2, IC1,
IC2, IC3), with one line for the astroturfing attack immediately after inoculation and one for
the attack two weeks after inoculation.]
N = 2,064
Supplementary Materials to the Article
The disconcerting potential of Russia’s trolls: Persuasive effects of astroturfing
comments and three strategies for inoculation against them
Contents
Table 1 Perceived argument strength in online astroturfing comments (pretest results)
Table 2 Group mean differences in opinion change
Table 3 Group mean differences in opinion certainty change
Table 4 Differences in opinion change after a short and long delay
Figure I Example of the inoculation message (refutational different)
Figure II Example of the news teaser including astroturfing comments (Skripal issue)
Table 1 Perceived argument strength in online astroturfing comments (pretest results)

                                                                         Argument strength
                                                                         M (SD)        α
Syria (N = 19)
  Comment 1: “Like Assad's the only one with poison gas. What about
  the thousands of IS henchmen? If someone is known for massacring
  civilians, then it's probably them.”                                   3.71 (0.85)   0.91
  Comment 2: “Nothing's proved! Wouldn't be the first time somebody
  invented weapons of mass destruction to wage a fucking war.”           3.59 (1.21)   0.97
  Overall strength                                                       3.65 (0.94)   0.94
Skripal (N = 20)
  Comment 1: “So the guy was a proven double agent and had
  connections to the mafia. There were a lot of other people who
  wanted to kill him.”                                                   2.73 (1.18)   0.96
  Comment 2: “If the Russians wanted Skripal dead, they simply would
  have done it without leaving traces. But no, they used a poison that
  directly points to them. Right!”                                       3.01 (1.41)   0.96
  Overall strength                                                       2.87 (1.21)   0.96
US election (N = 17)
  Comment 1: “Russian wire-pullers? Yeah sure! Cold-blooded economic
  interests are behind the election manipulations: Facebook, Cambridge
  Analytica. Do I have to say any more?”                                 2.65 (1.02)   0.93
  Comment 2: “Nothing's proved! It wouldn't be the first time someone
  manipulated an election to gain power in the country.”                 3.32 (1.02)   0.95
  Overall strength                                                       2.99 (0.81)   0.91

* N = 58 participants took part in the pretest and indicated argument strength on a bipolar
scale from 1 to 5 using the following items: not convincing – convincing, weak – strong,
implausible – plausible, incorrect – correct. All items were used to construct a scale indicating
the perceived strength of each argument.
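The reliability coefficient α in the table is Cronbach's alpha across the four bipolar items. A minimal sketch of the computation in Python (the ratings below are invented):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a subjects x items matrix of ratings."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    sum_of_item_variances = items.var(axis=0, ddof=1).sum()
    variance_of_totals = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - sum_of_item_variances / variance_of_totals)

# Invented ratings (1-5) on the four items: not convincing-convincing,
# weak-strong, implausible-plausible, incorrect-correct.
ratings = np.array([
    [4, 4, 3, 4],
    [2, 3, 2, 2],
    [5, 4, 5, 5],
    [3, 3, 4, 3],
])
print(f"alpha = {cronbach_alpha(ratings):.2f}")
```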
Table 2 Group mean differences in opinion change

                        Teaser only     Teaser with      Inoculation:    Inoculation:      Inoculation:
                        (CC1)           astroturfing     Threat only     Refutat.          Refutat.
                                        comments (CC2)   (IC1)           different (IC2)   same (IC3)
                        M (SD)          M (SD)           M (SD)          M (SD)            M (SD)
Syria (n = 657)         0.11c (1.02)    0.33 (1.12)      0.40ae (1.01)   0.28 (0.96)       0.08c (1.04)
Skripal (n = 685)       0.06bcd (0.83)  0.60ae (1.15)    0.36a (1.21)    0.41a (0.94)      0.24b (1.14)
US election (n = 722)   0.18 (1.03)     0.33 (0.98)      0.36 (1.08)     0.36 (0.99)       0.31 (0.95)
All issues (N = 2,064)  0.12bcd (0.96)  0.42ae (1.08)    0.37ae (1.10)   0.35ae (0.96)     0.22bc (1.05)

Group comparisons are based on linear multiple regression analysis using the inoculation
factor as a dummy variable. Superscripts indicate significant mean differences (p < .05).
Table 3 Group mean differences in opinion certainty change

                        Teaser only     Teaser with      Inoculation:    Inoculation:      Inoculation:
                        (CC1)           astroturfing     Threat only     Refutat.          Refutat.
                                        comments (CC2)   (IC1)           different (IC2)   same (IC3)
                        M (SD)          M (SD)           M (SD)          M (SD)            M (SD)
Syria (n = 331)         0.44c (1.26)    0.63 (1.12)      0.89ae (1.18)   0.74 (0.98)       0.42c (1.32)
Skripal (n = 349)       0.26bcd (1.26)  0.84a (1.24)     0.90a (1.34)    0.69a (1.12)      0.51 (1.28)
US election (n = 315)   0.31 (1.05)     0.39 (0.84)      0.40 (1.07)     0.62 (1.22)       0.35 (0.96)
All issues (N = 995)    0.34bcd (1.19)  0.64a (1.11)     0.75ae (1.23)   0.68ae (1.11)     0.43cd (1.19)

* Participants stating that Russia/Syria was responsible for the event depicted (pre-stimulus
opinion 4 or 5). Positive values indicate higher opinion uncertainty. Group comparisons are
based on linear multiple regression analysis using the inoculation factor as a dummy
variable. Superscripts indicate significant mean differences (p < .05).
Table 4 Differences in opinion change after a short and long delay

                        Teaser only     Teaser with      Inoculation:    Inoculation:      Inoculation:
                        (CC1)           astroturfing     Threat only     Refutat.          Refutat.
                                        comments (CC2)   (IC1)           different (IC2)   same (IC3)
                        M (SD)          M (SD)           M (SD)          M (SD)            M (SD)
Short delay (n = 1,107) 0.12 (0.97)     0.35 (1.14)      0.34 (1.12)     0.41 (0.96)       0.09a (1.02)
Long delay (n = 957)    0.12 (0.95)     0.50 (0.99)      0.42 (1.08)     0.29 (0.96)       0.36a (1.06)

Group comparisons represent simple main effects of delay. Superscripts indicate significant
mean differences between the short- and long-delay conditions within a single inoculation
group (p < .05).
Figure I Example of the inoculation message (refutational different)
Figure II Example of the news teaser including astroturfing comments (Skripal issue)