SHAPING PUBLIC PERCEPTION
THE ROLE OF DISINFORMATION IN INTERNATIONAL SECURITY
J. (JACOB) MUIJS
MA THESIS INTERNATIONAL SECURITY
S4581407
21 JUNE 2021
Abstract
Constructions of reality take place in online environments where perceptions are
managed from abroad. Ordinary people find themselves daily on the frontlines of an
information war and are the targets through which the desired goals of international power
politics are pursued.
This thesis defines the relationship between disinformation and International Security
by applying concepts of hybrid warfare and Realism. Drawing on case studies from
Germany, the Baltic states, the United Kingdom, and France, this thesis analyses to what
extent disinformation affects International Security. Furthermore, it describes the
circumstances in which the campaigns can succeed in accomplishing desired (geopolitical)
outcomes. Russian disinformation aims to influence public opinion to get desirable policy
effects, and Russia certainly tries to affect International Security with disinformation.
Foreign influence in public debates, discourse, and elections is the new normal, and the
use of disinformation is apparent.
However, based on these specific case studies, a disinformation campaign in itself
has little effect on International Security and should always be assessed as a means
towards a larger (geo)political objective and a longer-term goal. While building on existing
grievances, the campaigns in these case studies fitted into a broader objective to weaken
Western dominance and must be assessed in that broader context to measure their
effectiveness in an International Security environment. Disinformation contributes to effects
on International Security and can serve long-term goals, but specific events should not be
attributed to it alone.
Foreword
In front of you lies a thesis conducted for the International Security track within the Master's degree in
International Relations. This thesis marks the end of a nine-year-long road through different levels of
the Dutch educational system. That road started in 2012 with four years of senior secondary vocational
education and training (MBO) in media development, continued with a four-year Bachelor of Arts in
Security Management at a University of Applied Sciences (HBO), and concludes with a Master of Arts
in International Security at the University of Groningen.
A special thank you goes out to my parents, who supported me and gave me every opportunity
to educate myself over the last nine years. Without their help, their support, and their flexibility, I would
never have achieved the results I have achieved at this moment. Their support for my career is inestimable,
and without them, this thesis, like this Master's and my previous degrees, would not have been possible.
I would also like to express my gratitude to dr. N. (Nienke) de Deugd as my supervisor during
the development of this thesis. Her enthusiasm for educating and guiding me throughout the process
was of unprecedented value and helped me achieve this thesis's results. She always responded to my
emails within a matter of hours, held regular consulting sessions, and gave helpful feedback whenever necessary.
It was a strange year; I never set foot in the University. Nevertheless, I would like to thank all
professors, teachers, and fellow students in the Master's programme for making the best of this
situation. Studying at the University of Groningen was a pleasant experience that I would not have
wanted to miss.
Finally, I want to thank everyone who contributed to the completion of this thesis, especially the
individuals with whom I had the opportunity to discuss the topic during the writing process. They contributed
heavily to the direction and the results of this thesis and provided a valuable layer of knowledge. The
results of their insights are included in the appendix.
Table of contents
1. Introduction ................................................................................................................................1
1.1 Methodology .......................................................................................................................3
2. Theoretical foundations ...............................................................................................................6
2.1 States, Power, War and International Security in the New Age ..................................................6
2.2 Disinformation as a Means of Warfare.......................................................................................8
2.3 Disinformation in Modern Day Society .....................................................................................9
2.4 Disinformation in the Struggle for Power................................................................................. 13
2.5 Operationalisation ................................................................................................................... 15
3 Disinformation in the Contemporary World: an Analysis ........................................................... 18
3.1 Disinformation in Germany: Lisa and the Bundestag Elections (2016-2017) ...................... 18
3.1.1 Germany and Disinformation ............................................................................................ 19
3.1.2 Analysis and Assessment .................................................................................................. 21
3.2 Disinformation in the Baltics: NATO’s Enhanced Forward Presence (2016) ...................... 23
3.2.1 The Baltic Case ................................................................................................................ 25
3.2.2 Analysis and Assessment .................................................................................................. 26
3.3 Disinformation in the United Kingdom: the Brexit Vote (2016) ......................................... 29
3.3.1 The United Kingdom as Target ......................................................................................... 31
3.3.2 Analysis and Assessment .................................................................................................. 31
3.4 Disinformation in France: the Macron Leaks Operation (2017) .......................................... 34
3.4.1 Disinformation in France – Why Russian Support for Le Pen? .......................................... 35
3.4.2 Analysis and Assessment .................................................................................................. 36
4 Conclusions – Disinformation and International Security ........................................................... 39
Epilogue: Confronting Disinformation as a Security Threat ............................................................... 42
References ........................................................................................................................................ 46
Appendices ....................................................................................................................................... 55
Appendix A: Interview Clingendael – Danny Pronk ...................................................................... 55
Appendix B: Interview NLDA - Kolonel dr. A.J.H. (Han) Bouwmeester ....................................... 57
Appendix C: Interview NCTV - Kerneenheid analyse ................................................................... 59
1. Introduction
The information age allows us to access information on a massive scale. We can access information anytime
and anywhere, and citizens generate new constructions of reality online (Till, 2020). However, every time
citizens access the internet, especially during election time, they essentially deploy themselves on the
frontline of an information war where they get bombarded with false and misleading information. A war
in which people are not collateral damage but the targets (McGeehan, 2018, p. 51) and where perception
is managed from abroad (Heuer, 1990; Thomas, 2004; Till, 2020). Within this information,
disinformation tends to influence people's perceptions about the world and events, impacts social
relationships within societies, and leads to polarisation. In addition, people have recently become more willing to
mobilise based on alternative narratives they encounter online. Originating deep down in the caverns of
the internet, those narratives have been seen to move much faster to the centre of the discussion
(Appendix C, interview NCTV). According to the European Commission, disinformation “erodes trust
in institutions and digital and traditional media and harms our democracies by hampering the ability of
citizens to make informed decisions” (European Commission, 2018). Moreover, it can polarise debates,
create or deepen tensions in society, undermine electoral systems, and have a broader impact on
European security (European Commission, 2018).
Inspired by Vladimir Lenin, the Russians have been using disinformation tactics since the
1920s. They are considered front runners in the disinformation game (Rid, 2020, pp. 312–326), and the
European Commission, which considers disinformation a severe problem for European security, sees
Russia as the biggest threat (Gerrits, 2018, p. 7). In 1902, in his visionary and influential pamphlet What Is to Be Done?, Lenin
wrote: "Hence, political exposures in themselves serve as a powerful instrument for disintegrating the
system we oppose, as a means for diverting from the enemy his casual or temporary allies, as a means
for spreading hostility and distrust among the permanent partners of the autocracy" (Lenin, 1902).
Today, electoral interference appears to be one of Russia's many tools to increase its influence on the
world stage (Vilmer, 2019, p. 1) and to disintegrate the system they oppose: Western liberal democracy
and its institutions. Vladimir Putin, the current president of Russia, called the fall of the Soviet
Union the greatest catastrophe of the 20th century (Vilmer, 2019, p. 1) and tries to restore Russia's former
grandeur (H. Bouwmeester, interview, 10 May 2021; Lynch, 2016; Vilmer, 2019, p. 1). To accomplish
that goal, Russia developed an offensive interpretation of soft power which can be used to "exert political
pressure on sovereign states, interfere with their internal affairs, destabilise their political situation, and
manipulate public opinion" (Ministry of Foreign Affairs of the Russian Federation, 2013).
We live in an age of disinformation. Data gets stolen and leaked to the press for negative effect;
political passions get inflamed online to drive wedges in existing faultlines; perpetrators sow doubt and
deny responsibility while covertly ramping it up behind the scenes (Rid, 2020, p. 6). The traditional
compartmentalised warfare approach is no match for adversaries using a hybrid strategy, which blurs
the line between war and peace (Vilmer, 2019, p. 45). Disinformation is not new. The modern era of
disinformation started in the 1920s, in the interwar years and during the Great Depression, with the first
wave of disinformation. The operations were innovative, conspiratorial and twisted. They were carried
out by the United States, the Soviet Union, and others. In the second wave, after the Second World War,
disinformation became more professionalised with aggressive, unscrupulous and subversive methods.
During the third wave in the 1970s, disinformation became well-resourced and fine-tuned, honed and
managed, lifted to a global operational level, and administered by a vast, well-oiled bureaucratic regime.
The Soviets had the upper hand, but the goals were the same: to exacerbate existing tensions and
contradictions in the adversary's body politic by leveraging facts, fakes, and ideally a disorienting mix
of both (Rid, 2020, pp. 6–7). The Soviet Union collapsed, and the remaining sense of ideological
superiority retreated. Disinformation campaigns were less present in the following years and slowly
resurged in the 2010s, indicated as the fourth wave of disinformation. Disinformation campaigns are
now reshaped by new technology and internet culture (Rid, 2020, p. 7) and used intensively by the
Russian Federation. Current disinformation tends to focus on growing legitimacy problems in the West
to undermine democracy (Bennett & Livingston, 2018), focuses on existing narratives and faultlines
(Gorodnichenko et al., 2018; Prier, 2017; Risso, 2018; Swami et al., 2018; Till, 2020; Zimmermann
& Kohring, 2020), and, with new technology, is brought directly to the people.
Unfortunately, existing theories in International Relations give limited, if not contradictory,
guidance in assessing international disinformation campaigns concerning International Security
(Lanoszka, 2019, p. 245). However, the use of disinformation has become an increasingly salient aspect of
global politics (Gerrits, 2018, p. 1), and the role of disinformation in International Security should be
assessed. Some scholars argue that disinformation is a soft security challenge and does not directly affect
the international balance of power (Gerrits, 2018; Lanoszka, 2019). Based on a comparative analysis,
this thesis explores to what extent disinformation affects International Security under a Realist
interpretation of the international system. Building upon cases from Germany, the Baltic States, the
United Kingdom, and France, this thesis argues that when disinformation is applied thoughtfully, using
and abusing existing frames, narratives and other essential characteristics, it can certainly impact
International Security, especially in the long-term. As Ladislav Bittman, a Soviet defector, argued back
in 1985, "Soviet disinformation operatives know that single covert action, however precisely designed,
cannot tip the balance of power between the Western Alliance and the Communist bloc. But they believe
that mass production of propaganda and disinformation over a period of several decades will have a
significant effect. The strategy seems to work" (Bittman, 1985, p. 2). While analysing specific societal
characteristics needed to make disinformation successful, this thesis allows us to understand the
workings of disinformation within society, based on a comparison of different disinformation
campaigns, and how Russia tries to influence Western domestic societies to generate international strategic
advantages. Based on these findings, recommendations are given to confront this security threat and
make society more resilient in the epilogue.
Firstly, after this introduction, the methodological approach to the concept is specified. To
define to what extent disinformation affects International Security, this thesis is further structured as
follows. In the second chapter, this thesis gives an overview of the contemporary debate surrounding
disinformation and provides the reader with a broad insight into the concept. In addition, the first section
provides the theoretical foundations on which the case studies are analysed, explains the theoretical origins
and goals of disinformation, and provides an analytical framework to determine the role of
disinformation within International Security. In the third chapter, the theoretical foundations are
analysed in a fourfold case study to assess the role of disinformation. While analysing disinformation
campaigns, presumably originating from Russia, in Germany, the Baltics, the United Kingdom, and
France, this thesis identifies the characteristics needed for a disinformation campaign to be successful,
the goals of these campaigns, and how successful those campaigns were in affecting International
Security. In chapter four, this thesis concludes, based on the findings of chapters two and three, to what
extent disinformation affects International Security. Attached is a concluding epilogue that identifies
how to confront disinformation as a security threat. Overall, this thesis indicates what disinformation is,
what its origins are, what goals its use serves, what strategies
can be used to confront it, and to what extent it affects International Security.
1.1 Methodology
This thesis is written to give a better understanding of disinformation as an International Security
problem. Therefore, it focuses on explaining the role of disinformation in the International Security
environment and describing the relationship between disinformation and International Security. During
the research, qualitative data on the topic was collected and used to develop the thesis results. Mostly
secondary sources on the topic were used. The use of existing secondary academic sources
is a suitable approach, as multiple researchers have already researched the topic, and this thesis compares
different disinformation cases to make an argument. Building on existing research gave solid
theoretical and empirical foundations to create the operationalisation for the case studies and gave
reliable and valid results, as the sources used are well researched and methodologically
comprehensive. These include, primarily, academic journal articles on the topic. While comparing
analysed and assessed data on disinformation, the argument of this thesis could be made. Gaps in the
research were often filled with primary data, consisting of, for example, news articles, press statements,
and policy documents to explain a specific case further and give more context surrounding the
disinformation campaigns. The majority of this thesis is the result of desk research, supplemented with
some interviews.
The sources used approach the theory of the topic from a Western standpoint.
Interpretations were made based on other researchers' interpretations and created subjectivity in the
research (See Appendix B, interview, A.J.H. Bouwmeester). The thesis is based on Western knowledge
of the topic and corresponding norms and values. As this thesis focuses on Russian disinformation,
Russia, for example, has different interpretations of world events and a different perception of
disinformation than the sources used throughout the research. In this thesis, Russia is viewed as
a security threat to Western values and standards. The historical and still existing Cold War enmity between
Russia and NATO/EU, which dates back even further in history, indicates how they perceive each other,
why specific threats are identified and why most research focuses on Russia. Russia sees NATO/EU as
a threat, while NATO/EU sees Russia as a threat. Therefore, the thesis consists of primarily subjective
truths based on perceptions, feelings and opinions. Moreover, these subjective truths are dynamic and
affected by time, events, places, cultures, and history. The understanding of the results is, therefore,
susceptible to change. The perceived enmity between Russia and the West allowed the researcher to
identify empirical case studies. This makes the approach subject to interpretation but gives insight
into how the West perceives the issue and into the issue of disinformation in countries that are part
of the so-called West. Therefore, this approach is suitable for drawing conclusions and contributes
to the debate from a Western perspective. By taking this approach, a standpoint was taken regarding the
origins and goals of disinformation, which was needed to answer the research questions. The term
disinformation is limited to forms of information that are misleading and actively placed within societies.
As mentioned, this thesis primarily focuses on Russian disinformation in Western countries.
More specifically, the focus is on disinformation in EU and/or NATO members by Russia. The European
Union has indicated that disinformation is a significant problem and sees Russia as the greatest threat
(Gerrits, 2018, p. 7); Russia can be considered the front runner in the use of disinformation. Analysing, from
a Western point of view, how disinformation works within societies in Europe, what Russia wants to
achieve with using disinformation and how to confront this threat is, therefore, relevant, as contemporary
International Relations theory gives limited guidance (Lanoszka, 2019, p. 245) and Russia seems the
most relevant actor and user of disinformation campaigns.
To test the operationalisation of the theoretical framework, a comparative study of four cases
has been conducted. First, Germany is the main target for disinformation from Russia and is an important
ally within NATO and the EU. Therefore, Germany was of strategic interest, to both the West and
Russia, to analyse in this thesis. Especially considering the mobilisation of groups affected by
disinformation, this case was essential to analyse. This case indicates how disinformation can mobilise
groups, how targeted audiences can perceive disinformation, and which societal characteristics a
campaign must comply with to be successful. In addition, the campaign was in the run-up to the most
recent elections in Germany, which is an excellent opportunity to weaken state structures. Second, the
Baltics are bordered by Russia, are strategically used by NATO (e.g. military exercises, STRATCOM
HQ), the populations of some states include Russian minorities, and the Baltics were part of the former
USSR and the Warsaw Pact for a substantial period. The Baltics, therefore, are strategically important for
both Russia and NATO and valuable to analyse. This case is especially essential because the disinformation
campaign did not get a foothold in domestic Baltic society and, therefore, did not affect International
Security. Nevertheless, as indicated in the analysis, the Baltics are home to Russian minority groups and
non-citizens, who are the main targets of Russian disinformation aimed at destabilising internal politics.
Third, Brexit was of significant geopolitical interest and marked the first instance of disintegration of the European Union.
The disintegration of the European Union is one of the main strategic goals of Russia and, therefore,
this specific event is interesting to analyse considering disinformation. Moreover, Brexit had a strategic
effect on International Security in favour of Russia, which is in line with Russian foreign policy.
Therefore, it is essential to indicate what role disinformation played in the Brexit vote and how
disinformation, in this case, possibly affected International Security. Lastly, France is an important ally
of NATO and the EU and, thereby, strategically important. Because France was the next target after
Brexit, it could have been the next step to further disintegrating the European Union during the most
recent elections. This case is vital to analyse because the disinformation failed, was not perceived
by its target audience and, therefore, did not affect International Security, which makes it
essential for the overall argument. By indicating what went wrong in France and the
Baltics and what went right in Germany and the United Kingdom, an argument can be made about the
role of disinformation and how it affects International Security. Together, the complete comparative
study led to an explanation and trend identification of disinformation by Russia against its perceived
enemies. Building on multiple cases, a valid argument can be made concerning disinformation and
International Security. Using a single case would have delivered a limited view on the issue and would
not be comprehensive enough to make a solid argument. Multiple countries also respond differently to
disinformation and give a better and broader understanding of the variables needed for success.
Moreover, a solid argument can be made on the relationship between disinformation and International
Security using multiple case studies, as the relationship differs in every country and is not limited to a
single conclusion.
Lastly, some interviews were conducted. The interviewees were carefully selected based
on expertise and knowledge and identified via the network of the writer. The interviews provided an
extra layer of data for this thesis, as they gave some practical knowledge on the topic and were of value
in identifying other primary and secondary sources. All interviewees work on matters related to the
topic investigated and are knowledgeable about it. The interviews also supplemented the sources and
literature used throughout the thesis. All interviews were conducted
on a semi-structured basis and provided, based on the goals of this thesis, practical knowledge and
directions for the research. The interviews are added in the appendix and are used as references
throughout the thesis.
2. Theoretical foundations
2.1 States, Power, War and International Security in the New Age
The concept of security has been heavily debated and, in recent decades, it has become more inclusive and
pivoted away from the military realm (Ullman, 1983, p. 133). According to Ullman (1983, p. 133), “a
threat to security is an action or sequence of events that (1) threatens drastically and over a relatively
brief span of time to degrade the quality of life for the inhabitants of a state, or (2) threatens significantly
to narrow the range of policy responses available to the government of a state or to private,
nongovernmental entities (persons, groups, corporations) within the state”. For Ayoob (1995, p. 9), the
nexus between security and the affected population lies in a concept he names 'security-insecurity',
which he defines as: “in relation to vulnerabilities, both internal and external, that threaten or have the
potential to bring down or weaken state structures, both territorial and institutional, and governing
regimes”. In recent decades, security is identified as much more than the absence of military war and
the threat of military invasion. Focusing security solely on military terms confounds a false image of
reality, which is profoundly dangerous. It has led states to focus on military threats, resulting in the
ignorance of perhaps more hazardous and harmful dangers, and reduces total security (Ullman, 1983, p.
129). However, these days the threat from other states might be less visible. The Copenhagen School
offers us the sectoral approach, which allows us to identify multiple referent objects to securitise.
Though not conventional, it approaches the human as the ultimate referent object. Furthermore, it
provides several themes along which threats and subsequent modes of combating these threats can be
identified. The Copenhagen School identifies five security sectors, each with its own referent objects and
threats: military, societal, environmental, economic and political security (Buzan
et al., 1998).
Along with the increasingly inclusive concept of security, the concept of power can also be
directed away from solely military power (Ullman, 1983). States are increasingly involved in other
forms of power. As Sun Tzu (2008, p. 37) indicated, “[…] to fight and conquer in all your battles is not
supreme excellence; supreme excellence consists in breaking the enemy's resistance without fighting”.
As the concept of security became more inclusive, so did the art of performing war and the way states
pursue power. Warfare moved away from conventional military warfare, which speaks to the imagination, to more
recent hybrid forms of warfare. Hybrid warfare has become the common term to capture the complexity of
twenty-first-century warfare (Filipec, 2020; Wither, 2016).
Forms of conventional and unconventional warfare have been used throughout history, and
recent developments have increased the range of some hybrid tools (Appendix A, Interview, D. Pronk). The
term hybrid appeared to be the best way to describe the range and variety of tools and methods employed
in modern-day warfare (Wither, 2016, p. 75), forming a threat to International Security. It is identified
as "the use of military and non-military tools in an integrated campaign, designed to achieve surprise,
seize the initiative and gain psychological as well as physical advantages utilising diplomatic means;
sophisticated and rapid information, electronic and cyber operations; covert and occasionally overt
military and intelligence action; and economic pressure" ('Complex Crises Call for Adaptable and
Durable Capabilities', 2015, p. 5). This definition allows for the inclusion of non-military methods into
war studies. The development of new weapon concepts in the hybrid spectrum helps pursue one
state's interest by bending another state to its will (Wither, 2016, p. 78). Thus, hybrid warfare can be
seen within the Realist concept of international politics¹ ², where sovereign states are in a constant power
struggle, compete with other states, and pursue their interests while using hybrid tools: a Clausewitzian
interpretation of war by other means.
A Realist approach would argue that states are central actors in International Relations, the
international system is anarchic, states act in their rational interest and states desire power to survive in
the international system (Collins, 2018, pp. 13–29). This unstable and anarchic international system
generates uncertainty, and, therefore, states try to maximise their security as they are cautious by the
intentions and motives of other states (Collins, 2018, p. 17). As indicated in the article by Filipec (2020),
Realism is based on the logic that states are concerned with their survival, politics are based on interest
in terms of power, and a lack of a world government and states operate in a self-help system. Moreover,
the international system is always competitive and in potential conflict as states constantly seek power
to become powerful enough to survive (Collins, 2018, pp. 13–29). In addition, as the international
system is unstable and constantly changing, in the modern world, states might turn to unconventional
and less visible instruments to gain power and pursue their interests instead of military warfare (Filipec,
2020).
Filipec (2020) analyses how states seek power and how hybrid forms of warfare play a role in
pursuing state interests (e.g. the power struggle). The element of power is essential for the analysis in this
thesis, as power is a means for states to ensure survival (Collins, 2018, p. 19). Power can be defined as
"[…] anything that establishes and maintains the control of man over man" and that "[…] its content
and manner of its use is determined by the political and cultural environment" (Morgenthau, 1973, p. 8).
By having control or exercise influence over other states, states have power in the international system.
¹ ‘Realism’ is a dominant school of thought in international relations theory unified by the belief that world
politics is always and necessarily a field of conflict among actors (states) pursuing power (Collins, 2018, Chapter
2). Other dominant theories are ‘Liberalism’ and ‘Constructivism’. ‘Liberals’ believe that international
institutions play a key role in cooperation among states. With the correct international institutions, and increasing
interdependence (including economic and cultural exchanges) states have the opportunity to reduce conflict
(Collins, 2018, Chapter 3). ‘Constructivists’ argue that security is a social construction. They emphasize the
importance of social, cultural and historical factors, which leads to different actors construing similar events
differently (Collins, 2018, Chapter 6).
² This thesis analyses disinformation in relation to International Security based on the Realist perceptions of the
international system. Disinformation, in this thesis, is seen as a means to achieve desired geopolitical outcomes
and affect the balance of power. By influencing other states with disinformation, the disseminator controls, or
tries to get control over, other states in the struggle for power. Therefore, the Realist approach is used for the
argument of this thesis and the relation between Realism and disinformation is specified.
When states act according to the interests of other states, we understand this as power. Such is the struggle
that continuously goes on, according to the Realist perspective. The aim in hybrid warfare stays the
same as in conventional war: to exploit the threat to gain physical or psychological advantages over an
opponent (adversary state) (Wither, 2016, p. 86) and gain more power and control in the international
system. Thus, hybrid forms of warfare may lead to the same outcomes as conventional warfare: power
reduction of the adversary and effective control (Filipec, 2020, p. 57).
2.2 Disinformation as a Means of Warfare
As described in the articles by Filipec (2020) and Wither (2016), a hybrid 'tool' that has been powerful
and prominent in recent years is disinformation, also referred to as information warfare. To better
understand disinformation, it is essential to indicate the difference between misinformation and
disinformation (See Appendix C, Interview, NCTV). Where misinformation refers to “[…] the action
of misinforming someone; the condition of being misinformed” (la Cour, 2020, p. 708), disinformation
is referred to as “the dissemination of deliberately false or misleading information especially when
supplied by a government or its agent to a foreign power or the media, with the intention of influencing
the policies or opinions of those who receive it; false information so supplied” (la Cour, 2020, p. 708).
All research acknowledges the difference between misinformation and disinformation, where
misinformation is false information without intent or false by accident, disinformation has a
deliberate goal and a desired effect (Bennett & Livingston, 2018; Gerrits, 2018; la Cour, 2020;
Lanoszka, 2019; Sarts, 2021; Tenove, 2020). Disinformation is an increasingly salient feature of
International Relations. When a foreign power plants disinformation with malign intent, false
information becomes disinformation and a tool used in the struggle for power (la Cour, 2020, p. 705)
and the advancement of political goals. The primary goal of disinformation is to confuse and mislead in
order to spread disagreement and division among parts of the population in foreign countries (Gerrits,
2018, p. 5).
In a conflict between states, the goal has always been about changing the other's behaviour and
letting states act according to other states' will (Sarts, 2021, p. 24). Governments and policy-makers
have echoed concerns about disinformation because it undermines national security and, in the international
context, sovereignty (Tenove, 2020, p. 522). The spread of disinformation is often referred to as
information warfare, which the Russian government defines as “the confrontation between two or more
states in the information space with the purpose of inflicting damage to information systems, processes
and resources, critical and other structures, undermining the political, economic and social systems, a
massive psychological manipulation of the population to destabilise the state and society, as well as
coercion of the state to take decisions for the benefit of the opposing force” (Ministry of Defence,
Russian Federation, n.d.). Where the aim is, according to Wither (2016, p. 81), “to apply psychological
pressure to cause the collapse of the target state from within so that the political objectives of the conflict
can be achieved without fighting – the acme of strategic skill according to Sun Tzu”. In international
politics, the aim is, therefore, to influence public opinion in targeted countries with disinformation to
get desirable policy effects, changes in behaviour and affect the balance of power (Gerrits, 2018; Heuer,
1990, p. 5; Lanoszka, 2019; Thomas, 2004, p. 241), in order to get stronger in the international system,
gain power and ensure survival without fighting. The balance of power, in Realist terms, means that
states may secure their survival by preventing any one state from gaining enough power to dominate all
others. When threatened, states may seek safety by balancing (allying with others against the prevailing
threat), by bandwagoning (aligning themselves with the threatening power), or by gaining more power
through greater influence (Walt, 1987). In this sense, the use of disinformation is geostrategic (Bennett &
Livingston, 2018, p. 132). Information warfare is used to distort domestic or foreign political sentiment
to achieve a strategic or geopolitical outcome (Thomas, 2004; Till, 2020, p. 12; Weedon et al., 2017, p.
5).
For Russia, one of the main methods of foreign influence is through the theory of reflexive
control (RC) or perception management. The idea of perception management tries to control, rather than
manage, subjects (Thomas, 2004, p. 237). It is defined as a means of “conveying to a partner or an
opponent specially prepared information to incline him to voluntarily make the predetermined decision
desired by the initiator of the action” (Thomas, 2004, p. 237). RC is considered an information warfare
means (Thomas, 2004, p. 240) and has become an important weapon in achieving geopolitical objectives
and as a means to become powerful in the international system. When RC is successfully used, one side
can impose its will on the enemy and cause him to make a favourable decision (Thomas, 2004, p. 242).
One can, for example, influence the opponents' perception of a situation and the opponents' goals and,
by doing so, influence the opponents' decision-making (Heuer, 1990, p. 6). Successful RC requires an
in-depth study of the enemy’s inner nature, ideas, and concepts (Thomas, 2004, p. 243). Nowadays, RC
is seen as a method to manipulate individuals' perceptions of reality (Till, 2020, p. 6).
2.3 Disinformation in Modern Day Society
The reach of social media and the difficulties of policing it enable foreign political agents to reach
national and international audiences with strategic disinformation (Bennett & Livingston, 2018, p. 134),
along with increasing technological developments (Thomas, 2004, p. 246). This creates new
opportunities for (foreign) political actors to influence public opinion (Till, 2020, p. 5) and create
alternative perceptions of reality (Till, 2020, p. 6). Inauthentic profiles are profiles created by foreign
officials posing as 'real people', producing various forms of 'organic content' and engaging with foreign
citizens with the purpose of influencing those targeted citizens (la Cour, 2020). Prier (2017) analyses
how Twitter trends are used for disinformation campaigns and how these Twitter trends are picked up
by mainstream media over time and become mainstream news. Disinformation is used within an existing
narrative to penetrate a network of 'true believers'. Commanding the trend means taking an existing
topic and weaponising it (Prier, 2017). Bots, for example, retweet and share
disinformation on social networks to reach a broader audience and create a trend. When disinformation
is frequently shared between groups of people, it becomes a trend, often picked up by journalists searching
social media to report on trends for breaking news. The fake news becomes, in this case, 'real' news
(Prier, 2017). Recent innovations in social media have also enabled disseminators of disinformation to
successfully target users, enabling them to act as receptors and encouraging sharing behaviours that shape
them into influential spreaders (Till, 2020, p. 12).
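The amplification dynamic described above can be made concrete with a toy simulation. The following sketch (in Python) is purely illustrative and is not drawn from any cited source: the trending threshold, the organic growth rate, and the function simulate_amplification are hypothetical assumptions, chosen only to show how a modest core of coordinated bot accounts can push a niche narrative past a platform's trending threshold, after which organic sharing and journalistic pickup take over.

    # Toy model (illustrative assumptions only): a niche narrative starts with a
    # handful of organic shares. Coordinated bot accounts reshare it every cycle,
    # inflating its apparent popularity until it crosses a hypothetical platform
    # trending threshold, after which organic users and journalists amplify it.

    TREND_THRESHOLD = 5_000  # hypothetical number of shares needed to "trend"

    def simulate_amplification(organic_seed: int, bots: int, cycles: int,
                               organic_rate: float = 0.05) -> int:
        """Return the cycle at which the narrative starts trending, or -1."""
        shares = organic_seed
        for cycle in range(1, cycles + 1):
            shares += bots                        # each bot reshares once per cycle
            shares += int(shares * organic_rate)  # modest organic growth on top
            if shares >= TREND_THRESHOLD:
                return cycle
        return -1

    # Without bots, the narrative never trends within 50 cycles (returns -1);
    # a few hundred coordinated accounts push it over in about a dozen cycles.
    print(simulate_amplification(organic_seed=100, bots=0, cycles=50))    # -1
    print(simulate_amplification(organic_seed=100, bots=300, cycles=50))  # 12

The point of the sketch is the asymmetry: the same seed content trends or dies depending on the presence of coordinated amplification, mirroring how fake news becomes 'real' news once journalists report on the trend.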
Disinformation is used to confuse and create doubts and noise about ‘truths’ within societies
(Lanoszka, 2019, p. 239) and destabilise truth claims and/or construct new micro realities through
targeted messaging (Till, 2020, p. 12). With all the above-described factors combined, it may be that we
have entered a 'post-truth' order. Post-truth is a philosophical and political concept for the disappearance
of shared objective standards for truth and the "circuitous slippage between facts or alternative facts,
knowledge, opinion, belief, and truth" (Biesecker, 2018, p. 329). The integration of social media
interactions has accelerated the flow between ‘objective reality’ and ‘subjective reality’ (Till, 2020, p.
4). Individuals can present their interpretations of events (subjective reality) on social media, which
others see as ‘objective reality’ (actual existing social facts). Most reality construction occurs in people's
everyday actions and (online) interactions (Till, 2020, p. 11). Digital media plays an increasingly
important role in steering beliefs and behaviours (Till, 2020, p. 4). La Cour (2020) describes that the
post-truth society has created a pathway for disinformation due to the declining trust in basic facts and
institutions (See Appendix B, Interview, A.J.H. Bouwmeester). Due to this lack of trust, emotions have
increasingly come to dominate politics, crowding out rational, evidence-based decision-making
(la Cour, 2020, p. 709).
For disinformation to be successful, it needs to build on an existing narrative and a network of true
believers who already buy into the underlying theme and accept the disinformation as
fact. Disinformation needs to resonate with the target; it must fit a belief structure and confirm people's
biases (Prier, 2017). This is done by manipulating the 'filter' or 'frame', which refers to the collection of
concepts, knowledge, ideas, and experience used by targets to make decisions (Till, 2020, p. 7). These
filters and frames are then exploited to create discord in society or widen existing fractures (Sarts, 2021,
p. 29). Frames enable citizens to locate, perceive, identify and label occurrences within their living space
and the world. Collective frames, thereby, evolve by negotiating shared meaning, and collective action
frames are an action-oriented set of beliefs and meanings that inspire and legitimise social movement
organisations' activities and campaigns (Benford & Snow, 2000). Collective action frames are
constructed in part as movement adherents negotiate a shared understanding of some problematic
condition or situation they identify as in need of change. According to Benford and Snow (2000, pp. 615-
618), critical tasks in collective framing are diagnostic framing (problem identifications and
attributions), prognostic framing (the articulation of a proposed solution to a problem, plan of attack,
and strategies), and motivational framing (call-to-arms, vocabularies of motives, and vocabularies of
severity, urgency, efficacy, and propriety). For example, some frames can be seen as injustice frames,
where the movement identifies itself as the victim of a given injustice and amplifies its victimisation.
Moreover, collective action frames vary in their problem identification and direction, flexibility and
rigidity, interpretive scope and influence, and resonance. Resonance is relevant to the
effectiveness or mobilising potency of framings. The credibility of framings is a function
of three factors: frame consistency (as between what the social movement says and what it does),
empirical credibility (the fit between framings and events in the world, the more believable a claim is,
the broader its appeal will be), and credibility of frame articulators or claims makers (the more credible
the speakers, the more persuasive they are). The more a movement resonates with citizens' personal and everyday
experience, the more likely it is to grow (Benford & Snow, 2000, pp. 619-622). Besides,
another factor with a significant impact on frame resonance is narrative fidelity: the more culturally
resonant the claims, and the more they resonate with the target's cultural narrations, the greater the
prospect for mobilisation. For example, when frames tap into existing cultural beliefs, values, narratives,
and folk wisdom, the frame amplifies best (Benford & Snow, 2000, p. 622). With aspects of identity,
those emotional engagements are considered the most effective way of targeting people most susceptible
to mobilisation (Till, 2020, p. 11).
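The interaction of these factors can be summarised in a toy scoring heuristic. The sketch below is a hypothetical illustration, not an established measurement instrument: the equal weights and the 0-to-1 scales are assumptions, used only to make explicit that resonance, and thus mobilising potency, rises with each of Benford and Snow's credibility factors and with narrative fidelity.

    from dataclasses import dataclass

    @dataclass
    class Frame:
        """A collective action frame, scored 0..1 on each resonance factor."""
        consistency: float              # fit between what a movement says and does
        empirical_credibility: float    # fit between framings and events in the world
        articulator_credibility: float  # how credible the claims makers are
        narrative_fidelity: float       # resonance with the target's cultural narrations

    def resonance(frame: Frame, weights=(0.25, 0.25, 0.25, 0.25)) -> float:
        """Weighted sum of the four factors; higher means greater mobilising potency."""
        factors = (frame.consistency, frame.empirical_credibility,
                   frame.articulator_credibility, frame.narrative_fidelity)
        return sum(w * f for w, f in zip(weights, factors))

    # A frame that taps existing cultural narrations resonates more than an
    # otherwise identical frame that does not (0.75 versus 0.55 here).
    culturally_resonant = Frame(0.8, 0.6, 0.7, 0.9)
    out_of_touch = Frame(0.8, 0.6, 0.7, 0.1)
    print(resonance(culturally_resonant) > resonance(out_of_touch))  # True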
As Heuer (1990) analysed, influence operations are established by an initiator using channels to
reach specific targets to accomplish goals (Heuer, 1990, p. 4). For example, a foreign state (initiator)
aims disinformation at the domestic general public (target) through social media (channel) to influence
specific decisions (goal). Each influence operation has a specific goal, uses various methods, makes use
of different opportunities, and has different levels of effectiveness (Heuer, 1990, p. 4). Influencing a
target individual goes through four principal steps: perception or receipt of the message; understanding
and evaluation of the message; a change in attitudes; opinions or judgements resulting from the message;
and behaviour change (Heuer, 1990, pp. 8–18). First, the message has to be perceived by the target. How
the target perceives is strongly influenced by its experience, education, cultural values, and tole
requirements (Heuer, 1990, pp. 9-10). Second, to be effective, the message must be understood and
evaluated. New information will tend to be evaluated in a manner that supports existing beliefs.
Influence operations need to exploit existing preconceptions rather than change existing beliefs (Heuer,
1990, pp. 11-12). Third, the influence operations have to lead to forms of attitude change. Influence
operations are most effective when carefully considering the merits of information and aimed at existing
emotions and beliefs (Heuer, 1990, p. 15). Finally, influence campaigns must lead to behaviour change
to be effective. The goal is to change actions and not just opinions. Joining a demonstration, signing a
petition, or changing a vote choice are examples of behaviour change. This suggests that influence operations will be
most effective when targets have the opportunity and incentive to act (Heuer, 1990, p. 15-18). However,
this behaviour must be linked to influence operations, and there is a need to understand why people did
what they did. It is only successful when aimed at those willing to listen and able to act on it. The
relationship between influence operations and effectiveness is hard to prove. One should always
consider other factors that may have influenced behaviour and opinion changes (Heuer, 1990). To
measure effectiveness, three potential indicators can be analysed: quantity of activity, opinion polls, and
behavioural indicators (Heuer, 1990, pp. 25–28).
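Heuer's anatomy of an influence operation lends itself to a simple schematic. The data structure below is an illustrative sketch, not code from any cited source: the class InfluenceOperation and its fields are hypothetical names, chosen to mirror the initiator-channel-target-goal structure, the four-step funnel, and the rule that only completed behaviour change makes an operation effective.

    from dataclasses import dataclass, field

    # The four principal steps a message must pass through (Heuer, 1990, pp. 8-18)
    STEPS = ("perception", "understanding/evaluation",
             "attitude change", "behaviour change")

    # Potential indicators for measuring effectiveness (Heuer, 1990, pp. 25-28)
    INDICATORS = ("quantity of activity", "opinion polls", "behavioural indicators")

    @dataclass
    class InfluenceOperation:
        initiator: str  # e.g. a foreign state
        channel: str    # e.g. social media
        target: str     # e.g. the domestic general public
        goal: str       # e.g. influencing a specific decision
        steps_completed: list = field(default_factory=list)

        def advance(self, step: str) -> bool:
            """A step only counts if every earlier step in the funnel succeeded."""
            if len(self.steps_completed) == len(STEPS):
                return False
            if step != STEPS[len(self.steps_completed)]:
                return False
            self.steps_completed.append(step)
            return True

        @property
        def effective(self) -> bool:
            # Only the final step, behaviour change, makes an operation effective
            return self.steps_completed == list(STEPS)

    op = InfluenceOperation("foreign state", "social media",
                            "domestic general public", "sway a vote choice")
    for step in STEPS[:3]:
        op.advance(step)
    print(op.effective)  # False: attitudes shifted, but no behaviour change yet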
Sarts (2021) explains how the new information environment created a favourable outlook for
foreign adversaries' disinformation campaigns and analyses disinformation concerning national security
and weakening democracy (Bennett & Livingston, 2018; Tenove, 2020). Moreover, the new information
environment has led to new ways of reality construction, as developments in communications have led to
new ways of meaning production (Till, 2020, p. 4) and options to employ reflexive control (Till, 2020,
p. 7). First, Tenove (2020, p. 521) argues that disinformation is a threat to a state's self-determination as
it undermines states' sovereignty. Foreign influence undermines democracies' ability to rule themselves
and compromises the selective empowerments that enable citizens to contribute to giving themselves
rules. Second, disinformation is seen as a threat to accountable representation, as disinformation
threatens electoral integrity and democracy. The most straightforward use of disinformation is spreading
wrong information about voting processes, candidates, or election issues, thereby undermining electoral
integrity. Third, disinformation is seen as a threat to political deliberation. Disinformation is aimed at
degrading the quality of discourse within society. Disinformation influences the 'opinion formation' and
the 'collective will formation', leading to less well-informed public decision-making. Targeting social
groups with false claims, conspiracy theories, chauvinistic language, and imagery provokes moral
revulsion toward candidates and officials (Tenove, 2020, p. 528). An increasing number of false claims
leads to degraded deliberation in society. These false claims are often reshared by domestic public
figures, such as politicians, and help spread the disinformation campaigns. According to Lanoszka
(2019, p. 237) and Tenove (2020), disseminators call those domestic public figures 'useful idiots' as they
share the disseminators' preferred disinformation, sometimes unknowingly, so that the message gains its
circulation in society. Analysis shows that domestic political candidates, journalists, and citizens played
significant roles in spreading disinformation, promoting foreign interest (Tenove, 2020, p. 519).
Distinctions can be made between fully employed agents of influence (like the Internet Research
Agency), locally recruited agents of influence (activists, politicians, etc.) or unwitting agents of
influence (effective receptors and spreaders) to plant disinformation, discredit viewpoints or cause
confusion (Till, 2020, pp. 7-11).
Bennett & Livingston (2018) argue that the spread of disinformation can be traced to many
democracies' growing legitimacy problems. The radical right, for example, disregarded the mainstream
media in recent years, which has led parts of society to distrust media, institutions and
governments. Besides, it has helped to establish alt-right alternative media promoting opposing versions
of daily reality. Algorithmically enabled communities of like-minded people now exist on scales not
captured by terms like 'filter bubbles' (Bennett & Livingston, 2018, p. 125). Users have become enclosed
within specific self-selected groups, and views are shared with like-minded people on social media.
People are more willing to believe things that fit with their worldview. Over time, fake news sources
become more trustworthy to those people than legitimate news sources. The availability of a firehose
of disinformation can slowly alter opinions (Prier, 2017). Strategic disinformation is distributed and
promoted through conspiracy theories that often mimic journalism formats (Bennett & Livingston,
2018). The narratives created on these alternative media platforms often cycle back through the
mainstream media, repeating the disinformation-amplification-reverberation (DAR) cycle. At the core of
Bennett & Livingston's (2018) argument is the breakdown of trust in democratic
institutions of press and politics, which has led to increased space for populism in many democracies.
Disinformation aims to disrupt the institutional order, undermine politicians, and create confusion
(Bennett & Livington, 2018, p. 130). Developments in democracies, such as globalisation and
privatisation, have led to a situation that Crouch (2004) indicated with the term post-democracy. By
Crouch's definition: "A post-democratic society continues to have and to use all the institutions of
democracy, but in which they increasingly become a formal shell. The energy and innovative drive pass
away from the democratic arena and into small circles of a politico-economic elite” (London School of
Economics, 2013). For example, privatisation and globalisations have led to the breakdown of core
processes of political representation and declining authority of institutions, which has further led to
disbelief and disenchantment within democratic societies. Strategic disinformation by foreign and
national actors is used to exploit these vulnerabilities in democracies. Combined with the radical right
who reject the core institutions of press and politics, along with the authorities who speak through them,
there is a growing demand for alternative information and leadership that explains how things got so out
of order (See Appendix B, Interview, A.J.H. Bouwmeester). Depending on the country, “one finds a
mix of sources, including (a) alt news sites promoting ethnic nationalism, anti-immigrant and refugee
hate news, and globalist conspiracies, along with tie-ins to daily national political news developments;
(b) party and movement website networks such as those run by the Austrian Freedom Party, with links
to Facebook and social media accounts of leaders supplying updates on party news, interspersed with
'nostalgic' nationalist propaganda; (c) foreign 'non-linear warfare' operations (a term coined by Putin
advisor Vladislav Surkov) aimed at destabilising elections and governments; (d) along with enterprising
fake news businesses springing up in the 'attention economy'” (Bennett & Livingston, 2018, p. 128).
Disinformation, with its potential impact, is widely considered a threat to democracy, but it has mainly
to do with the current state of democracy itself. Political and societal polarisation, declining trust in
institutions and media, and the rise of populism and strongman politics add to the feeling that Western
liberal democracy is under pressure (Gerrits, 2018, p. 6).
2.4 Disinformation in the Struggle for Power
As disinformation is a tool in the struggle for power to achieve state interest and generate favourable
outcomes, how effective is disinformation, as a hybrid tool, in the international system? Lanoszka (2019)
argues that disinformation should not work in a world of anarchy because states are not likely to use
information from adversaries due to the lack of trust in information coming from another state.
Information from other states will be disregarded, argues Lanoszka (2019, p. 234). However, this implies
that states can identify information coming from another state, which might be difficult, especially from
an ally. Moreover, he argues that disinformation should not work because people are already
ideologically biased. It will, therefore, be challenging to bend people's opinions on specific facts with
disinformation. Furthermore, it is hard for an adversary to change the minds and bend the public opinion of
a sufficient share of the target population to get favourable outcomes in the electoral system (Lanoszka,
2019, pp. 235-237). Moreover, according to Lanoszka (2019, p. 236), citizens are not likely to believe
information that is much in contrast with their ideological beliefs, and average citizens will neither grasp
balance of power politics nor heed faraway, indirect internal threats to national security. Citizens of
states will tend towards scepticism about information favourable to an adversary. However, one should
always consider disinformation that fits ideological beliefs and narratives.
According to Gerrits (2018, p. 5), the primary objective is to weaken democracy by confusing
and misleading populations, to benefit from other governments' decisions, and to increase international
influence. As described by many authors, informational developments give increased opportunities to
spread biased and fake messages. The Brexit vote (2016) and the US elections (2016) raised international
awareness of the dangers of disinformation. Recent developments gave rise to the feeling that Western
democracies are under pressure, and given the potential impact disinformation can have within those
feelings, disinformation is considered a danger to democracy (Gerrits, 2018, p. 6). The goal of
disinformation is to influence and undermine political processes in various countries. Disinformation is
not an end but serves a larger political objective. The Russian government, for example, considers
disinformation as an essential aspect of International Relations and uses disinformation to weaken
Western dominance. According to Gerrits (2018), the Russian aim is not to preach its own point of view;
by spreading false information, or simply as much information as possible, the goal is to bewilder and
debilitate with prejudiced messages. By weakening other states, Russia can accomplish
more strategic goals in the international system and become more powerful, which is in line with the
Realist interpretations of the international system. The level of effectiveness has to do with domestic
circumstances. Developing technologies give, however, many new opportunities to wage disinformation
campaigns. As Gerrits (2018, p. 21) argues: "disinformation seems at most a soft security challenge.
The domestic and international effects of disinformation are causally related. Misleading, confusing and
dividing the population in other countries may be an objective in and of itself, but for disinformation to
have serious international consequences, manipulated ideas among significant parts of the population
need to be translated into state policies, which reflect the foreign policy ambitions of the disinforming
state. It is not impossible. Brexit is a political event of great strategic importance. It undermines the
global position of the European Union; it favours its competitors. Brexit has strategic effects in terms of
international alignment and balance of power. The point, however, is that there is no compelling
evidence that the Brexit vote was decisively manipulated from abroad".
2.5 Operationalisation
Disinformation aims to create tension along political fault lines and uses existing forms of disbelief and disenchantment by targeting vulnerabilities. Disinformation can be aimed at specific frames, or create frames, to mobilise social movements in a country and undermine state institutions. When disinformation is aimed at existing narratives, frames, or disbelief within a society, it can lead to mobilisation. Disinformation can be targeted at existing problems, such as injustice frames, to mobilise people. When disinformation resonates with people's beliefs and everyday life experience, is culturally resonant, and fits existing narratives, campaigns can succeed in mobilising and motivating people. Already existing frames can be used by the disseminator of disinformation, which allows the target to perceive the message (Heuer, 1990, p. 9). They may mobilise existing opinion, define political debate, or provide ammunition for it. For example, disinformation can play a role in eroding trust in institutions and the media by framing the disinformation within this existing narrative and making use of movements that already believe in opposing truths.
Social movements and framing are interesting to analyse with regard to disinformation. Frames can tell us where disinformation can, or will, have the most effect and mobilise people to create desirable outcomes. For example, as Bennett & Livingston (2018) analysed, disinformation can be aimed at groups that already distrust the current institutional order and further exacerbate and accelerate the declining trust. Disinformation can be targeted at existing anti-migrant frames to create mobilisation against migration. When disinformation is targeted at the right audience (e.g. true believers), in the right way (fitting biases, existing beliefs, cultural narratives, etcetera), and through the proper channels (fora, social groups, alternative news sites), it can have tremendous and desirable effects. A slight change in public attitudes may significantly impact government policy; a swing of several percentage points may change a government, with dramatic policy consequences (Heuer, 1990, p. 5). The more persons are moved, and the further they are moved, the more successful the operation (Heuer, 1990, p. 16). The causal relationship between disinformation and influence is difficult to prove, as many other variables can cause a change in opinions, attitudes, and behaviour (Heuer, 1990, p. 3). The essential link between disinformation, political behaviour, and political outcomes must be analysed further (Gerrits, 2018, p. 19).
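Heuer's criterion can be made concrete with a small illustration. The sketch below is not Heuer's own formalisation but a minimal, hypothetical one, assuming attitudes can be measured on a numeric scale before and after a campaign; the function name and the toy data are invented for illustration.

```python
# A minimal, hypothetical formalisation of Heuer's success criterion:
# an operation is more successful the more people are moved and the
# further they are moved. Attitudes are assumed to lie on a numeric
# scale (e.g. -1 = strongly opposed, +1 = strongly in favour of the
# disseminator's preferred position).

def campaign_effect(before, after):
    """Return (number of people moved in the desired direction,
    total attitude shift across the target population)."""
    shifts = [b2 - b1 for b1, b2 in zip(before, after)]
    moved = sum(1 for s in shifts if s > 0)
    return moved, sum(shifts)

# Toy data: five citizens, attitudes before and after exposure.
before = [-0.2, 0.1, 0.4, -0.5, 0.0]
after = [0.1, 0.1, 0.6, -0.4, 0.2]
moved, total_shift = campaign_effect(before, after)
print(moved, round(total_shift, 2))  # 4 people moved, total shift 0.8
```

On this reading, two campaigns could be compared by how many people they move and how far, which is exactly the measurement problem the case studies confront: the "before" attitudes are rarely observable.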
Social media platforms and the broader political economy of the internet create the possibilities for online interactions and targeting, enabling a form of political intervention focused on destabilising perceptions of reality and recruiting users to construct new, politically advantageous realities (Till, 2020). Disinformation can be mixed with news reports and documented events to enhance authenticity (Bennett & Livingston, 2018, p. 125). New technologies, like deepfakes, extend the possibilities for the dissemination of disinformation (Gerrits, 2018, p. 20; Sarts, 2021).
The most significant of those threatening actions is disinformation that seeks to exert a goal-oriented effect on public opinion or decision-makers through reflexive control. Using reflexive control against other states will inevitably have a geopolitical impact (Thomas, 2004, p. 254). Disinformation can target existing domestic disenchantment to exacerbate its effects and further polarise and weaken an already vulnerable society, which can help an adversary. The chief task of reflexive control, for example, is to locate the weak link of the filter (frame, narrative) and exploit it (Thomas, 2004, p. 241). This requires an in-depth study of the enemy, frame, or narrative at which the disinformation is targeted (Thomas, 2004, p. 243) in order to mobilise and motivate people to change behaviour. In the long term, it can yield desirable outcomes when trust in institutions erodes, and it might change the balance of power. Disinformation does not have to be linked to another ideology to be effective; it can use pre-existing difficulties and vulnerabilities in states and erode the line between true and false. It can deepen countries' existing divisions, create confusion among populations, and mobilise or persuade minority groups to destabilise state systems (Lanoszka, 2019). Deepening the rifts within a country is an effective way to weaken an opponent internally, weakening its leadership and ability to act (Sarts, 2021, p. 27).
For example, in the Russian case, the spreading of disinformation is geostrategic, aimed at weakening democratic institutions and becoming more powerful in the international system. Today, Russia uses the anti-liberal wave to weaken Western dominance (Gerrits, 2018, p. 9). Disinformation is most effective where and when political opinions are already polarised. It confirms, rather than creates or challenges, pre-existing prejudices, a mechanism known as confirmation bias (Gerrits, 2018, p. 14). Once established among sizable enough populations, disinformation can threaten domestic orders. Many citizens actively seek information to support political activities and identities that stem from material and emotional dislocations from the modern national and global order. False information may engage with deeper emotional truths for people who wilfully defy reason (Bennett & Livingston, 2018, p. 135).
To conclude, Gerrits (2018) and Lanoszka (2019) analyse disinformation in terms of its short-term effects. As they both argue, it does not influence the balance of power. However, weakening other states domestically can eventually lead to more favourable outcomes for longer-term strategic goals; long-term analyses are needed to give better insights. As they argue, disinformation is not an end in itself but part of a larger political objective. "Soviet disinformation operatives know that single covert action, however precisely designed, cannot tip the balance of power between the Western Alliance and the Communist bloc. However, they believe that the mass production of propaganda and disinformation over a period of several decades will have a significant effect. The strategy seems to work" (Bittman, 1985, p. 2).
Therefore, using existing domestic disenchantment is a valuable strategy to weaken other states by exploiting vulnerabilities and exacerbating them. When disinformation is correctly aimed and framed within these vulnerabilities, it can affect International Security. Moreover, disinformation is a less visible tool that does not require extensive resources, and technological developments might make it more powerful. It is hard to establish causation between disinformation and political outcomes, such as Brexit, but confusing and misleading parts of populations can ultimately lead to favourable decisions. Even if the effects are minor and only minority groups are affected, it can have an influence. As Gerrits (2018) argues, disinformation does not have to be ideologically connected to a particular state, which critiques the analysis by Lanoszka (2019) that states and societies will disregard information coming from an adversary. Most of the time, individuals cannot link disinformation to a specific state; it is there to confuse and mislead and to exploit vulnerabilities further. As analysed in this part of the thesis, disinformation can be aimed at specific existing frames and vulnerabilities and can mobilise groups or influence people's decision-making. Exploiting particular vulnerabilities in societies can exacerbate and accelerate existing problems in countries and have desirable strategic effects for the disseminator. Disinformation has the potential to narrow the range of policy responses and weaken state structures and, thereby, to make states act in the interest of other states (power). This, therefore, constitutes a threat to security, as identified by Ullman (1983) and Ayoob (1995).
Using four case studies, this thesis compares different examples of disinformation to measure to what extent they influenced national and International Security. Empirically, the case studies will be analysed against the theoretical foundations. For this analysis, use will be made of figure 1, a theoretical analysis model based on the theoretical foundations.
Figure 1: theoretical analysis model
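Since the stages of the model recur throughout the case-study chapters (initiator, channel, target, goal, and result, as also enumerated in figure 3), a minimal sketch of how a case can be recorded against the model is given below. This is an illustrative representation only; the class and field names are invented and are not part of the thesis's formal apparatus.

```python
from dataclasses import dataclass, field

@dataclass
class DisinformationCase:
    """One case analysed against the theoretical analysis model: who
    initiates, through which channels, at which target, with what goal,
    and with what observed result."""
    initiator: str
    channels: list
    target: str
    goal: str
    result: str
    frames_used: list = field(default_factory=list)  # existing narratives exploited

# Example: the Lisa case as described in section 3.1.
lisa = DisinformationCase(
    initiator="Russian state-controlled media (Channel One, RT, Sputnik)",
    channels=["state media", "social media groups"],
    target="German anti-immigrant audiences",
    goal="weaken support for Chancellor Merkel and her migration policy",
    result="short-term mobilisation; long-term effects unclear",
    frames_used=["anti-immigrant narrative", "distrust of the authorities"],
)
```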
3 Disinformation in the Contemporary World: an Analysis
3.1 Disinformation in Germany: Lisa and the Bundestag Elections (2016-2017)
On January 11th 2016, a 13-year-old girl of mixed German-Russian descent failed to return home during
the evening. She went missing for around 30 hours, and when she resurfaced, she claimed she had been
abducted and raped by migrants (Kampfner, 2020, pp. 13-14; Mankoff, 2020, pp. 8-11). The German
police investigated the case and soon found out that she had been lying to spend the night at a friend’s
house. Later on, Lisa herself admitted that she had been lying. Based on the investigation and Lisa's confession, the police quickly debunked the story as false. Despite the speed of this conclusion and the German police's statement three days later, the story had taken on a life of its own (Mahairas & Dvilyanski, 2018, p. 24).
The story about the rape and abduction, although untrue, went viral. Channel One, a Russian
state-sponsored channel, began the saga, and the story was then picked up by Russia Today (RT), RT
Deutsch and Sputnik. These three media outlets are well known to be under Russian governmental
control (Mahairas & Dvilyanski, 2018, p. 24). The story got international attention when it was picked
up by Breitbart, a US-based far-right news site (Kampfner, 2020, p. 13). The overt (open) media activity was then coupled with covert (hidden) media activity: the story was shared in anti-immigrant Facebook groups and on the anti-refugee website Asylterror, which was later determined to have links to Russia (Polyakova, 2019). These actions led to a quick spread of the news, with various social media and right-wing groups distributing the information on the internet (Mahairas & Dvilyanski, 2018, p. 24). Some locals became so angry that they mounted mass demonstrations against immigrants with an anti-Muslim agenda, joined by concerned citizens from further afield (NTV.de, 2016). Moreover, demonstrations against migration were organised in Facebook groups involving representatives of the German-Russian minority and neo-Nazi groups (Mahairas & Dvilyanski, 2018, p. 24).
Despite attempts to calm the uproar, subsequent days saw attacks against Muslim immigrants (Berliner Zeitung, 2017). The story escalated further when claims surfaced that the German authorities had debunked the story in order to protect migrants (Mankoff, 2020, pp. 8–9). Within a week, demonstrations broke out in Russian-German neighbourhoods in various cities around the country, including a demonstration of around 700 people in front of the chancellery (Rutenberg, 2017). According to Ben
Nimmo, “pro-Kremlin media kept the story circulating long after it had been debunked, generating
significant tension and anti-migrant feeling in Russian and far-right groups” (Nimmo & Aleksejeva,
2017).
The Russian connection at first was indirect, but on January 26th, Russian Foreign Minister
Lavrov gave a press conference in Moscow calling on the German government not to “sweep under the
rug” the allegations and ensuring that “these migration problems do not lead to a politically correct
attempt to varnish the truth on behalf of some domestic political goals” (Russian Ministry of Foreign
Affairs, 2016).
3.1.1 Germany and Disinformation
Germany is one of the main targets of Russian disinformation (Baczynska, 2021). In most countries, disinformation is aimed at right-wing narratives, and in Germany it focuses mainly on attacking and vilifying (Muslim) immigrants, as the refugee situation has been at the top of the news agenda for a long time (Zimmermann & Kohring, 2020, p. 217). The disinformation is, as mentioned, overwhelmingly xenophobic and frames immigrants negatively, which fosters negative attitudes towards migrants and raises the salience of migration as a problem that is not sufficiently addressed by the political system (Zimmermann & Kohring, 2020, p. 219). Germany is on
the frontline of Russian influence. Russia is deploying different approaches to accomplish similar ends: an economic foothold, political influence, and ideological infiltration. The ultimate goal is to undermine the German population's faith in liberal democracy and its institutions (Kampfner, 2020, p. 1). Therefore, Russia supports two political parties, on the extreme left (Die Linke) and the extreme right (AfD), and adopts disinformation messages that comply with the themes of these parties. One of RT Deutsch's most-viewed programs, for example, is Der Fehlende Part (The Missing Part), which provides a regular diet of negative coverage about immigration and job security (Kampfner, 2020, p. 13).
Research into the significant German fact-checking websites shows that nearly all "disnews" stories in Germany contained right-wing implications, such as scepticism toward the European Union (e.g., "The European Union is going to abolish cash money starting in 2018"), attacks on politicians (e.g., "The father of the candidate for chancellorship Martin Schulz was a captain of the SS and commander of the concentration camp Mauthausen") and, above all, the exclusion of migrants and refugees (e.g., "Refugees from Arabia cause hepatitis A epidemic across Europe") (Zimmermann & Kohring, 2020, p. 221). The Lisa story exploited the existing divisions among Germans relating to Arab-migrant issues and focused on existing fault lines in society (Mahairas & Dvilyanski, 2018, p. 24).
The fabricated Lisa case was supposed to fit the current Kremlin narrative, stressing that Europe
cannot cope with the refugee crisis (Janda, 2016, p. 2). Supposedly, the disinformation case was built
on the objective to discredit Chancellor Merkel’s reputation before the Bundestag elections by reducing
the support for the migration and immigration policy in Germany (la Cour, 2020, p. 712), and especially
blaming Chancellor Merkel's so-called 'welcome policy' (Zimmermann & Kohring, 2020, p. 231). This was in Russia's strategic interest, as Chancellor Merkel's position on Ukraine was not in line with Russia's views (la Cour, 2020, p. 712). The story was supposedly created to weaken support for Merkel, which in turn would weaken her ability to stand up against Russia's actions in Ukraine (la Cour, 2020, p. 715; Mahairas & Dvilyanski, 2018, p. 24). The Lisa case surfaced only a week and a half after the New Year's Eve attacks on German women by migrant gangs in Cologne, when emotions were still raw and anti-migrant sentiment high (Mankoff, 2020, p. 9).
Germany was increasingly worried about the disruptive effects of Russian disinformation campaigns during the 2017 Bundestag elections. While the run-up to the elections saw an uptick in the circulation of false or misleading information online, the feared leak of confidential documents (as happened in the United States and France) did not occur (Mankoff, 2020, p. 10).
Zimmermann & Kohring (2020) indicate that the less people trust the established news media and politics, the more they believe online disinformation is accurate, and that disinformation beliefs lower the odds of voting for the main governing party (Zimmermann & Kohring, 2020, p. 226). Their results (2020, p. 230) indicated that former supporters of the CDU/CSU (Chancellor Merkel is part of the CDU) were more likely to refrain from voting for this party the more they believed disinformation. Instead, these voters tended to support either the AfD or the SPD. The AfD grew stronger, supported by former CDU/CSU voters with right-wing attitudes (Zimmermann & Kohring, 2020, p. 230). Believing disinformation alienated voters from the main governing parties and notably drove them into the arms of the AfD. A plausible explanation is that disinformation in Germany mainly focuses on the migrant situation and blames Chancellor Merkel, who was deemed responsible for the refugee situation; this, in turn, drove CDU/CSU voters with right-wing attitudes into the arms of the AfD and kept them from voting for the governing party (Zimmermann & Kohring, 2020, p. 231). In the 2017 Bundestag elections, the AfD rose from 0 to 94 seats in parliament (The Federal Returning Officer, 2017). Although empirical evidence in Germany shows that disinformation may affect individual vote choice and thereby undermine democratic principles, disinformation should not be seen as an isolated phenomenon but rather as a symptom of more deep-rooted disaffection with the news media and the political system (Zimmermann & Kohring, 2020, pp. 232–233). It is impossible to argue that disinformation on its own has led to a significant increase in AfD voters. As argued, disinformation builds on existing frames, narratives, and deep-rooted societal problems, and tends to amplify them.
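The direction of these findings can be sketched schematically. The toy model below encodes only the signs of the relationships reported by Zimmermann & Kohring (higher media trust raises, and higher disinformation belief lowers, the probability of voting for the governing party); the logistic form and the coefficients are invented for illustration and are not the authors' estimated model.

```python
import math

def p_vote_governing(media_trust, disinfo_belief):
    """Toy logistic sketch: the probability of voting for the main
    governing party rises with media trust and falls with belief in
    disinformation. Coefficients are illustrative, not estimated."""
    score = -0.5 + 1.5 * media_trust - 2.0 * disinfo_belief  # inputs in [0, 1]
    return 1 / (1 + math.exp(-score))

print(round(p_vote_governing(media_trust=0.8, disinfo_belief=0.1), 2))  # ~0.62
print(round(p_vote_governing(media_trust=0.3, disinfo_belief=0.7), 2))  # ~0.19
```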
Merkel was re-elected Chancellor in 2017 but lost a substantial share of voters (The Federal Returning Officer, 2017). Societal support for Merkel was weaker, but she was still able to govern. It would require an in-depth study to determine whether the results of the 2017 Bundestag elections led to a different stance of Chancellor Merkel towards Russia, something that goes beyond the scope of this thesis. However, empirical evidence shows that Russia is trying to weaken Western governments such as Germany's for its strategic geopolitical interest, building upon deep-rooted problems. Support for Merkel declined, but one should be careful not to attribute this decline solely to disinformation. Evidence shows disinformation affects voting behaviour, but the scale remains unclear.
3.1.2 Analysis and Assessment
The Lisa case is a relatively straightforward example of how disinformation campaigns can work through the theoretical analysis model. As shown in figure 2, the story, initiated by Russian state-controlled media sources, found its way into society and led to mobilisation and changes in behaviour.
As mentioned, Germany is one of the main targets of Russian disinformation, which seeks to weaken support for Chancellor Merkel, amplify the voices of nonmainstream figures on both the left and the right, and weaken transatlantic unity (Mankoff, 2020, p. 9). The missing girl provided an excellent opportunity for a disinformation campaign on an existing fault line and narrative. The story was aimed at existing German anti-immigrant narratives, which provided a breeding ground for it. The story was spread by unwitting agents of influence through overt and covert activity, who proved effective receptors and spreaders. These unwitting agents were primarily active within far-right Facebook groups, which supplied people already believing in anti-immigrant narratives. The story, therefore, made use of an effective information environment to reach the specific part of society where it would have the most effect and be perceived favourably. The case gives a clear example of how Russia acted quickly on a current development in Germany and used the opportunity to create a story to affect society. This indicates that Russia had, correctly, made an in-depth study of the target in order to be successful (Thomas, 2004, p. 243).
Moreover, the story was created within suitable societal characteristics (frame, narrative, and true believers), so that it spread quickly and produced the desired short-term changes in attitudes and opinions. For a story to fit into a frame and lead to mobilisation, it must resonate with the target (Benford & Snow, 2000, pp. 619-622; Prier, 2017) and support existing beliefs (Heuer, 1990, pp. 11-12). The story was consistent with already established beliefs, claims, and actions against immigration. It provided empirical credibility to an anti-immigrant frame, as the false story gave seeming empirical evidence for the beliefs of these groups. The story was thereby shared within specific groups, gaining credibility within these 'bubbles' of true believers in anti-immigrant narratives. Ultimately, the story had short-term effects, leading to demonstrations and attacks and thus mobilisation, and can be considered successful in that it affected attitudes and opinions within specific (target) groups. In the long term, the story fitted a broader Kremlin narrative, which was supposed to weaken the German (liberal) government primarily through far-right narratives. The story itself should, therefore, be understood in a broader context of disinformation. The case did not stand on its own but had to contribute to the weakening of Chancellor Merkel and German democracy as part of a longer-term strategic goal. The long-term effects are difficult to demonstrate, as the campaign made use of existing fault lines; specific election results can therefore not be blamed solely on cases of disinformation. The campaign amplifies nonmainstream voices and tries to undermine democratic principles. Using existing narratives and aiming at a specific audience, the story was perceived and evaluated, and it gave the specific audience the opportunity and incentive to act (Heuer, 1990).
It is, however, impossible to link the case to specific policy effects. The Lisa story is a clear example of how disinformation is used on an existing fault line to create noise and confusion among populations and can relatively easily lead to forms of short-term mobilisation (e.g. demonstrations and attacks). The doubt created by the story did not disappear, as some continued to argue that the police had debunked the story to protect the migrants. A specific existing narrative and frame were articulated and exacerbated with disinformation, with the geopolitical goal of narrowing the Chancellor's policy options. This weakening fits within the longer-term strategic goals. The already established anti-immigrant narratives provided the breeding ground for the story and helped the disinformation spread rapidly within the available information environment. Russia tried to, or effectively did, control specific parts of the German population, aiming to let them make favourable voluntary decisions that would weaken the German government's responses, its ability to act, and support for the existing migration policy, favouring Russia (Thomas, 2004, p. 254). The story is not a case on its own but a single case in an information environment with multiple disinforming pieces of information. It fell into an established pattern of Russia seeking to destabilise democratic systems by undermining specific politicians who take an anti-Russia line while reinforcing the alt-right and far left (Kampfner, 2020, p. 14).
Figure 2: a visualisation of the Lisa case
3.2 Disinformation in the Baltics: NATO’s Enhanced Forward Presence (2016)
In July 2016, NATO decided to enhance its presence in the Baltics and Poland under the Enhanced
Forward Presence (eFP). The eFP was intended to strengthen the deterrence and defence posture of
NATO in the Baltics as a reminder of Article 5 of the alliance's treaty: "an attack against one is considered an attack against all" (NATO, 1949).
Russia interpreted this as a hostile act and pledged to respond (Nimmo et al., 2018).
Russian state media started to spread a series of distorted and fake stories about the eFP on their platforms and on social media, in addition to increased Russian military activity in the Baltic Sea (Barojan, 2018; Nimmo et al., 2018). The Atlantic Council's Digital Forensic Research Lab (DFRLab) identified six key narratives in the hostile reporting, all intended to undermine NATO and the enhanced forward presence in the Baltic states and Poland. The following six key narratives, spread in the Baltics, were identified by the DFRLab (Barojan, 2018; Nimmo et al., 2018):
“1. NATO is unwelcome, and NATO troops are occupants;
2. The Baltics and Poland are paranoid or "Russophobic";
3. NATO is provocative and aggressive;
4. NATO is obsolete and cannot protect its allies;
5. NATO, the Baltics, and Poland are sympathetic to the Nazi ideology;
6. The Baltics are artificial countries and unreliable partners”.
Of the six narratives, the first four were the most dominant and particularly noticeable on Russian-funded state media like RT and Sputnik. Certain narratives were more prominent in some countries than in others. For example, Lithuania has publicly criticised Russia's aggression in Ukraine and labelled Russia a threat to European security; in response, negative narratives disproportionately accused Lithuania of being paranoid and Russophobic (Barojan, 2018). Second, in Poland, apart from the enhanced forward presence, there is an additional presence of U.S. troops in the country. Therefore, disinformation in Poland centred around the narrative that NATO actions were aggressive and provocative (Barojan, 2018). Third, the narrative that NATO is obsolete and cannot protect the Baltic states saw an increase after a U.S. outlet published an article about the potential threat the Russian military poses to NATO (Nimmo et al., 2018). Another story focused on Latvia and argued that NATO soldiers posed a threat to public safety because they were allowed to carry loaded guns in public. This story was only partly true: soldiers were not allowed to go around armed in public, as the article suggested, but a legal amendment allowed NATO soldiers to cross the border into the country while armed, to speed up reinforcement (Nimmo et al., 2018). The spreading of narratives tended to follow real-world events concerning Russia, NATO, and the Baltics, with messages sometimes containing half-truths. The language used in spreading the narratives was primarily Russian, suggesting that native Russian speakers in the
Baltics were the targeted audience of the disinformation (Barojan, 2018; Nimmo et al., 2018). The
deployment received significantly more negative coverage in the Russian language than in the local
language. In the local language, “the coverage tended to be either factual and neutral in tone or optimistic
towards the deployment” (Nimmo et al., 2018). The Strategic Communications Centre (STRATCOM)
of NATO reported in 2018 that Russian-language bots were responsible for 55 per cent of all Russian-
language messages in the Baltics about NATO (Robbins, 2020). The narratives, therefore, tended to speak to the Russian-speaking minorities in the Baltics, considering both the narratives and the language used (Nimmo et al., 2018). In Latvia and Estonia, these minorities comprise around 25% of the population; in Lithuania, the figure is around 7% (Courrier d'Europe, 2019). However, most ethnic Latvians and Estonians understand Russian, making Russian media accessible to many Baltic citizens (Bergmane, 2020, p. 484).
All stories focused on discrediting NATO, undermining its capabilities, and promoting a pro-Russian view. Several posts generated significant attention with thousands of views, but many others had little or no impact. The overall reporting effort, however, was significant (Nimmo et al., 2018). The effects were limited: widespread engagement with the articles, in terms of shares, appears weak. According to the DFRLab, the four most common narratives acquired little traction when the enhanced forward presence deployment had just arrived and local impressions and opinions were still forming (Lanoszka, 2019, p. 243). The narratives also seemed to elicit only minor reactions from Russian-speaking readers, even though the messaging was focused on them as the audience most likely to consume news in their native language (Lanoszka, 2019). Consumption data alone, however, give only a limited impression of the campaign's impact and should be compared with attitudes towards NATO and government policies during the campaign. A poll conducted by RAID, a Lithuanian research company, for the Ministry of National Defence found that 81 per cent of Lithuanians surveyed supported NATO and 82 per cent supported its presence on Lithuanian territory (DELFI, 2016). Over the years, support for NATO remained stable, with a slight increase of 3 percentage points in support for NATO membership (LRT, 2017). These attitudes were broadly shared in the region: a 2017 survey by the Estonian Ministry of Defence found that 78 per cent of Estonian speakers are confident in NATO, although only 24 per cent of Russophones in Estonia share this assessment. Confidence in state institutions, including the security apparatus, increased from previous years (Kivirähk, 2015, p. 4). A November 2016 poll in Latvia reported that support for NATO remained stable, with 59 per cent finding that NATO enhances national security (Ministry of Defence of the Republic of Latvia, 2018). Latvia is divided between Latvian and Russian speakers on NATO, with Latvian speakers reporting more favourable attitudes (73 per cent versus 38 per cent) (Latvijas Sabiedriskais medijs, 2016). Moreover, in all three countries, non-governmental surveys throughout 2016 reported similar findings: Baltic domestic audiences see NATO as protection (Smith, 2017).
To gauge the effectiveness of the disinformation campaign, the best indicator is government policy on military spending. The uptick in Russian disinformation does not seem to have negatively shaped military spending; defence spending continued to rise (IISS, 2018; Lanoszka, 2019, p. 246).
3.2.1 The Baltic Case
Three decades after the fall of communism and the regaining of independence from the Soviet Union, there are increased concerns among the Baltic states and their Western allies over their security, given the rise of a more aggressive Russia over the last decade (Cesare, 2020). The Baltic states, especially Estonia, fear Russian threats given their Russian-speaking ethnic minorities, who remained in the country after independence, their digital inclination, and their proximity to Moscow (Robbins, 2020). Russia looks for ways to interact with and support Russian ethnic minorities in the Baltics to build its influence (Cesare, 2020). Moreover, the Baltic states are considered vulnerable to Russia, as all three share a border with Russia and possess small militaries. The Baltic states worry that Russia is trying to
destabilise them from within and attempt a Crimea-like operation to restore imperial control over them
(Lanoszka, 2019, p. 241). As former Soviet republics with significant Russian-speaking minorities,
Estonia and Latvia often seem to outside observers to be breeding grounds for future conflicts
(Bergmane, 2020). The disinformation campaigns might be part of a more extensive campaign to isolate
the Baltic states from the West and NATO and bring them into the sphere of Russian influence
(Lanoszka, 2019, p. 241). With the ethnic Russians living in the Baltic states, the Baltics fall within the scope of the Russian 'Compatriot Policy'. Russia's compatriots living abroad are "individuals who live outside the borders of the Russian Federation itself yet feel that they have a historical, cultural, and linguistic linkage with Russia" (Zevelev, 2016). This concept has developed into a series of laws, state policies, and foreign policy decisions (Kallas, 2016) intended to strengthen ties between Russia and Russian speakers abroad (Bergmane, 2020, p. 484). Engagement with the compatriots is seen as one of the main methods of recreating Russia's great-power status and as a tool of Russian influence in the region (Zevelev, 2016). Russia, therefore, could mobilise the Russian minorities in the Baltics to destabilise
the internal politics of the Baltic states (Lanoszka, 2019, p. 242). In addition, since the 1990s, non-citizens remain a noteworthy issue within the Baltic states and in Baltic-Russian relations. (According to Article 1 of the UN Declaration on the Human Rights of Individuals who are not Nationals of the Country in which They Live (1985), a non-citizen is "any individual who is not a national of a State in which he or she is present"; in the Baltics, non-citizens are mostly former USSR residents who were not able to obtain Baltic citizenship after the USSR period.) Non-citizens remain a crucial element in Russia's compatriot policy (Bergmane, 2020, p. 485), which Russian state media can manipulate. After the regaining of independence and the fall of the Soviet Union, these residents, now considered stateless people in the Baltics, refused to go through the citizenship application process (Cesare, 2020). Estonian non-citizens can vote only in municipal elections; Latvian non-citizens cannot vote at all (Bergmane, 2020, p. 485). The Russian Federation has accused these states of human rights abuses, but the Estonian and Latvian governments point out that the path to citizenship is open to everybody willing to learn the official language (Bergmane, 2020, p. 485).
The Baltics have a long history of fighting disinformation and have learned to create effective responses (Thompson, 2019). Estonia, for example, created its own Russian-language channel back in 2015 to counter the narratives spread by Russia and give Russian-speaking groups an alternative to Russian state media (Robbins, 2020). Moreover, the Baltic countries also rely on citizen mobilisation to counter Russian disinformation. These citizens respond to disinformation as the so-called "Baltic Elves", an internet activist group that counters Russian "trolls". The "Baltic Elves" report bots, counter narratives across the Baltics, and monitor news-article message boards (Robbins, 2020). Neutral and optimistic coverage in the official Baltic languages about the enhanced forward presence outnumbered the negative narratives targeting Russian speakers in the Baltic states and Russia (Barojan, 2018).
3.2.2 Analysis and Assessment
As discussed, the disinformation campaign employed during the time of the enhanced forward presence
found little support. Given the indicators used, public opinion and government policies were not
affected, and support for NATO even increased. In addition, considering the number of shares of the
disinformation, the campaign received little traction.
As the campaign's initiator, Russia tried to speak through the known channels (Sputnik, RT and
social media) to the target audiences in the Baltic states. As shown in figure 3, a distinction can be made
between the potential targets (domestic society or Russian compatriots/non-citizens). For the argument of this thesis and the effect on International Security, the disinformation campaign would have to gain a foothold in Baltic society as a whole to change attitudes and opinions on NATO drastically. Therefore, all Baltic audiences are considered the target of the campaign, while keeping in mind that the actual target was probably narrower. In this case, the targeted audience was not receptive to the disinformation: it did not change opinions, as shown in the opinion polls and government policy. The campaign was not received as intended, indicating that the disinformation did not comply with the experience, education, and cultural narratives of the targeted audience (Heuer, 1990), nor did it make use of a domestic network of true believers or fit a belief structure within society (Prier, 2017). The wars in Georgia and especially in Ukraine have antagonised the Baltic societies (Bergmane, 2020, p. 480), which, in combination with the perceived Russian threat, might explain why Baltic societies were not receptive to the disinformation and instead hold positive attitudes towards NATO, which they see as the provider of their security. The targeted audience was therefore presumably the ethnic Russian 'compatriots' living abroad, as the disinformation was mainly disseminated in Russian. Those 'compatriots' were presumably already favouring Russia, which might explain why public opinion did not change: the campaign targeted audiences that were already in line with Russia and fitted their belief structure. The Estonian example of a Russian-language alternative to
Russian state media can also be seen as an indicator that counter-narratives in those societies do their
work.
Moreover, coverage of NATO in the Baltic domestic languages was mainly positive, reflecting the overall narrative in the Baltic states concerning Russia and NATO. If the target was Baltic domestic society, the campaign fell on deaf ears: no use was made of existing frames and narratives, so the disinformation was not perceived (Heuer, 1990). The Baltic publics were already quite positive in their opinions and attitudes towards NATO, and bending opinions is, as analysed in the operationalisation, not a successful method. Because of this lack of success, the disinformation did not lead to the desired policy effects for Russia (Heuer, 1990, p. 5; Thomas, 2004, p. 241). As described by the DFRLab, some of the narratives contradicted each other: a country cannot be Russophobic and unwelcoming to NATO at the same time, and NATO cannot be defenceless against Russia and still deploy the enhanced forward presence. These inconsistencies undermine the overall message (Lanoszka, 2019, p. 243) and could be a sign that Russia is struggling to find a narrative that resonates, as suggested by Benford & Snow (2000, pp. 619-622), with the audience's beliefs (Lanoszka, 2019, p. 243). The campaign thus failed at the level of the target and did not have the desired result.
Overall, the disinformation was mainly targeted at Russian minorities, tended to be a campaign aimed at discrediting NATO in favour of Russia, and was not built on existing domestic narratives and frames. This can primarily be interpreted as Russia seeing NATO as a real threat and trying to impose this narrative on the Baltics and, especially, the Russian minorities. The campaign lacked an in-depth study of the Baltics' inner nature (Thomas, 2004, p. 243) and can rather be seen as a quick response to external developments. Because it did not receive much traction, this narrative is not shared within the Baltics and could not alter the opinions and attitudes concerning the enhanced forward presence needed to affect International Security. Therefore, this specific case seems to be an example of disinformation failing to negatively affect support for NATO, as Russia had hoped. However, given the predominantly Russian language of the narratives, it has to be noted that the goal may not have been to bend domestic audiences but to influence and exacerbate opinions among the 'compatriots', which can be a method to sow disagreement, create divisions and discord among different parts of the population (Gerrits, 2018, p. 5), and mobilise ethnic Russians (Lanoszka, 2019, p. 242). To measure the effectiveness of such a campaign, a more in-depth analysis would have to be made of the ethnic Russian populations and Russia's compatriot policy. Relations between the ethnic communities have historically been tense: different perceptions of history, disagreements over laws affecting Russian minorities, and diverging understandings of the role of the Russian language are some examples of previous issues within the Baltic states (Bergmane, 2020, p. 484). However, over the last six years, there have been no signs of deep dissatisfaction among local Russian communities. According to a 2016 poll conducted in Latvia among Russian speakers, 84.3 per cent feel they belong to Latvia, 43.1 per cent to Europe, and 32.4 per cent and 28.3 per cent to the USSR and/or Russia, respectively (Berzina et al., 2018, p. 27).
Figure 3: the Baltic case. A distinction should be made regarding the possible target of the campaign, which implies a different goal and different results:
1. Initiator: Russia; channel: Russian state-controlled media and social media; target: Russian compatriots and non-citizens; goal: create division, mobilise ethnic Russians, create 'compatriocity'; result: unclear, part of a bigger long-term campaign to increase influence over minorities.
2. Initiator: Russia; channel: Russian state-controlled media and social media; target: Baltic domestic audiences; goal: bend opinions, undermine NATO; result: stable or increased support for NATO, low interaction with the disinformation.
3.3 Disinformation in the United Kingdom: the Brexit Vote (2016)
On the 23rd of June 2016, the United Kingdom (UK) voted to leave the European Union; the "Leave" camp won the popular vote with 51.89% (BBC, n.d.). While Russian influence on the result remains unproven, sources claim that evidence exists that Russia tried to persuade the public to vote to leave the European Union (Gillett, 2017), most visibly in online social media activity. The Russia report states that "Russian influence in the UK is the new normal" and that "the UK is clearly a target for Russian disinformation" (BBC, 2020; Intelligence and Security Committee of Parliament, 2020). The report also states that studies have identified, as evidence, the preponderance of anti-EU and pro-Brexit narratives on RT and Sputnik and the use of Russian trolls and bots on social media to promote these narratives (Intelligence and Security Committee of Parliament, 2020). Therefore, it would be naïve to claim that Russia did not contribute to the referendum's outcome.
Russian disinformation in the UK focuses on targeting minority and separatist groups along the fault lines of society, promoting scepticism about international organisations (such as the EU and NATO) and international partners (such as the US), and thwarting EU cohesion (Ellehuus, 2020, pp. 5–8). The most visible influence campaign was the support for the leave campaign in digital environments during the Brexit referendum. Disinformation circulated mainly on social media or was disseminated on the English-language platforms of RT and Sputnik. The narratives used aimed to exploit pre-existing vulnerabilities of political and societal polarisation by playing up fears about migration and globalisation, promulgating allegations of corrupt candidates and parties, and trying to discredit the European Union by painting it as corrupt, ineffective, and infringing upon the United Kingdom's sovereignty (Ellehuus, 2020, p. 8).
Russian disinformation often observes and amplifies existing anti-EU narratives, and the
approach tends to "flood the zone" with a combination of accurate, half-true, and false information. The
disinformation is also targeted over time and follows real-life events. Days before the referendum,
analysts saw an uptick in tweets linked to Russia-based accounts. Some 150,000 Russian-language Twitter accounts (accounts whose Twitter language settings are set to Russian; this does not mean they tweet in Russian, only that their users understand the language) posted tens of thousands of messages favouring leaving the European Union (Baraniuk, 2016; Ellehuus, 2020, p. 10; Mostrous et al., 2017). As around 55 per cent of British people get their news from social media platforms, there are concerns about the susceptibility of the British public to disinformation (Ellehuus, 2020, p. 15). Research by the Oxford institute, monitoring 313,000 accounts and 1.5 million tweets in the run-up to the vote, indicates that 54% of the tweets were pro-leave, 20% pro-remain, and 26% neutral, and that a third of the tweets came from less than 1% of the accounts (Risso, 2018, p. 80). As research indicates, the amount of influence relies on whether a bot provides information consistent with humans' priors, creating so-called "echo chambers"
(Gorodnichenko et al., 2018, p. 3). During Brexit, the daily volume of pro-leave tweets was always higher than the daily volume of pro-remain tweets. A massive volume of tweets from Russian-language accounts was created only a few days before the vote, reaching its peak on voting day (Gorodnichenko et al., 2018, p. 9). The interaction of humans with Twitter bots depended on whether the bot's information was consistent with the humans' preferences (Gorodnichenko et al., 2018, p. 21). Bots continuously retweeted posts with hashtags that supported the leave side, directed primarily at migration and the lack of control of the borders (Risso, 2018, p. 80).
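The concentration reported by Risso can be made tangible with a back-of-the-envelope calculation. The figures below simply recombine the numbers quoted above (1.5 million tweets, 313,000 monitored accounts, a third of tweets from less than 1% of accounts); they are an illustration of scale, not additional data.

```python
# Back-of-the-envelope illustration of the tweet concentration
# reported above: a third of 1.5 million tweets came from less
# than 1% of the 313,000 monitored accounts.
total_tweets = 1_500_000
total_accounts = 313_000

heavy_accounts = int(total_accounts * 0.01)  # upper bound: 1% of accounts
heavy_tweets = total_tweets // 3             # a third of all tweets

rest_accounts = total_accounts - heavy_accounts
rest_tweets = total_tweets - heavy_tweets

print(round(heavy_tweets / heavy_accounts))   # ~160 tweets per high-volume account
print(round(rest_tweets / rest_accounts, 1))  # ~3.2 tweets per ordinary account
```

Even on this upper-bound reading, the most active accounts posted roughly fifty times as often as ordinary ones, a pattern consistent with coordinated amplification rather than organic debate.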
Another tactic worth considering, allegedly used by Russia to influence the referendum, was financial support for the leave campaign. A loophole in the legislation allowed money to flow in from external organisations. The primary suspicion fell on Arron Banks, the biggest donor to the Leave campaign with 8 million pounds and a close associate of Nigel Farage, following reports that he had been offered several profitable deals in or related to Russia (Ellehuus, 2020, p. 9). Banks allegedly had meetings with Russians at the Russian embassy in London and travelled to Russia to discuss diamond and gold mining opportunities there (Channel 4 News, 2018; Kirkpatrick & Rosenberg, 2018). Funding of the leave campaign allegedly broke electoral law, and funds were mainly spent, in a legal grey zone, on a company named AggregateIQ, linked to Cambridge Analytica (Risso, 2018, pp. 80, 82). Vote Leave spent more than half of its official campaign funding on AggregateIQ (Cadwalladr, 2017). The company allegedly used large amounts of data harvested from social media, especially Facebook, to micro-target persuadable voters, who were then bombarded with over a billion user-specific advertisements in the last couple of days before the vote (Cadwalladr, 2017; Risso, 2018, p. 80). By harvesting as much data as possible, emotional triggers can be found for each voter based on personality profiles created from this data. Messages can in this way be targeted at the individual level to persuade voters (Cadwalladr, 2017; Risso, 2018, p. 76). The voter personality profiles created from this data are built upon psychological traits such as openness, conscientiousness, extroversion, agreeableness, and neuroticism (Risso, 2018, p. 76). The ads were emotionally exploitative, using phenomena like nationalism to manipulate people at the margins (Cadwalladr, 2017). The psychological profiles allow companies like Cambridge Analytica to develop communication programs that trigger inner fears and deep-rooted biases (Risso, 2018, p. 78). According to research, it is clear that social media campaigns, likely based on data from Cambridge Analytica, helped fuel anti-EU sentiment (Risso, 2018, p. 81).
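To make the mechanism concrete, the sketch below shows one simple way trait-based ad selection could work in principle. It is a hypothetical illustration of the logic Risso describes, not a reconstruction of Cambridge Analytica's or AggregateIQ's actual systems; the trait scores, ad texts, and selection rule are all invented.

```python
# Hypothetical illustration of trait-based micro-targeting: choose the
# ad variant keyed to a voter's dominant Big Five (OCEAN) trait.
ADS_BY_TRAIT = {
    "openness": "Imagine a freer Britain, open to the whole world.",
    "conscientiousness": "Restore order: control our laws and our borders.",
    "extroversion": "Join millions of your neighbours voting to leave.",
    "agreeableness": "Protect the services your community depends on.",
    "neuroticism": "Uncontrolled migration puts your family at risk.",
}

def pick_ad(profile):
    """Select the ad variant for the voter's highest-scoring trait."""
    dominant_trait = max(profile, key=profile.get)
    return ADS_BY_TRAIT[dominant_trait]

# A voter whose profile peaks on neuroticism receives the fear-based ad.
voter = {"openness": 0.2, "conscientiousness": 0.5, "extroversion": 0.3,
         "agreeableness": 0.4, "neuroticism": 0.8}
print(pick_ad(voter))  # -> "Uncontrolled migration puts your family at risk."
```

The point of the sketch is the selection logic: the same call to action is framed differently per psychological profile, which is precisely what makes micro-targeted disinformation difficult to observe from the outside.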
While there is no concrete evidence that Russian money entered the United Kingdom to influence politics, UK authorities have stepped up their investigations into Russian money, Russian political-business figures, and the funding by Arron Banks (Ellehuus, 2020, p. 13). Questions about the funding of the leave campaign remain, and investigations into Arron Banks are ongoing (Ellehuus, 2020, p. 14). Multiple sources have linked Russia to Cambridge Analytica and, thereby, AggregateIQ. These links consist of meetings with executives from Russian state-owned enterprises, trips to Russia, meetings at the Russian embassy, and references by employees working for Russian entities (Cadwalladr, 2017). Even though there is no hard evidence, it must be considered plausible that Russia was somehow involved in the process (Cadwalladr, 2017).
3.3.1 The United Kingdom as Target
The United Kingdom is a prime target for Russian influence seeking to weaken and undermine the existing international order, as it is a nuclear power and a member of NATO, holds a seat on the United Nations Security Council, and has considerable political and economic strength and close relationships with the United States (Ellehuus, 2020, p. 1). The United Kingdom is one of Russia's top intelligence targets, and Russia can pose a security threat to it, fuelled by paranoia about the West and the desire to be seen as a world power (Intelligence and Security Committee of Parliament, 2020). Russia calculates that it gains more from undermining than from embracing the international system and thinks it benefits when countries on its periphery are unstable, strong international players are weak, and international organisations are divided (Ellehuus, 2020, p. 4). Influence objectives in the United Kingdom serve three primary goals: to weaken the United Kingdom internally, diminish its position globally, and promote policies favourable to Russia (Ellehuus, 2020, p. 5). Russian disinformation, therefore, centred around supporting the leave campaign, as it diminished the UK's position and influence in the world and weakened the European Union as a power opposed to Russia (Ellehuus, 2020, p. 6). Although Russian attribution is difficult at this stage, Brexit was at least linked to Russian strategic interests, making influence operations and contributions during the campaign plausible.

In parts of British society, polarisation is present, making them more vulnerable to disinformation. Divisions within Britain mainly centre on "us-versus-them" narratives, and adversaries capitalise on these divisions, offering narratives that reinforce each side's respective fears, opinions, and prejudices (Ellehuus, 2020, p. 16). As observed, tweets containing a high degree of emotionality reach a wider readership and are more likely to be disseminated (Gorodnichenko et al., 2018, p. 4).
3.3.2 Analysis and Assessment
The Brexit case is complex; new technologies gave new possibilities for influence campaigns during the Brexit vote. There are direct and indirect links to Russia, but hard evidence of influence remains unproven or classified. Nevertheless, as the evidence shows, Russia is an initiator of, or at least an accomplice to, influence campaigns. As shown in figure 4, it is clear at this stage that influence campaigns were present during Brexit and that entities tried to persuade voters into making favourable decisions (e.g. the large number of tweets coming from bots and the advertisements by AggregateIQ). As the Russia report indicated, there was no dedicated team confronting disinformation at the time of the Brexit vote, and the threat was underestimated (Intelligence and Security Committee of Parliament, 2020), freeing the way for a successful operation.
Disinformation efforts in the United Kingdom recently focused on “amplifying anti-EU
sentiments, creating reputational damage to the UK's role in NATO and its relationship with the United
States, exploiting minority grievances and encouraging separatist movements” (Ellehuus, 2020, p. 27).
Disinformation campaigns focused on existing divides in British society and were able to target their messaging. Disinforming narratives focused on anti-elite sentiments, fears about migration, and the resultant erosion of British culture. The Brexit referendum became less about staying in the European Union and more about identity (Swami et al., 2018). Identity-based politics is more susceptible to disinformation, as the focus is more on emotions than on facts (Ellehuus, 2020, p. 16). In addition, information with a negative tone spreads faster (Gorodnichenko et al., 2018, p. 5). By using social media as the channel for disinformation, sufficient parts of the population were reached. Furthermore, the disinformation campaign used existing divides and narratives, amplified existing emotions, and found a network of true believers to exert influence through. Messaging was tailored through micro-targeting, and the psychological profiles provided tools to understand the cultural environment, the audiences' concerns, inner fears, hopes, and desires (Risso, 2018, p. 78), and to create micro-realities through targeted messaging (Till, 2020, p. 12). Key messages must resonate with the public, relating to their deep-rooted cultural tropes and biases when speaking to various audiences in the same community (Risso, 2018, p. 79). Studies on the impact of micro-targeting in advertisements suggest that it can covertly attract up to 40% more clicks and up to 50% more purchases; there is no reason to believe that similar results could not be achieved with political messages (Risso, 2018, p. 83). The disinformation campaign therefore used new technologies to understand the societal characteristics of British society and to target disinformation at existing frames and narratives. Specifically, bots can spread and amplify disinformation, thus influencing what humans think about a particular issue and likely reinforcing humans' beliefs (Gorodnichenko et al., 2018, p. 21). The campaign thereby fitted (individual) belief structures and pre-existing biases (Prier, 2017). Disinformation, generated with bots and micro-targeting, helped create an information environment in which voters no longer knew whom to trust, who was saying what, and to which end. This insecurity, distrust, and sense of chaos push voters towards messages that reinforce their biases and give them hope and reassurance (Risso, 2018, p. 84); by making use of deep-rooted emotions and biases, the disinformation aimed at reflexive control of parts of British society (Thomas, 2004, p. 237). The campaigns by, and based on data from, Cambridge Analytica spoke to different audiences within the same society, giving a new dimension to the use of disinformation. With this data, it was easier to fit existing beliefs and target the messaging (Heuer, 1990, pp. 11-12).
Micro-targeting thus helped the messages resonate with individual beliefs and frame them for specific audiences. Problems were diagnosed (e.g. migration, sovereignty, and border issues), a solution was proposed (leave the European Union), and a call to arms was given (vote leave in the referendum). The disinformation thereby fulfilled the core framing tasks (Benford & Snow, 2000, pp. 615–618). With micro-targeting, it was easier to tailor a message to individual voters so that it resonated with emotional and cultural beliefs (Benford & Snow, 2000, p. 622) and was perceived in the right way (Heuer, 1990, pp. 9-10). In this way, it was more likely that voters were persuaded to make voluntary decisions favourable to the initiator of the campaigns and to act according to its will (Heuer, 1990, pp. 15-18).
Brexit undermines the global position of the European Union, is a political event of great strategic importance, and favours the Union's competitors. Brexit has strategic effects on international alignment and the balance of power (Gerrits, 2018, p. 28). As the Brexit outcome would have flipped if fewer than 2 per cent of voters had switched sides, it would be naïve to suggest that influence campaigns had no effect on the vote outcome and, thereby, on changes in behaviour and opinion. Research results suggest that the effect was likely marginal but possibly large enough to affect the result, given the narrow outcome (Gorodnichenko et al., 2018, p. 4). A slight change in public attitudes may significantly impact government policy; a swing of several percentage points may change a government and, in this case, the outcome of a referendum, with dramatic policy consequences (Heuer, 1990, p. 5). Russia is seen as a significant competitor to the West, Brexit was of strategic interest to the Kremlin, and considering the disinformation in the United Kingdom, Russia did play a role in the Brexit vote, reflexively controlling public opinion with disinformation to obtain desirable policy effects and changes in behaviour and to affect the balance of power (Gerrits, 2018; Heuer, 1990, p. 5; Lanoszka, 2019; Thomas, 2004, p. 241). We cannot solely blame an influence campaign for the outcome of the vote, as pre-existing narratives, biases, and identities also played a role, but Brexit remains, for Russia, a desired outcome. Therefore, it is plausible that Russia at least tried to influence the outcome of the vote.
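The "fewer than 2 per cent" figure follows directly from the vote shares quoted above; the short calculation below shows the flip threshold using only those numbers.

```python
# Flip threshold for the referendum, from the vote shares quoted above.
leave, remain = 51.89, 48.11

# If a share s of all voters switches from leave to remain, the totals
# become (leave - s) and (remain + s); the result flips when
# leave - s < remain + s, i.e. when s > (leave - remain) / 2.
flip_threshold = (leave - remain) / 2
print(flip_threshold)  # 1.89 -> fewer than 2% of voters switching flips the result
```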
Figure 4: Brexit: visualisation of the disinformation campaign
3.4 Disinformation in France: the Macron Leaks Operation (2017)
In the run-up to the 2017 French presidential elections, there was a coordinated attempt to undermine Macron's candidacy through a classic three-dimensional information operation: it started with a disinformation campaign consisting of fake stories, rumours, and forged documents, was followed by a hack of Macron's campaign staff, and concluded with the leak of 15GB of stolen data (Vilmer, 2019, p. 3). On Friday 5 May 2017, a user anonymously shared an email dump from Emmanuel Macron and his team (Ferrara, 2017, p. 2). The leaked documents and emails came two days before the 2017 presidential election in France, in which Macron was a prominent candidate. As analysed in this chapter, the disinformation campaign failed to interfere with the election and antagonise French society (Vilmer, 2021, p. 76), as Macron won with two-thirds of the vote.
On 5 May, just before the election silence required by French electoral law (a 44-hour ban on political campaigning, including polling, ahead of the vote), a dump of more than twenty thousand emails from Macron's election team was dropped online (Ferrara, 2017, p. 2; Vilmer, 2021, p. 78). The drop, consisting of 15 gigabytes of stolen data, was first posted on archive.org [5], then on Pastebin [6] and 4chan [7] (Brandom, 2017; Vilmer, 2021, p. 78). Considering its timing, the leak was designed to leave
Macron and his team powerless, block the mainstream media from analysing the contents and make sure
the debate around the drop would primarily happen on social media (Vilmer, 2019, p. 12). The leak was
then further disseminated and amplified by US-based alt-right activists and Wikileaks (Ellyatt, 2017;
Scott, 2017; Vilmer, 2021, p. 78). Pro-Trump accounts, using the hashtag #MacronLeaks, were the first to share the leak on Twitter. Wikileaks followed shortly after, although with the disclaimer that the leak's authenticity was unverified (Ferrara, 2017, p. 2; Vilmer, 2021, p. 78). Overall, in just three and a half hours, the hashtag reached 47,000 tweets (Nimmo et al., 2017). The following pattern of operation can be identified: first, the content is dumped on the political discussion board of 4chan; second, the content is brought onto mainstream social networks like Twitter and Facebook; lastly, it is spread by catalyst accounts through established far-right communities in France and the United States and retweeted by bots and real humans (Ferrara, 2017; Vilmer, 2021, p. 78). The first French amplifiers of the leak
happened to be Le Pen supporters (Vilmer, 2019, p. 13). The documents circulated online have been
referred to as evidence of tax fraud and other illicit activities by Macron. However, there is no evidence
within the leak to support these allegations, as the conclusions were based on erroneous French
translations and biased interpretations rather than facts (Ferrara, 2017, p. 3).
[5] The Internet Archive, a 501(c)(3) non-profit, builds a digital library of Internet sites and other cultural artifacts in digital form.
[6] A pastebin or text storage site is a type of online content hosting service where users can store plain text, e.g., source code snippets for code review.
[7] 4chan is a simple image-based bulletin board where anyone can post comments and share images.

Research shows that the leak was amplified and disseminated in a social bot operation in the run-up to Election Day, furthering the viral sharing of the disinformation (Ellyatt, 2017; Ferrara, 2017, p. 3; Nimmo et al., 2017; Vilmer, 2019, p. 3). The campaign's inception is easy to pinpoint
on Twitter, as the volume of tweets surged in the run-up to election day and was nearly comparable with the regular discussion, suggesting that the disinformation campaign acquired significant collective attention for a brief period, which in turn could have had disastrous effects in terms of public opinion manipulation (Ferrara, 2017, p. 7). Moreover, Macron's official Twitter account channelled the most significant volume of tweets and mentions (Ferrara, 2017, p. 7). Out of 99,378 Twitter users involved, models classify 18% of them as bots, which is highly consistent with other studies into social bots (Ferrara, 2017, p. 8). Some of the bot accounts were also active in the 2016 US presidential elections and, after a period of silence, resurged during the MacronLeaks disinformation campaign (Ferrara, 2017, p. 9).
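To illustrate the kind of behavioural bot classification reported in such studies, the sketch below trains a toy supervised classifier on a few hand-made account features. This is not Ferrara's actual model: the features, labels, and accounts are hypothetical, and real bot-detection systems rely on far richer signals.

# Illustrative only: a toy bot classifier on hypothetical behavioural features.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Hypothetical per-account features:
# [tweets_per_day, retweet_ratio, account_age_days, followers_per_followee]
X_train = np.array([
    [400.0, 0.95,   20, 0.01],   # bot-like: hyperactive, brand-new account
    [350.0, 0.90,   45, 0.05],   # bot-like
    [  6.0, 0.30, 2500, 1.20],   # human-like: moderate, established account
    [  2.0, 0.10, 3100, 0.90],   # human-like
])
y_train = np.array([1, 1, 0, 0])  # 1 = bot, 0 = human (invented labels)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)

# Score an unseen account: high volume, mostly retweets, ten days old.
unseen = np.array([[300.0, 0.88, 10, 0.02]])
print(clf.predict_proba(unseen))  # estimated probability the account is a bot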
The attack on Macron's campaign staff has still not been publicly attributed to any perpetrator by the French authorities. The head of the French National Cybersecurity Agency (ANSSI) indicated that “the attack was so simple, it could practically be anyone who did it” (Rettman, 2017). However, the expert community [8] points to the Kremlin. The phishing attempts that led to the leak have been attributed to "Fancy Bear" (a cyberespionage group linked to the Russian military intelligence agency); the metadata behind the leak contained Cyrillic characters and indicated that the last person to edit the stolen files was Roshka Georgiy Petrovic (allegedly an employee of an information technology company with links to the FSB, based in St Petersburg); and the attack shows many similarities to the leaks surrounding the Clinton campaign in 2016 (Brandom, 2017; Vilmer, 2021, pp. 78–79). None of the evidence is conclusive, but taken together, it points in the direction of Moscow.

[8] Referring to the information security researcher “The Grugq”, the Japanese cybersecurity firm Trend Micro, the information security firm Proofpoint, and cybersecurity experts at ThreatConnect (Vilmer, 2019, pp. 19–21).
3.4.1 Disinformation in France – Why Russian Support for Le Pen?
Russian disinformation in the West, and thereby in France, often goes hand in hand with populist and extremist political movements. Those movements increasingly draw support from the Kremlin and back its political agenda, ultimately acting as proxies for Russian influence in the West (EUvsDisinfo, n.d.). France plays a crucial role in keeping the EU together, especially after Brexit. Russia is keen to see the bloc further weakened, and the French elections offered another opportunity (Ferris-Rotman, 2017). The French far right certainly falls within Russia's area of interest and influence (Lebourg, 2018).
As the 2017 French presidential election came down to a runoff between Emmanuel Macron and Marine Le Pen, it offered a unique opportunity for foreign influence to undermine the election and create favourable outcomes. Le Pen was undoubtedly Russia's favourite, having publicly supported the Russian annexation of the Crimean peninsula and argued that the sanctions implemented by the EU over the conflict in Ukraine should be lifted (Ferris-Rotman, 2017). Combined with her far-right policies, her anti-NATO and anti-immigration stances, and her election promise to take France out of the EU, it must have sounded like music to Russian ears to see her win (Bryant, 2017; Ferris-Rotman, 2017). Moreover, Le Pen
borrowed €11 million from a private Russian lender in 2014 to support her campaign and is portrayed
positively by RT and Sputnik (Ferris-Rotman, 2017; Obeidallah, 2017). Taken together, it was in Russia's strategic interest that Le Pen win the vote and, therefore, the attempt to weaken the Macron campaign certainly points to Russia at the strategic level. Although attribution is difficult at this stage, it can safely be assumed that the perpetrators, spreaders, and amplifiers of the hack were at least linked to Russian interests (Vilmer, 2019, p. 23). France had also been warned of possible Russian hacks ahead of the elections (Matishak, 2017).
3.4.2 Analysis and Assessment
The Macron Leaks are crucial to analyse, as the operation was another attempt to weaken the EU, but more importantly because it failed. For the purpose of this thesis, it is vital to indicate what went wrong and what lessons can be learned from this case. Considering the situation described above, it is not surprising that a campaign against Macron unfolded during election time. However, the campaign failed, and Macron won the election with two-thirds of the vote (Conseil constitutionnel, 2017).
Multiple explanations have been given for the campaign's failure: structural reasons, luck, and effective anticipation and reaction (Vilmer, 2019, pp. 26–40, 2021, pp. 80–84). As shown in figure 5, the attack initiator used a well-known hack-and-leak disinformation operation, and we can consider the hacker (presumably Russia), the leaker, the catalyst accounts, and the bots as initiators of the disinformation attack. The attack made use of well-known channels, such as bots and other Twitter accounts also used during the Clinton campaign leaks, to disseminate the content on social media. Research into the social bot operations shows that the MacronLeaks campaign was limited to a primarily English-speaking community and failed to percolate into the French-speaking community (Ferrara, 2017, p. 12). In addition, research also shows that the majority of Twitter users involved in the MacronLeaks campaign had a prior interest in American politics, right-wing narratives, and alt-right political agendas (Ferrara, 2017, p. 12). The campaign reached a network of true believers (Prier, 2017), but most of these believers were not allowed to vote in the French elections, as they were not French citizens and were primarily located outside of France.
Further analysis reveals an interesting fact: most tweets concerning the MacronLeaks campaign were in English, with French only second (Ferrara, 2017, pp. 14–15). This is in stark contrast with the general conversation, where French is by far the most prominent language in the debate, suggesting that the main participants in the MacronLeaks campaign were not the French-speaking community but rather the English-speaking American user base (Ferrara, 2017, p. 15). This result might explain why the campaign failed and had scarce success in affecting the French-speaking community allowed to vote. The general conversation, which happened significantly more in French (thus, likely among French voters), exhibited a clear trend favouring Macron (Ferrara, 2017, p. 15). One implication is that the campaign's framing did not match French societal characteristics, as its claims were not culturally resonant (Benford & Snow, 2000, p. 622).
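The language-share analysis behind this finding can be sketched in a few lines. This is not the cited study's pipeline: the tweets below are invented, and the langdetect package (an assumed dependency, installable via pip) stands in for whatever detector the researchers actually used.

# Illustrative only: estimate language shares in a (hypothetical) tweet sample.
from collections import Counter
from langdetect import detect  # assumed dependency: pip install langdetect

tweets = [
    "The #MacronLeaks emails prove everything!",      # English
    "Breaking: huge document dump before the vote.",  # English
    "Macron a répondu calmement aux rumeurs.",        # French
]

shares = Counter(detect(text) for text in tweets)
total = sum(shares.values())
for lang, count in shares.most_common():
    print(f"{lang}: {count / total:.0%}")  # e.g. en: 67%, fr: 33%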
The MacronLeaks campaign went viral, but not within the audience allowed to vote in the election, as most of the spreaders of the disinformation were not French (Vilmer, 2019, p. 15). The whole operation did not significantly influence French voters (Vilmer, 2021, p. 80). The media environment in France is robust, alternative websites are far less popular than in other countries, and serious journalism enjoys strong recognition. The population mainly refers to traditional news sources, and Cartesianism plays a role, being encouraged from primary school onwards and throughout personal life (Vilmer, 2021, p. 80). French voters tend to share better and more reliable content online than, for example, US voters, have low trust in information coming from social networks (Vilmer, 2019, p. 27), and are, therefore, less receptive to these forms of influence (Heuer, 1990). In addition, the attack suffered from cultural clumsiness, as most of the catalyst accounts and bots promoting the leaks were in English and came from American alt-right audiences. The French are known for having moderate foreign language skills, so penetrating the population this way was not effective. It, thereby, possibly alienated nationalist voters unwilling to support anything coming from foreign countries (Vilmer, 2021, p. 81). The whole operation was thus framed in the wrong language and did not find its foundations in French society. The operation did not resonate with French society because it did not fit belief structures and did not confirm people's biases (Prier, 2017), and because it did not match societal characteristics and lacked framing and resonance, it did not lead to mobilisation on a desirable scale (Benford & Snow, 2000, pp. 619–622). The French voters were therefore not receptive to the disinformation due to experience, education, and culture (Heuer, 1990, pp. 9–10). The attackers seem to have lacked sufficient knowledge about their target state (Vilmer, 2019, p. 28), which was already aware of the risk of cyberattacks due to past experiences and raised awareness of foreign influence (Vilmer, 2019, pp. 32–34). In this way, the perpetrator could not reflexively control its target, as successful reflexive control requires an in-depth study of the target (Thomas, 2004, p. 243). The operation was therefore not successful, as the target did not make a favourable decision (Thomas, 2004, p. 242), due to a lack of knowledge about the target and a failure to match societal characteristics and existing narratives.
Relevant actors were quick to respond, and campaign managers quickly reached out to law enforcement to report the hack. The public prosecutor launched an investigation only hours after the initial dump (Vilmer, 2021, p. 81). Moreover, the campaign staff issued a press release just before the election silence and thereby responded rapidly to the spread of the information. The campaign engaged with the online discussion that mentioned the leak and indicated that the leak only revealed the regular operation of a presidential campaign (Vilmer, 2021, p. 83). The leaks contained no incriminating evidence; nothing illegal or exciting was found in them (Erickson, 2017; Vilmer, 2021, p. 83). The campaign was immediately transparent about what had happened. In addition, the Commission Nationale de Contrôle de la Campagne électorale en vue de l'Élection Présidentielle (CNCCEP) issued a press release asking the media "not to report the content of this data, especially on their websites, reminding them that the dissemination of false information is a breach of law, notably criminal law" (CNCCEP, 2017). The French press did not cover the content of the Macron leaks; instead, journalists covered the hacking and influence operation without giving any credibility to the leaked information (Prier, 2017, p. 80), underlining French rationality, critical thinking, and healthy scepticism (Vilmer, 2021, p. 80). In addition, Russian media outlets like RT and Sputnik have sometimes been banned from press conferences, justified on the grounds that they are propaganda outlets for Russia. Emmanuel Macron takes a hard stance against these outlets and openly accuses them of influence operations and deceitful propaganda (McAuley, 2017; Vilmer, 2021, pp. 83–85). The messaging on social media was also progressively reduced by counter-messaging mocking the leak or the leakers and linking them to Russia (Vilmer, 2019, p. 14). This counter-messaging and transparency about what had happened raised awareness of possible influence from an adversary state, making the target audience even less receptive to the disinformation (Lanoszka, 2019, p. 236).
Overall, structural factors combined with a responsive and effective strategy allowed France to successfully mitigate the damage of the Macron Leaks operation (Vilmer, 2021, p. 84); as the disinformation did not resonate with French society, it did not significantly influence French voters and, therefore, failed to influence the target. This resilience and response meant that the disinformation campaign did not have the desired effect (Heuer, 1990, p. 5; Thomas, 2004, p. 241).
Figure 5: The Macron Leaks Operation: visualisation
4 Conclusions – Disinformation and International Security
Influence operations in Western democracies are the new normal. This thesis analysed and compared the use of disinformation in four different case studies. The use of false or misleading information supplied by a foreign power to influence the policy or opinions of those who receive it (la Cour, 2020, p. 708) is a way to achieve a strategic or geopolitical outcome (Thomas, 2004; Till, 2020, p. 12; Weedon et al., 2017, p. 5). Inspired by new technologies and internet culture (Sarts, 2021), but building upon methods used for decades (Rid, 2020), foreign powers take new, unconventional (hybrid) ways of warfare to the public to reflexively control populations and obtain desired outcomes (Thomas, 2004; Till, 2020).
As disinformation has been identified as an issue, for instance by the European Commission, to what extent does it affect International Security? The disinformation cases analysed in this thesis presumably or allegedly originate in Russia. The cases were at least in line with Russia's broader geopolitical interest. As the fall of the Soviet Union, in Russia's view, distorted the balance of power, Russia tries to restore its world power status (H. Bouwmeester, interview, 10 May 2021; Lynch, 2016; Vilmer, 2019) by using a hybrid spectrum of tools to gain influence. Today, Russia makes use of the anti-liberal wave that contributed to the weakening of Western dominance (Gerrits, 2018, p. 9) and uses and abuses legitimacy problems emerging in the West (Bennett & Livingston, 2018). Disinformation is one tool in a complete hybrid spectrum of tools used in unconventional ways of warfare to pursue state interests (Wither, 2016, p. 78), and hybrid warfare is a term used to capture the complexity of modern-day warfare (Filipec, 2020; Wither, 2016, p. 75). The aim remains the same as in conventional forms of warfare: to exploit the threat or use of organised violence to gain physical or psychological advantages over an opponent (Wither, 2016, p. 86). Ordinary citizens are now the target in these forms of warfare. The goal is to influence public opinion to distort domestic or foreign political sentiment to gain power and ensure survival (Bennett & Livingston, 2018, p. 132). A government waging a disinformation campaign in International Relations tries to achieve specific policy objectives by inducing favourable changes or preventing unfavourable changes (Lanoszka, 2019, p. 232). In this case, Russia tries to gain reflexive control over populations in the West to make them act favourably and in Russia's strategic interest. Over a more extended period, this must ultimately affect International Security in Russia's favour (Bittman, 1985, p. 2). Disinformation is not an end in itself but serves a larger geopolitical objective (Gerrits, 2018, p. 8); with disinformation, there is a lot to win and little to lose (Gerrits, 2018, p. 10).
Considering International Security, Brexit was an event of great strategic importance (Gerrits, 2018, p. 21) and in line with Russia's goal of diminishing the United Kingdom's and the West's influence to strengthen its own position in the international system (Ellehuus, 2020, p. 1). Therefore, it came as no surprise that disinformation surrounding Brexit primarily supported the leave campaign, targeted at existing narratives and frames, to make Brexit happen (Ellehuus, 2020, p. 6). Shortly after Brexit, the arrows were aimed at France, seeking a favourable result in the 2017 elections to further diminish the European Union as an opposing power to Russia. France was a vital ally for keeping the European Union together, especially after Brexit (Ferris-Rotman, 2017). However, the Macron Leaks operation failed, and the attempt to further weaken the Western bloc with disinformation was unsuccessful (Vilmer, 2019, 2021), as the French population proved to be resilient and aware. In the Baltics, the disinformation campaign, intended to weaken NATO as another Western power bloc, did not take hold in domestic society. However, as indicated in the analysis, the aim of this campaign was probably to reach the Russian minority groups and the former Soviet citizens now registered as non-citizens, weakening the state internally as part of a longer-term strategic goal. The effects of the German case remain uncertain: it had a short-term effect resulting in demonstrations and attacks, but its effects on the elections a year later remain unclear. The aim in Germany was more structural and longer-term: to widen existing cracks in society as a means to disintegrate the system. However, as evidence suggests, believing disinformation in German society leads to votes for non-established, more alternative political parties (Zimmermann & Kohring, 2020) and, therefore, has the potential to limit policy responses. As disinformation, to be perceived, has to be consistent with the audience's emotions (Heuer, 1990), and as emotions matter more than facts, timing, such as in Germany, seems essential. The timing in Germany was crucial and allowed the public to mobilise based on their emotions.
To be effective and affect International Security, disinformation needs to exploit and exacerbate existing societal characteristics rather than create them (Gerrits, 2018, p. 1; Heuer, 1990), and it has to resonate with target audiences (Benford & Snow, 2000) to affect attitudes and opinions (Heuer, 1990, pp. 12–19); only then can it, eventually, serve strategic goals. As analysed and compared in the case studies, those foundations were found in the United Kingdom and Germany, but less so in the Baltics and France. For disinformation to work, it has to be based on an in-depth study of the target audience to find the resonant frames and narratives through which influence campaigns can undermine Western society and harm International Security. When building on such an existing belief structure, narrative, or frame, and using agents of influence, campaigns can get a foothold in society, lead to mobilisation, and have desirable outcomes. Strategic effects in Germany were less visible than in France and the United Kingdom, as the German case was not an all-out election where a couple of percentage points can make the difference. A difference of a couple of percentage points, as with the Brexit vote, may change the outcome of the vote with tremendous policy consequences (Heuer, 1990, p. 5). Although the causal relationship is almost impossible to prove (Heuer, 1990, p. 3), Russian intentions were clear and, considering the analysis made, we can see a clear trend of influence operations across the European Union. Thus, disinformation as a hybrid tool certainly can affect International Security, or at least intends to affect the current International Security environment, and it is, therefore, a threat to our security and should be treated as such.
Considering the case studies analysed and the relation between disinformation and International
Security, Brexit was the only case that negatively affected International Security. To measure
effectiveness, it is vital to identify the desired goals and achieved results of a disinformation campaign.
The goal in the Brexit vote was to make Brexit happen, and the disinformation campaign contributed to this. Even if it is impossible to blame the disinformation campaign alone for the outcome of the vote, the intentions were clear: contribute to the weakening of the European Union. The campaign in France failed and did not affect International Security, but again, the goal was clear: further weaken the European Union by supporting and helping Le Pen at the time of the elections. In Germany, the effects on International Security remain unclear. The analysed disinformation campaign was one element of a broader objective: weakening and destabilising the German government. A broader, more in-depth study of Germany is needed to measure the effects on International Security, as this case study gives limited insight. Such research can be done by looking at potential policy consequences resulting from disinformation, certainly during the upcoming Bundestag elections in 2021. For the Baltics, the disinformation campaign had zero effect on International Security if its goal was to influence the opinions of domestic citizens. Overall, under the right circumstances, disinformation can contribute to events concerning International Security; it can exacerbate and accelerate tensions, but considering the case studies used, apart from Brexit, disinformation in itself did not affect International Security. Russia certainly tries to affect International Security with disinformation and considers it a means to achieve geopolitical advantages. However, attributing International Security events solely to the role of disinformation is unjustified. Disinformation is just one tool in the complete, modern, hybrid spectrum used to weaken Western dominance by building on existing grievances. Looking at a single case study limits the results when analysing longer-term strategic goals. In the longer term, disinformation can weaken state and institutional structures, limit policy responses in a state, and should be designated a threat to security. Further analysis into, for example, specific policy effects resulting from disinformation can strengthen the argument of this thesis, and continuously analysing disinformation in relation to International Security, especially given the use of new technologies, seems to be a requirement for every Western government seeking to limit potential long-term consequences.
Epilogue: Confronting Disinformation as a Security Threat
As civilians are the targets in modern-day information warfare (McGeehan, 2018, p. 51), measures have to be found and implemented to protect civilians from exposure to disinformation or to create resilience against it. However, defending populations against disinformation designed to shape perception is challenging (McGeehan, 2018, p. 53). This chapter, drawing on the case studies and theoretical foundations used in this thesis, indicates a couple of possible ways to confront disinformation campaigns. However, as technology is ever-changing, solutions are not static but dynamic. Potential solutions proposed to address the problem of disinformation can be divided into four categories: (1) algorithmic, for example with artificial intelligence; (2) corrective, with (military) task forces correcting disinformation; (3) legislative, by law; and (4) psychological, by making the public resilient and aware of the problem (Linden & Roozenbeek, 2020, p. 150).
Algorithmic solutions, for example artificial intelligence, might reach a point where they can automatically detect, flag, and discern disinformation on a massive scale (McGeehan, 2018, p. 53). However, as Sarts (2021) explains, the new technologies used to identify disinformation might also be used to create disinformation. As the information environment is ever-changing, these solutions contribute to a cat-and-mouse game and look like temporary solutions. Moreover, algorithms used by Facebook, for example, have often backfired, as they were imperfect at detecting disinformation (Wakefield, 2017). This potential solution might also draw us into a situation where an automated algorithm defines what disinformation is and what we are exposed to on social media. We might become subject to forms of censorship, as certain content is (incorrectly) filtered away by a computer without human intervention. Putting pressure on social media platforms has become more standard: in the run-up to the French elections, Facebook suspended seventy thousand accounts for sharing false and misleading content (Vilmer, 2019, pp. 33–34). These essential steps result from growing pressure by both governments and the public (Tenove, 2020, p. 530; Vilmer, 2019, p. 34). However, Facebook continues to play a significant role in the dissemination of disinformation (Tenove, 2020, p. 530).
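As a sense of what an "algorithmic" detector looks like in its most naive form, the sketch below trains a TF-IDF text classifier on a handful of invented snippets. It is deliberately minimal, and the labels are made up; its brittleness is precisely why such systems misfire at scale, as noted above.

# Illustrative only: a naive disinformation-style text classifier.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = [
    "Shocking secret document PROVES the election is rigged!!!",
    "They don't want you to know the real truth about the leak",
    "The electoral commission published the certified results today.",
    "Turnout figures were released by the national statistics office.",
]
labels = [1, 1, 0, 0]  # 1 = disinformation-like, 0 = regular news (invented)

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(texts, labels)

print(model.predict(["LEAKED files reveal the shocking truth they hide!"]))

A classifier of this kind keys on surface wording, which is exactly how content that merely sounds sensational can be incorrectly filtered away, as discussed above.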
Secondly, corrective solutions, such as fact-checking tools, exist to debunk false stories quickly. Although they are laudable, their efficacy remains mixed, and the potential audiences for fact-checking reports remain limited (Linden & Roozenbeek, 2020, p. 150). Moreover, it is impossible to fact-check every story, as research shows that disinformation spreads more easily on social media than any other kind of news (Vosoughi et al., 2018). Another solution is the use of corrective task forces to correct and delete disinformation. As disinformation is a security problem identified within the spectrum of hybrid warfare, responsibility might lie with the military. However, the military is not a 'thought police', and this might lead to problems separating free speech from disinformation. Therefore, transparency on how task forces work and how they identify disinformation is crucial to avoid abuse by authorities (McGeehan, 2018, pp. 56–57). Governments must be careful not to unduly empower government agencies or governing political parties while fighting disinformation (Tenove, 2020, p. 531). Estonia's Defence League runs an anti-propaganda blog that counters potentially harmful narratives, highlights corporate practices on social media, names and shames individuals and posts designed to further disinformation, and promotes media literacy (Robbins, 2020). Media literacy is also promoted across the UK, as more citizens get their news from social media platforms (Ellehuus, 2020, p. 15) and because of the highly partisan commercial press (Ellehuus, 2020, p. 14). Estonia even created its own Russian-language channel to counter messages by the Kremlin aimed at Russian minorities (Robbins, 2020). The Baltic states also have their own counter-messaging group of volunteers, called the Baltic Elves, who report bots, counter Russian trolls, monitor news article message boards for disinformation, and spread counter-narratives across the Baltics (Robbins, 2020). In France, the campaign team responded to social media posts about the leak and indicated that the leak only revealed the regular operation of a presidential campaign (Vilmer, 2021, p. 83). Timely responses seem to be crucial. Quickly disseminating the truth about a disinformation story is critical, as such stories spread more quickly than other information and the longer a story circulates, the more truthful it appears (McGeehan, 2018, p. 54). In Germany, the Lisa story was first shared on a minor Russian website; a task force should have been capable of detecting the false story earlier and reacting appropriately, debunking the story and creating a counter-narrative (Janda, 2016).
Thirdly, a more radical solution to disinformation is legislation. In the UK, a parliamentary committee declared that electoral law is not fit for the digital age and needs to be amended to reflect new technologies (Tenove, 2020, p. 526). France's "fake news law", which during election time imposes stricter restrictions on what media outlets are allowed to put online, is an example of such legislation (Linden & Roozenbeek, 2020, pp. 150–151). However, legislative solutions can easily backfire, as granting an organisation, governmental or not, the power to decide what is real can easily lead to censorship. As subjectivity plays a huge role in identifying disinformation, organisations or task forces can easily arrive at different interpretations of truth and falsehood (Pieters, 2018), leading to false or biased reporting (Tenove, 2020, p. 525). In France, legislation was beneficial: the hack was quickly reported, the prosecutor's office started an investigation into it (Vilmer, 2021, p. 81), and the media were restricted by law from reporting on the contents of the hack, as disseminating false information is a breach of law (Prier, 2017, p. 80; Vilmer, 2021, p. 83). However, these forms of legislation can threaten the free press (Tenove, 2020, p. 527) or influence communication at crucial moments (Tenove, 2020, p. 531). In addition, the media environment in France is more regulated, with paid advertisements forbidden, stricter regulations around opinion polls, the use of an equal-time rule, and the election silence (Vilmer, 2019, p. 26). There are also concerns about what legislative solutions mean for freedom of expression (Tenove, 2020, p. 519). As disinformation is a threat to an open democratic society, we cannot lose that open society by fighting against it (Tenove, 2020, p. 525). Social media regulation, for example, might form a threat to freedom of expression (Tenove, 2020, p. 530).
It is essential to make sure that disinformation does not go viral in the first place (Linden & Roozenbeek, 2020, p. 163). More attention has recently been directed to the individual level and the role of psychology, education, and behavioural sciences in combating disinformation (Linden & Roozenbeek, 2020, p. 151). Making the population more resilient to and aware of possible forms of disinformation, and letting people make their own judgements, is undoubtedly a more effective solution in the long term. One idea for making citizens more resilient and aware is inoculation theory. According to William McGuire, the focus has to shift to a different question: how can we help people resist persuasion attempts? This question helped him develop inoculation theory, which he described as a vaccine against brainwashing and which is based on an analogy from immunology (McGuire, 1961). The same can occur with information: by "preemptively presenting someone with a weakened version of a misleading piece of information, a thought process is triggered that is analogous to the cultivation of 'mental antibodies', rendering the person immune to (undesirable) persuasion attempts" (Linden & Roozenbeek, 2020, p. 152). There is a need to identify and preemptively message groups susceptible to disinformation in a "mass vaccination" campaign (McGeehan, 2018, p. 54). As analysed in the theoretical foundations, frames and narratives can be identified where disinformation campaigns are likely to occur. Within those frames and narratives, weakened versions of disinformation can be used to "vaccinate" the frames and narratives and create "herd immunity" against disinformation within those groups (McGeehan, 2018, p. 54). The inoculation process consists of two main components, "namely: (1) a warning to elicit and activate threat in message recipients (the affective bias) and (2) refutational pre-emption" (Linden & Roozenbeek, 2020, p. 152). However, prevention campaigns cannot be effective if people do not understand them (McGeehan, 2018, p. 54). Therefore, Linden and Roozenbeek (2020) theorise that taking on the role of someone actively trying to deceive others is an effective way to build resistance to disinformation, by actively letting people create disinformation themselves (Linden & Roozenbeek, 2020, p. 155). Educational programs [9] on creating disinformation are being set up as games that teach people how disinformation works, theorising that this will make them more aware of and resilient against disinformation campaigns (Linden & Roozenbeek, 2020, pp. 155–161). Such education helps people spot disinformation techniques and makes them more resilient in the process (Linden & Roozenbeek, 2020, p. 156).

[9] Harmony Square is an example of such a game: https://harmonysquare.game/en
Education and resilience are utterly essential to confront disinformation. It is a national security imperative that citizens can think critically and discern the truth (McGeehan, 2018, p. 54). When looking up facts online, one can easily be directed to false or misleading information. In the attention economy, where content is tailored for immediate consumption, people are quickly confronted with disinformation without knowing it (McGeehan, 2018, p. 55). Social media platforms are constantly
struggling to balance commercial interests (they want to keep people on their platforms) with providing unbiased content (which might decrease time spent on their platforms) (Prier, 2017, p. 80). Critical thinking must receive more attention to create citizens who can objectively evaluate information and its sources, look for hidden agendas, and determine the plausibility of content (McGeehan, 2018, p. 56). Likewise, citizens must be aware of the pitfalls of the "echo chambers" provided by social media platforms, which isolate them from the outside world and limit them to information consistent with the narratives and frames they already believe (McGeehan, 2018, p. 56). As analysed in the French case study, the French media system is robust and French citizens share more quality information. French society is more resilient because its members are more critical, rational, and sceptical about information on social media (Vilmer, 2021, p. 80). Moreover, due to historical experience and the ANSSI and the CNCCEP alerting the media, the French population was more aware of a possible hack to influence the campaign (Vilmer, 2021, p. 82), which generated awareness among the public and the media (Vilmer, 2019, p. 34). The French authorities were transparent, through press releases, about what had happened and, in the process, made the population more aware of potential foreign influence (Vilmer, 2021, pp. 82–83). This messaging must be done in a coordinated, whole-of-government approach, so that no exploitable gaps remain, and it needs to be followed up by a public release of evidence (Ellehuus, 2020, p. 22) to make the public aware. With a French population aware of foreign influence, the French people were sceptical and distrustful of the hack as potentially coming from an adversary state, in line with the argument of Lanoszka (2019, pp. 233-234), which helped limit the influence of the campaign. The German police, by contrast, were less transparent about the investigation into the initial case. The huge coverage provided by Russian media backfired in the German media (Janda, 2016) and gave the story the opportunity to unfold in an unaware society.
Societal and political polarisation is a vulnerability at which influence campaigns using disinformation are often aimed (Ellehuus, 2020, p. 16); disinformation can be effective in highly polarised societies, as it exploits rather than creates (Gerrits, 2018; Lanoszka, 2019, p. 237). Existing narratives and frames are exploited. Treating the underlying causes of this polarisation, and of these narratives and frames, might be the best way to tackle disinformation (Gerrits, 2018, p. 13), while also making citizens more aware of potential foreign influence. A lack of trust in governmental organisations and the media is one reason disinformation can be successful (Bennett & Livingston, 2018; Ellehuus, 2020, p. 22). Working on trust and transparency within society is crucial to limit the effect of disinformation campaigns. The aim for governments remains to fight disinformation without disproportionately limiting essential freedoms (Gerrits, 2018, p. 18) while restoring society's declining trust.
References
Ajir, M., & Vailliant, B. (2018). Russian Information Warfare: Implications for Deterrence Theory.
Strategic Studies Quarterly, 12(3), 70–89.
Ayoob, M. (1995). The Third World Security Predicament: State Making, Regional Conflict, and the
International System. Lynne Rienner Publishers.
Baczynska, G. (2021, March 9). Germany is main target of Russian disinformation, EU says. Reuters.
https://www.reuters.com/article/us-eu-russia-germany-idUSKBN2B11CX
Baraniuk, C. (2016, June 21). Beware the Brexit bots: The Twitter spam out to swing your vote. New
Scientist. http://www.newscientist.com/article/2094629-beware-the-brexit-bots-the-twitter-spam-out-
to-swing-your-vote/
Barojan, D. (2018, February 7). #BalticBrief: Enhanced Anti-NATO Narratives Target Enhanced Forward
Presence. Medium. https://medium.com/dfrlab/balticbrief-enhanced-anti-nato-narratives-target-
enhanced-forward-presence-fdf2272a8992
BBC. (n.d.). EU Referendum Results—BBC News. Retrieved 25 May 2021, from
https://www.bbc.com/news/politics/eu_referendum/results
BBC. (2020, July 21). Russia report: UK ‘badly underestimated’ threat, says committee. BBC News.
https://www.bbc.com/news/uk-politics-53484344
Benford, R. D., & Snow, D. A. (2000). Framing Processes and Social Movements: An Overview and
Assessment. Annual Review of Sociology, 26, 611–639.
Bennett, L., & Livingston, S. (2018). The disinformation order: Disruptive communication and the decline
of democratic institutions. European Journal of Communication, 33, 122–139.
https://doi.org/10.1177/0267323118760317
Bergmane, U. (2020). Fading Russian Influence in the Baltic States. Orbis, 64(3), 479–488.
https://doi.org/10.1016/j.orbis.2020.05.009
Berliner Zeitung. (2017, February 28). Der Fall Lisa (13): Mann (23) wegen sexuellen Missbrauchs angeklagt [The Lisa case (13): Man (23) charged with sexual abuse]. https://www.bz-berlin.de/berlin/marzahn-hellersdorf/der-fall-lisa-13-mann-23-wegen-sexuellen-missbrauchs-angeklagt
Berzina, I., Bērziņš, J., Hirss, M., Rostoks, T., & Vanaga, N. (2018). The Possibility of Societal
Destabilization in Latvia: Potential National Security Threats. National Defence Academy.
Biesecker, B. A. (2018). Guest Editor’s Introduction: Toward an Archaeogenealogy of Post-truth.
Philosophy & Rhetoric, 51(4), 329–341.
Bittman, L. (1985). The KGB and Soviet Disinformation: An Insider’s View (1st Edition). Pergamon Pr.
Bouwmeester, H. (2021, May 10). Interview [Online].
Brandom, R. (2017, May 5). Emails leaked in ‘massive hacking attack’ on French presidential campaign.
The Verge. https://www.theverge.com/2017/5/5/15564532/macron-email-leak-russia-hacking-
campaign-4chan
Bryant, E. (2017, February 24). Le Pen blasts EU, NATO, praises Trump. DW.COM.
https://www.dw.com/en/le-pen-blasts-eu-nato-praises-trump/a-37696454
Buzan, B., Waever, O., & Wilde, J. de. (1998). Security: A New Framework for Analysis. Lynne Rienner
Publishers.
Cadwalladr, C. (2017, May 7). The great British Brexit robbery: How our democracy was hijacked. The
Guardian. http://www.theguardian.com/technology/2017/may/07/the-great-british-brexit-robbery-
hijacked-democracy
Cesare, M. (2020, December 14). Russian Encroachment in the Baltics: The Role of Russian Media and
Military - Foreign Policy Research Institute. https://www.fpri.org/article/2020/12/russian-
encroachment-in-the-baltics-the-role-of-russian-media-and-military-2/
Channel 4 News. (2018, July 20). Exclusive: Court documents claim new Arron Banks links with Russia.
Channel 4 News. https://www.channel4.com/news/exclusive-court-documents-claim-new-arron-
banks-links-with-russia
CNCCEP. (2017, May 6). Commission Nationale de Contrôle de la Campagne en vue de l’Élection
Présidentielle. http://www.cnccep.fr
Collins, A. (Ed.). (2018). Contemporary Security Studies (Fifth Edition). Oxford University Press.
Complex crises call for adaptable and durable capabilities. (2015). The Military Balance, 115(1), 5–8.
https://doi.org/10.1080/04597222.2015.996334
Conseil constitutionnel. (2017, May 10). Décision n° 2017-171 PDR du 10 mai 2017 | Conseil
constitutionnel. https://www.conseil-constitutionnel.fr/decision/2017/2017171PDR.htm
Courrier d’Europe. (2019, March 26). Baltic states’ Russian disquiet. The New Federalist.
https://www.thenewfederalist.eu/baltic-states-russian-disquiet
DELFI. (2016, February 1). Apklausa: Lietuviai – už NATO ir stiprią kariuomenę [Survey: Lithuanians support NATO and a strong army]. DELFI. https://www.delfi.lt/news/daily/lithuania/apklausa-lietuviai-uz-nato-ir-stipria-kariuomene.d?id=70275978
Ellehuus, R. (2020, July 20). Mind the Gaps: Assessing Russian Influence in the United Kingdom.
https://www.csis.org/analysis/mind-gaps-assessing-russian-influence-united-kingdom
Ellyatt, H. (2017, May 7). US far-right activists, WikiLeaks and bots help amplify Macron leaks:
Researchers. CNBC. https://www.cnbc.com/2017/05/07/macron-email-leaks-far-right-wikileaks-
twitter-bots.html
Erickson, A. (2017, May 6). Analysis | Macron’s emails got hacked. Here’s why French voters won’t hear
much about them before Sunday’s election. Washington Post.
https://www.washingtonpost.com/news/worldviews/wp/2017/05/06/macrons-emails-got-hacked-heres-
why-french-voters-wont-hear-much-about-them-before-sundays-election/
European Commission. (2018, April 26). Tackling online disinformation [Text]. European Commission -
European Commission. http://ec.europa.eu/commission/presscorner/detail/en/MEMO_18_3371
EUvsDisinfo. (n.d.). STUDIES AND REPORTS. EU vs DISINFORMATION. Retrieved 4 June 2021, from
https://euvsdisinfo.eu/reading-list/
Ferrara, E. (2017). Disinformation and Social Bot Operations in the Run Up to the 2017 French
Presidential Election. https://doi.org/10.5210/fm.v22i8.8005
Ferris-Rotman, A. (2017, February 17). Russia ♥ Marine Le Pen. POLITICO.
https://www.politico.eu/article/russia-%e2%99%a5-marine-le-pen-national-front-vladimir-putin-
kremlin-france-elections/
Filipec, O. (2020). Hybrid Warfare: Between Realism, Liberalism and Constructivism. 52–70.
Galeotti, M. (2019, April). The Baltic States as Targets and Levers: The Role of the Region in Russian
Strategy. http://www.marshallcenter.org/en/publications/security-insights/baltic-states-targets-and-
levers-role-region-russian-strategy-0
Gerrits, A. W. M. (2018). Disinformation in International Relations: How Important Is It? Security and
Human Rights, 29(1–4), 3–23. https://doi.org/10.1163/18750230-02901007
Gillett, F. (2017, November 2). Probe launched into claims of Russian meddling during Brexit vote.
https://www.standard.co.uk/news/politics/election-watchdog-launches-probe-into-russian-meddling-
in-brexit-vote-a3674251.html
Gorodnichenko, Y., Pham, T., & Talavera, O. (2018). Social Media, Sentiment and Public Opinions:
Evidence from #Brexit and #USElection (No. w24631). National Bureau of Economic Research.
https://doi.org/10.3386/w24631
Heuer, R. J. (1990). Assessing Soviet Influence Operations. 29.
IISS. (2018, February). The Military Balance 2018—Europe (Chapter Four). IISS.
https://www.iiss.org/publications/the-military-balance/the-military-balance-2018/mb2018-04-europe
Intelligence and Security Committee of Parliament. (2020). Russia Report: Press notice.
https://isc.independent.gov.uk/wp-content/uploads/2021/01/20200721_Russia_Press_Notice.pdf
Janda, J. (2016). The Lisa Case: STRATCOM Lessons for European states. Federal Academy for Security
Policy. http://www.jstor.org/stable/resrep22152
Kallas, K. (2016). Claiming the diaspora: Russia’s compatriot policy and its reception by Estonian-Russian
population. JEMiE, 15, 1.
Kampfner, J. (2020, July 29). Russia and China in Germany. RUSI. https://rusi.org/publication/occasional-
papers/russia-and-china-germany
Kirkpatrick, D. D., & Rosenberg, M. (2018, June 29). Russians Offered Business Deals to Brexit’s Biggest
Backer. The New York Times. https://www.nytimes.com/2018/06/29/world/europe/russia-britain-
brexit-arron-banks.html
Kivirähk, J. (2015). Public Opinion and National Defence. Estonian Ministry of Defence, April.
la Cour, C. (2020). Theorising digital disinformation in international relations. International Politics, 57(4),
704–723. https://doi.org/10.1057/s41311-020-00215-x
Lanoszka, A. (2019). Disinformation in international politics. European Journal of International Security,
4(2), 227–248. https://doi.org/10.1017/eis.2019.6
Latvijas Sabiedriskais medijs. (2016, June 27). Support for EU and NATO membership high among Latvian
speakers. https://eng.lsm.lv/article/society/defense/support-for-eu-and-nato-membership-high-among-
latvian-speakers.a189527/
Lebourg, N. (2018, May 15). The French Far Right in Russia’s Orbit | Carnegie Council for Ethics in
International Affairs. https://www.carnegiecouncil.org/publications/articles_papers_reports/the-
french-far-right-in-russias-orbit
Lenin, V. (1902). What is to be done? https://www.marxists.org/archive/lenin/works/download/what-itd.pdf
Linden, S. van der, & Roozenbeek, J. (2020). Psychological Inoculation Against Fake News. In The
Psychology of Fake News. Routledge.
London School of Economics. (2013, February 5). Five minutes with Colin Crouch: “A post-democratic
society is one that continues to have and to use all the institutions of democracy, but in which they
increasingly become a formal shell”. British Politics and Policy at LSE.
https://blogs.lse.ac.uk/politicsandpolicy/five-minutes-with-colin-crouch/
LRT. (2017, January 27). Lietuvos gyventojai ypač palankiai vertina narystę NATO [Lithuanian residents view NATO membership particularly favourably]. lrt.lt. https://www.lrt.lt/naujienos/lietuvoje/2/161652/lietuvos-gyventojai-ypac-palankiai-vertina-naryste-nato
Lynch, A. C. (2016). The influence of regime type on Russian foreign policy toward “the West,” 1992–
2015. Communist and Post-Communist Studies, 49(1), 101–111.
https://doi.org/10.1016/j.postcomstud.2015.12.004
Mahairas, A., & Dvilyanski, M. (2018). Disinformation – Дезинформация (Dezinformatsiya). The Cyber
Defense Review, 3(3), 21–28.
Mankoff, J. (2020, July 24). With Friends Like These: Assessing Russian Influence in Germany [Center for
Strategic and International Studies]. With Friends Like These: Assessing Russian Influence in
Germany. https://www.csis.org/analysis/friends-these-assessing-russian-influence-germany
Matishak, M. (2017, May 9). NSA chief: U.S. warned France about Russian hacks before Macron leak.
POLITICO. https://www.politico.com/story/2017/05/09/us-warned-france-russia-hacking-238152
McAuley, J. (2017, May 29). French President Macron blasts Russian state-owned media as ‘propaganda’.
Washington Post. https://www.washingtonpost.com/world/europe/french-president-macron-blasts-
russian-state-run-media-as-propaganda/2017/05/29/4e758308-4479-11e7-8de1-
cec59a9bf4b1_story.html
McGeehan, T. P. (2018). Countering Russian Disinformation. Parameters, 48(1), 49–57.
McGuire, W. J. (1961). Resistance to persuasion conferred by active and passive prior refutation of the
same and alternative counterarguments. The Journal of Abnormal and Social Psychology, 63(2), 326–
332. https://doi.org/10.1037/h0048344
Ministry of Defence of the Republic of Latvia. (2018, November 28). Residents’ Poll on State Defence
Issues | Aizsardzības ministrija. http://www.mod.gov.lv/en/news/residents-poll-state-defence-issues
Ministry of Defence, Russian Federation. (n.d.). Russian Federation Armed Forces’ Information Space
Activities Concept: Ministry of Defence of the Russian Federation. Retrieved 24 March 2021, from
https://eng.mil.ru/en/science/publications/more.htm?id=10845074@cmsArticle
Ministry of Foreign Affairs of the Russian Federation. (2013). Concept of the Foreign Policy of the Russian
Federation 2013.
Morgenthau, H. J. (1973). Politics Among Nations: The Struggle for Power and Peace. Alfred A. Knopf.
Mostrous, A., Bridge, M., & Gibbons, K. (2017, November 15). Russia used Twitter bots and trolls ‘to
disrupt’ Brexit vote. https://www.thetimes.co.uk/article/russia-used-web-posts-to-disrupt-brexit-vote-
h9nv5zg6c
NATO. (1949, April 4). The North Atlantic Treaty. NATO.
http://www.nato.int/cps/en/natohq/official_texts_17120.htm
NCTV. (2021, April 14). Interview [Online].
Nimmo, B., & Aleksejeva, N. (2017, March 11). Lisa 2.0. Medium. https://medium.com/@DFRLab/lisa-2-
0-133d44e8acc7
Nimmo, B., Barojan, D., & Aleksejeva, N. (2018, February 6). Russian Narratives on NATO’s Deployment.
Medium. https://medium.com/dfrlab/russian-narratives-on-natos-deployment-616e19c3d194
Nimmo, B., Durakgolu, N., Czuperski, M., & Yap, N. (2017, May 8). Hashtag Campaign: #MacronLeaks.
Medium. https://medium.com/dfrlab/hashtag-campaign-macronleaks-4a3fb870c4e8
NTV.de. (2016, January). Fall Lisa: Das sind die Fakten [The Lisa case: These are the facts]. n-tv.de. https://www.n-tv.de/panorama/Fall-Lisa-Das-sind-die-Fakten-article16865016.html
Obeidallah, D. (2017, April 23). Why Putin and Trump both like Le Pen (opinion)—CNN.
https://edition.cnn.com/2017/04/23/opinions/trump-support-le-pen-opinion-obeidallah/index.html
The Federal Returning Officer. (2017). Results Germany. https://www.bundeswahlleiter.de/en/bundestagswahlen/2017/ergebnisse/bund-99.html#stimmentabelle13
Pieters, J. (2018, March 6). Dutch politicians want EU anti-fake news watchdog scrapped. NL Times.
https://nltimes.nl/2018/03/06/dutch-politicians-want-eu-anti-fake-news-watchdog-scrapped
Polyakova, A. (2019, June 20). Lessons from the Mueller report on Russian political warfare. Brookings.
https://www.brookings.edu/testimonies/lessons-from-the-mueller-report-on-russian-political-warfare/
Prier, J. (2017). Commanding the Trend: Social Media as Information Warfare. Strategic Studies Quarterly,
11(4), 50–85.
Pronk, D. (2021, March 11). Interview [Online].
Rettman, A. (2017, June 2). Macron Leaks could be ‘isolated individual’, France says. EUobserver.
https://euobserver.com/foreign/138106
Rid, T. (2020). Active Measures: The Secret History of Disinformation and Political Warfare. Farrar, Straus
and Giroux.
Risso, L. (2018). Harvesting your soul? Cambridge analytica and brexit. Brexit Means Brexit, 2018, 75–90.
Robbins, J. (2020, October 23). Countering Russian Disinformation [CSIS].
https://www.csis.org/blogs/post-soviet-post/countering-russian-disinformation
Russian Ministry of Foreign Affairs. (2016, January 26). Выступление и ответы на вопросы СМИ Министра иностранных дел России С.В.Лаврова в ходе пресс-конференции по итогам деятельности российской дипломатии в 2015 году, Москва, 26 января 2016 года [Speech by and media Q&A with Russian Foreign Minister S. V. Lavrov at a press conference on the results of Russian diplomacy in 2015, Moscow, 26 January 2016]. https://www.mid.ru/press_service/minister_speeches/-/asset_publisher/7OvQR5KJWVmR/content/id/2032328
Rutenberg, J. (2017, September 13). RT, Sputnik and Russia’s New Theory of War. The New York Times.
https://www.nytimes.com/2017/09/13/magazine/rt-sputnik-and-russias-new-theory-of-war.html
Sarts, J. (2021). Disinformation as a Threat to National Security. In S. Jayakumar, B. Ang, & N. D. Anwar
(Eds.), Disinformation and Fake News (pp. 23–33). Springer. https://doi.org/10.1007/978-981-15-
5876-4_2
Scott, M. (2017, May 6). U.S. Far-Right Activists Promote Hacking Attack Against Macron. The New York
Times. https://www.nytimes.com/2017/05/06/world/europe/emmanuel-macron-hack-french-election-
marine-le-pen.html
Smith, M. (2017, February 10). Most NATO Members in Eastern Europe See It as Protection. Gallup.Com.
https://news.gallup.com/poll/203819/nato-members-eastern-europe-protection.aspx
Swami, V., Barron, D., Weis, L., & Furnham, A. (2018). To Brexit or not to Brexit: The roles of Islamophobia, conspiracist beliefs, and integrated threat in voting intentions for the United Kingdom European Union membership referendum. British Journal of Psychology, 109(1), 156–179. https://doi.org/10.1111/bjop.12252
Tenove, C. (2020). Protecting Democracy from Disinformation: Normative Threats and Policy Responses.
The International Journal of Press/Politics, 25(3), 517–537.
https://doi.org/10.1177/1940161220918740
Thomas, T. (2004). Russia’s Reflexive Control Theory and the Military. The Journal of Slavic Military
Studies, 17, 237–256. https://doi.org/10.1080/13518040490450529
Thompson, T. (2019, January 9). Countering Russian disinformation the Baltic nations’ way. The
Conversation. http://theconversation.com/countering-russian-disinformation-the-baltic-nations-way-
109366
Till, C. (2020). Propaganda through ‘reflexive control’ and the mediated construction of reality. New Media & Society. https://doi.org/10.1177/1461444820902446
Tzu, S. (2008). The art of war. Routledge.
Ullman, R. H. (1983). Redefining Security. International Security, 8(1), 129–153.
https://doi.org/10.2307/2538489
Vilmer, J.-B. J. (2019, June 20). The ‘#Macron leaks’ operation: A post-mortem. Atlantic Council.
https://www.atlanticcouncil.org/in-depth-research-reports/report/the-macron-leaks-operation-a-post-
mortem/
Vilmer, J.-B. J. (2021). Fighting Information Manipulation: The French Experience. In S. Jayakumar, B.
Ang, & N. D. Anwar (Eds.), Disinformation and Fake News (pp. 75–89). Springer.
https://doi.org/10.1007/978-981-15-5876-4_6
Vosoughi, S., Roy, D., & Aral, S. (2018). The spread of true and false news online. Science, 359(6380),
1146–1151. https://doi.org/10.1126/science.aap9559
Wakefield, J. (2017, November 7). Facebook’s fake news experiment backfires. BBC News.
https://www.bbc.com/news/technology-41900877
Walt, S. M. (1987). The origins of alliances. Cornell University Press.
Weedon, J., Nuland, W., & Stamos, A. (2017). Information operations and Facebook. Facebook.
Wither, J. K. (2016). Making Sense of Hybrid Warfare. Connections, 15(2), 73–87.
Zevelev, I. (2016, August 22). The Russian World in Moscow’s Strategy. The Russian World in Moscow’s
Strategy. https://www.csis.org/analysis/russian-world-moscows-strategy
Zimmermann, F., & Kohring, M. (2020). Mistrust, Disinforming News, and Vote Choice: A Panel Survey
on the Origins and Consequences of Believing Disinformation in the 2017 German Parliamentary
Election. Political Communication, 37(2), 215–237. https://doi.org/10.1080/10584609.2019.1686095
Appendices
Appendix A: Interview Clingendael – Danny Pronk
Date: 11-03-2021 Function: Senior Research Fellow
These are the worked-out notes of the discussion with the expert at Clingendael. The results may be used
in the thesis; everything described below reflects the content of the discussion.
During the discussion, we talked about disinformation and the influence it can have on policy-making,
policy decisions, and societies.
The first thing to consider when analysing disinformation is the targeted audience. It is essential to
distinguish between:
1. Political: policy-makers and policy decisions
The targets here are political individuals or groups whose decisions can produce an outcome
favourable to the disseminator of the disinformation. The goal is to influence the outcome of a
decision, which can lead to a favourable policy. Use can be made of “useful idiots” to disseminate
disinformation: sympathetic politicians who, sometimes unknowingly, retweet disinformation on,
for example, their Twitter accounts.
2. Society
Another target of disinformation is society. Here the focus is mainly on exacerbating polarisation,
creating favourable opinions, influencing behaviour, and eroding trust in established institutions.
The second element to consider when analysing disinformation is its effectiveness, which is difficult to
measure. When disinformation is aimed at changing opinions, the results may show up in opinion polls.
However, it is almost impossible to determine whether opinions changed because of disinformation or
because of other causes. The effectiveness of disinformation is even harder to measure in multi-party
electoral systems. In the Netherlands, for example, multiple parties have to work together to form a
government, so it is difficult to steer elections towards a strategic outcome. Two-party systems, and
referenda, are more susceptible to disinformation. Still, it remains hard to measure effectiveness because
outcomes can rarely be attributed to disinformation alone.
Despite the difficulty of measuring its effectiveness, disinformation is used globally by multiple countries.
It can be seen as one element of a larger toolset that states use to influence other states, a toolset often
described as hybrid warfare. In combination, these hybrid tools are aimed at weakening adversaries and
gaining influence. The goal is mostly to exploit existing polarisation in politics and society and to
undermine that society. Disinformation is one tool among many and adds to the hybrid toolset. Creating
confusion, and thereby a lost sense of reality among citizens, is one of the most important goals.
Subversion is the central theme of disinformation.
During the discussion, advice was also given on material to consider in developing the thesis. These
articles and theories will be examined on account of the discussion held. Moreover, the names of other
experts who would be of interest to approach for this thesis were given.
- Reflexive control theory
- Operation Infektion (related to the AIDS pandemic)
- The articles about think-tank research into the effectiveness of disinformation (sent by mail after the discussion)
- The Heuer article (sent by mail after the discussion)
Appendix B: Interview NLDA – Colonel Dr. A.J.H. (Han) Bouwmeester
Date: 10-05-2021 Function: Associate Professor of Military Strategy and Land Operations
These are the worked-out notes of the discussion with Colonel Dr. A.J.H. (Han) Bouwmeester of the
Netherlands Defence Academy (NLDA). The results may be used in the thesis; everything described
below reflects the content of the discussion.
Disinformation is a “tricky topic”. It is essential to identify what it is and when we label information as
disinformation. The topic depends on what we regard as truths, and there are levels of constructivism
involved in defining it. An important question is how to interpret truths in order to be able to define
disinformation. Truths, in this case, are subjective, and it is crucial to stay away from absolute truths
when researching the topic. As a researcher, it is essential to define your own position within the topic
and the position from which analyses of disinformation are made.
Looking at Russian disinformation, for example, it is crucial to define how the topic is seen from a
Russian standpoint and not only from a Western one. Russia interprets events differently. In the Russian
case, disinformation is used as a tool in the global struggle for power, as Russia feels threatened by
alliances such as NATO. Vladimir Putin is a realist and wants Russia to be recognised as a world power.
Moreover, Putin feels responsible for ethnic Russians living abroad, outside Russia’s borders. There are,
for example, Russian minorities in the Baltic states, which are often targeted by influence campaigns.
NATO and the West are seen as a threat from the Russian point of view, and disinformation is used to
gain more power and strength in the international domain. What we see as disinformation might be true
for Russians. It is therefore essential to understand your own position concerning disinformation and to
build background knowledge of why Russia, in this case, uses disinformation, in order to understand the
dynamics behind the topic. Positioning yourself and trying to understand the topic from a Russian
standpoint is essential to understanding disinformation accurately.
Colonel Dr. A.J.H. (Han) Bouwmeester has researched Russian influence campaigns in Crimea, Ukraine,
and Georgia. In these cases the Russians admitted their involvement, which makes it easier to write
specifically about Russian influence. In other cases it can be challenging to establish the role of the
Russians, as they deny responsibility. In Ukraine and Georgia, narratives concerning the Second World
War were used to influence decision-makers in those countries. The influence campaign was designed
around the Russian standpoint on the Second World War, which differs from Western standpoints.
To be successful, disinformation has to exploit existing emotions and prejudices. In addition, timing is
an essential element in the dissemination of disinformation: it has to appear in the right place at the right
time to be effective. Moreover, when a message is amplified, the original source disappears and people
are quicker to believe it to be accurate. Journalists, in addition, constantly scan the internet and social
media for news stories, and fake stories spread too quickly to be researched properly. News that
fascinates appears to be of greater interest to the public.
To some extent, disinformation seems to spread at a higher pace and to be accepted as accurate more
quickly than correct information, although this observation is not based on research. Moreover, in recent
years there has been declining trust in democratic institutions in the West. Distrust of established
institutions seems to be growing and provides a breeding ground for disinformation, as citizens feel
neglected by these institutions. Current developments at these institutions do not inspire more
confidence and contribute to the decline. When trust declines, people tend to believe alternative theories
and start looking for alternative explanations of everyday events. Institutions therefore have to work on
their transparency and communication and should undertake activities to increase trust.
To confront disinformation, the work of Jon Roozenbeek might be of interest for this research; research
has been done on how to debunk disinformation, and inoculation theory is of interest when studying
ways to counter it. Moreover, four debunking methods were discussed. First, algorithmic removal,
automatically deleting disinformation from the internet; this method, however, also deletes material
needed for research into disinformation and is therefore not wholly suitable. Second, confronting
disinformation through law, with the risk of suppressing fundamental human rights. Third, task forces
that confront campaigns by acting quickly when they surface. Lastly, education: making people aware of
the information environment and teaching them to assess to what extent information is reliable, a
method especially beneficial for younger generations.
During the discussion, the following sources were mentioned that might be of interest for this research:
- Militaire Spectator – information manoeuvre
- The works of Jon Roozenbeek
- Laura Starink – De schaduw van de grote broer
- Inoculation theory
- The Official CIA Manual of Trickery and Deception
Appendix C: Interview NCTV - Kerneenheid analyse
Date: 14-04-2021 Function: Analyst
These are the worked-out notes of the discussion with the expert at the Nationaal Coördinator
Terrorismebestrijding en Veiligheid (NCTV). The results may be used in the thesis; everything described
below reflects the content of the discussion.
The interviewee is an analyst in the state-actors cluster, which aims to identify and interpret threats to
national security posed by state actors such as foreign countries. One theme within this cluster is foreign
influence, which can be exercised through disinformation.
First, it is essential to draw the dividing line between disinformation and misinformation. Disinformation
is spread to cause damage or harm, with the disseminator aware of the falsehoods it contains.
Misinformation is the spreading of false information in good faith. Conspiracy theories are a case in
point: because conspiracy theorists genuinely believe the narratives, we speak of misinformation.
However, conspiracy theories are sometimes fed with disinformation.
In the last year there has been an increase in subversive theories and in mobilisation. People have
recently become more willing to mobilise on the basis of alternative narratives. In the Netherlands this
was visible when, for example, 5G transmission towers were set on fire at the same time as subversive
theories about those towers circulated. The step towards taking action seems to have become smaller.
For these alternative truths and conspiracies, a breeding ground is necessary for the disinformation to
succeed. Current disinformation in society aims to undermine and take away trust in the Dutch system,
especially in the long term. Most of the time, disinformation amplifies an already existing grievance.
Alternative narratives are most successful when they speak to people’s feelings and when people can
relate to them. Most disinformation or misinformation concerns issues related to people’s experiences in
the present moment.
Digitisation and social media have meant that ideas, narratives, and conspiracies spread faster through
societies. There is a weak signal that, in recent years, theories or narratives from the internet’s fringes,
deep down in the caverns of social media, have moved much faster to the centre and been picked up by
mainstream platforms. The more a person is exposed to a particular narrative, the more he gets used to it
and the more plausible it becomes. Micro-targeting has made it possible to aim disinformation at specific
groups of people, allowing them to relate more easily to disinformation and alternative narratives; this
increases the impact of disinformation because it tunes into their experiences and grievances more than
general targeting does. Such micro-targeting was used during the Brexit referendum by selecting the
specific groups of people on whom the disinformation would have the most impact. Deepfakes are a
current development that can add a new dimension to certain narratives.
Disinformation becomes a security issue when people act on it and their behaviour shifts towards
unlawful acts and violence. However, it is hard to prove that such behaviour is directly related to
disinformation. Here the dividing line between disinformation and misinformation comes into play, as
people act according to their beliefs. Disinformation has the most effect on winner-takes-all elections
and on referenda, which makes these cases more attractive for disinformation campaigns. Another
development is that people tend to trust their peers more than authorities. People trust others on social
media, for example, because those others share their views, while they mistrust authorities. This
development reinforces the decline in trust in authorities.
Successful treatments for disinformation can be:
1. Debunking
Quickly reacting to disinformation and debunking the story. Debunking is limited, however, as it
usually gets less traction than the disinformation itself, and there may be a lack of trust in the
authorities doing the debunking.
2. Pro-active factsheets
Creating factsheets on issues that are likely to become targets of disinformation, so that a quick
response is possible.
3. Reacting
For example, a politician featured in a story quickly responds to state that it is untrue.
4. Increasing awareness
Can be accomplished by debunking and by being transparent about disinformation cases.
5. Education
Teaching people, especially the young, to assess information and how to check its legitimacy
and validity.