Journal of International Affairs, Special Issue 2018, Vol. 71, No. 1.5.
© The Trustees of Columbia University in the City of New York
Gregory Asmolov
THE DISCONNECTIVE POWER OF
DISINFORMATION CAMPAIGNS
Abstract: This paper argues that one of the major purposes of a disinformation campaign is to sustain a discursive conflict between users of social networks. By examining the phenomenon of “unfriending,” the paper describes how disinformation campaigns sabotage horizontal connections between individuals on either side of a conflict and strengthen a state’s capacity to construct an image of an external enemy. The paper argues that horizontal connections are targeted because they have the potential to mitigate social cleavages, challenging state control over the legitimacy of a conflict narrative. Understanding disinformation campaigns as a technology for the facilitation of social polarization and the destruction of strong social ties allows us to re-think policies for addressing the role of fake news, especially in the context of a conflict. The paper highlights the need to develop tools that defend users from engagement in manipulative discursive conflict and protect cross-conflict social capital as a resource for potential conflict resolution.
In March 2018, I met the developer of the Ukrainian Fakes Radar project, Dmytro Potekhin. Dmytro was driven by the idea of developing a kind of “anti-virus” that would alert social media users to “fakes” in their news feeds. This would allow users to make their friends aware of their role in promoting fake news. I asked Dmytro if he thought people really wanted to engage with friends who distributed fakes. In some cultures, people seek to avoid political discussion to minimize risks to their social ties. Potekhin responded by saying that the struggle against fakes requires this sort of direct engagement. I asked, “But whose interest might it serve? Perhaps those state-affiliated actors that create fakes want to not only spread false information, but to destroy social ties between people.” I was suggesting that accentuating claims of fake news would only serve to exacerbate tensions among friends on a social media platform. My argument seemed to puzzle Dmytro, but his eventual response was profound: “Fake news actually kills. So perhaps that’s a proper price and some social ties need to be destroyed.” The point of this article is to describe the costs associated with this approach to fighting disinformation.
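Potekhin’s “anti-virus” idea can be pictured as a simple feed scanner. The sketch below is a hypothetical illustration, not the actual design of Fakes Radar: it matches feed items against a curated list of debunked story URLs, and every name, URL, and data structure in it is invented for the example.

```python
# Hypothetical sketch of a feed "anti-virus": flag posts whose links match
# a curated list of debunked stories. Not the actual Fakes Radar design.
from urllib.parse import urlparse

# Invented database of debunked story URLs (stored in normalized form).
KNOWN_FAKES = {
    "example-fake-news.test/crisis-story",
}

def normalize(url: str) -> str:
    """Strip scheme and query string so trivial URL variants still match."""
    parts = urlparse(url)
    return f"{parts.netloc}{parts.path}".rstrip("/")

def flag_fakes(feed_items):
    """Return the feed items whose links appear in the known-fakes list."""
    return [item for item in feed_items
            if normalize(item["url"]) in KNOWN_FAKES]

feed = [
    {"author": "friend_a",
     "url": "https://example-fake-news.test/crisis-story?ref=share"},
    {"author": "friend_b", "url": "https://reliable.test/report"},
]
flagged = flag_fakes(feed)
```

A real tool would need a maintained database and fuzzier matching, but the sketch makes the article’s point visible: detection is the mechanical part, while confronting the friends who shared the flagged items is where the social cost lies.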
Many scholars and practitioners consider counter-fake initiatives the most effective remedy for combatting disinformation campaigns. I take a different view. Developing effective policies to address the increasing role of informational fakes requires a more critical understanding of the social role of disinformation. Counter-fake initiatives could worsen the negative effects of fakes by highlighting the contest between orthogonal versions of reality. Well-intentioned anti-fake campaigns might lead to a greater fissure in social relations online, just as fact-checking, by restating a falsehood, can increase the likelihood of the falsehood’s embrace by motivated reasoners.1
This article considers how disinformation campaigns advance the political goals of state actors involved in conflict. It argues that the way fake news undermines liberal democratic institutions and norms is not only a matter of truth. It suggests that the impact of disinformation should be examined in the context of the social relationship between people who read, respond to, and share news in a situation of conflict.
The social nature of fake news
So-called fake news or disinformation is usually understood as a tool “to shape perceptions and actions of domestic and international audiences,” according to Khaldarova and Pantti. At the same time, they highlight how the purpose of fake news is also “supporting already-constructed identity claims, rather than reporting on events.”2 Bennett and Livingston suggest “caution in adopting the term ‘fake news’” and argue that the notion of “disinformation” allows more systematic investigation of the “disruptions of authoritative information flows.” They define disinformation as “intentional falsehoods spread as news stories or simulated documentary formats to advance political goals.”3

Other scholars highlight how fake news is more accurately thought of as a social phenomenon. Tandoc Jr. et al. point out that “fake news is co-constructed by the audience,” where “meanings are negotiated and shared.”4 While in the past, traditional media—as the space of news consumption—had been separated from the space where people discussed the news, the interactive nature of social networks has offered a new type of information environment where the proliferation and consumption of news is not separable from interaction around news. In this way, news is embedded in interpersonal interaction, while reading, sharing, and commenting are elements of news consumption.
A number of notions highlight the participatory nature of disinformation. This includes the notion of “peer-to-peer propaganda” as a situation in which “ordinary people experience the propaganda posts as something shared by their own trusted friends, perhaps with comments or angry reactions, shaping their own opinions and assumptions.”5 The notion of “crowdsourced information warfare” highlights how the response to disinformation campaigns relies on the digitally mediated mobilization of a crowd’s resources.6 As pointed out by Tandoc Jr. et al., “the power of fake news lies in how well it can penetrate social spheres.”7 Bakir and McStay highlight how fake news can be considered “affective content” that provokes emotions including outrage.8 Understanding fake news as an outcome and driver of interaction between users of social networks suggests shifting the focus of analyzing the role of disinformation from a specific event to the social consequences of emotional engagement. A consideration of the social media dynamics in Russia and Ukraine following the Russian annexation of Crimea illustrates this point.
Conflict-related news in social networking: The case of the Russia-Ukraine conflict
Since the Russian annexation of Crimea in 2014, the Russian internet has been filled with online battles over the interpretation of Russia’s role in the conflict in Ukraine. In many cases people decide either to unfriend each other, or at least to limit their presence on social networking websites.

“I opened my Facebook feed. I have never seen so much pain and desperation,” said a user describing her feelings following the annexation of Crimea. Another user said that every time she went online she did not want to live anymore. In light of the omnipresence of the conflict in newsfeeds, some users tried to reduce their own engagement with social networks. A well-known Russian journalist declared a self-imposed ban on writing any Facebook posts about politics in order to get some fresh air.9

While digital escapism is one of the ways people exclude themselves from news, a more common response is remaining on a platform while excluding previous friends. Following the downing of MH-17, one Facebook user wrote, “Today, I have unfriended more people than I did at the apogee of the Ukrainian crisis. Both sides. For a total loss of humanity.” In some cases, people unfriended others on the basis of their liking specific pages. For instance, one user shared that he found a Facebook page titled “Polite People. Russian Military,” and unfriended 21 friends who liked this page. At the same time, one user wrote that she was proud that she had not unfriended anyone: “So far, I haven’t deleted from my friend list any victim of this information war.” She, and others, wondered if unfriending was a proper solution, because it meant giving up on efforts to persuade others.
A research project on unfriending practices in the context of the Israeli-Palestinian conflict concluded that “unfriending was more prevalent among more ideologically extreme and more politically active Facebook users.” In addition, “weak ties were most likely to be broken.”10 In the case of the Russia-Ukraine conflict, however, evidence suggests that the conflict has had a robust impact on previously strong apolitical ties among people, including ties between relatives, classmates, and friends. One user complained that “the most unpleasant thing about this information struggle is that even the best of friends suddenly start to publish absolute trash.” Another user shared his frustration with his friend’s lack of media literacy. Friends from both sides of the conflict shared fake news, and the comments on fake news posts often turned into vitriolic exchanges. In some cases, users learned through these exchanges that they had already been unfriended. For instance, a user shared an experience of visiting the profile of an old friend and discovering the presence of a lot of Russian patriotic imagery, as well as the fact that he was no longer a friend. He concluded, “Thank God I am not a friend anymore. Goodbye.”
Online friendships among former classmates seemed especially vulnerable. One
Facebook user reported that she had unfriended two of her classmates because of
their position on the situation in Crimea. Another Facebook user shared an experi-
ence of a close friend blocking her page because of her position on the conflict. In
response, she said, “I never said anything bad to you, though it was very painful
for me to read your posts since the war started. Does it [unfriending] mean that
human relations end at the border that is crossed by tanks from your country?”
Users cut ties even with intimate relationships because of different positions on
the conflict. One user described a romantic encounter as part of her relationship
with a Facebook friend. However, she wrote that she had decided to unfriend him
following his post on the Ukrainian events.
Some users also describe seeing a transformation of the Russian-speaking
social networking environment in the shadow of the conflict. One user wrote,
“Russian Facebook is so aggressive that it seems people are divided into two
groups: friends and enemies, though there are more enemies than friends.” Another
user shared her concern: “It seems that everyone’s gone crazy. People seem to
think that unfriending someone because of his opinion is normal. It looks like an
information civil war.” One user highlighted the gap between everyday, offline life and conflictual, online space: “There is so much hatred on the Internet, and it’s so easy to get infected by it, to start to classify anyone as either friend or enemy.” Some users warned their friends: “People, take care of each other. You think you just share your negative emotions online. Actually, you’re destroying each other’s minds, you’re entering a circle of verbal violence that also manifests in offline life.”
The stories above offer an alternative view of the role of fake news. While Russian-speaking social networks are full of discussions about the Russia-Ukraine conflict, one of the most sensitive topics is how people respond to what their friends share, not the content of the news. Controversial news has a substantial impact on social structures and can lead to the destruction of even relatively strong social ties between people. The following section describes how disinformation campaigns operate to achieve this type of impact on social ties.
Social categorization and disconnective power
According to John and Dvir-Gvirsman, unfriending can be “a mechanism of disconnectivity that contributes to the formation of homogeneous networks,” or, in the words of Ben Light, a manifestation of disconnective power.11, 12 Unfriending is a process of “ingrouping” and “outgrouping,” where the subject constantly revises which individuals belong to the ingroup—his or her social group—and which are alien agents belonging to an outgroup. This process of social inclusion and exclusion relies on social categorization. Shkurko highlights that “social cognition is fundamentally categorical [and] we perceive others and regulate our behaviour according to how we position ourselves and others in the social world by ascribing them to a particular social label.”13
The distinction between “ingroup” and “outgroup” in online social networks
often relies on the positions users take with regard to a specific topic, especially
when the topic is politically sensitive or controversial. This may include personal
posts about a specific subject and the acts of sharing, liking, and commenting on
the news or posts by other users. In this light, the consumption of information on
social networks from newsfeeds relying on a circle of friends is related to the con-
stant revision of boundaries between the ingroup and outgroup. The consumption
of news on social media is a process whereby the “other” user is considered in the
context of his or her relative position on a specific issue.
The question, however, is to what extent a position on a specific issue can be
significant enough to trigger the process of inclusion or exclusion. In some cases,
users do not reconsider their close ties despite the existence of political controver-
sies, but the illustrations in the previous section suggest that positions with regard
to the Russia-Ukraine conflict are best thought of as drivers of social categoriza-
tion that provide a basis for ingroup/outgroup distinctions. The conflict shapes
the structure of classification and produces categories that differentiate between
various users. In this situation, users tend to reconsider their social ties on the
basis of positions taken with respect to the conflict. The vulnerability of apparently strong ties suggests that the role of a conflict-related classification is more dominant in shaping social structures than are the shared experiences that used to unite these groups.
Fake news contributes to the transformation of social-networking newsfeeds
into a field of discursive conflict where people engage in conflict-related commu-
nication that shapes their social circles. Disinformation campaigns that appeal to
emotion and constantly insert controversial news items into newsfeeds contribute
to the increasing impact of conflict-related social categorization on social ties.
So-called “fake news” can be a tool that drives this process of ingroup/outgroup formation.
In this light, disinformation campaigns are a manifestation of the state’s discon-
nective power. The purpose of these campaigns is not to shape people’s perceptions
about reality, but rather to dissolve horizontal ties among people by increasing the
impact of conflict-related social categorization. The constant flow of state-spon-
sored disinformation triggers and sustains the phenomenon of “unfriending” as an
outcome of conflict-dominated social categorization. Disconnective power helps
shape people’s identity through the artificial development of information cocoons.
Conclusion
The engagement of users in a constant state of online conflict can be a form of
political control by the state. As conflict becomes embedded in a structure of per-
sonal relationships, important horizontal social bonds fracture. This manifestation
of disconnective power allows a state to shape users’ individual identities by dimin-
ishing the impact of horizontal connections that threaten the state’s monopoly on
framing the conflict and challenge the state’s ability to affect perceptions of the
conflict’s legitimacy. In this way, disinformation campaigns sabotage horizontal
connections between different sides of a conflict while strengthening the state’s
capacity to construct an image of an external enemy.
Discussions on remedies to address disinformation campaigns often focus on
battles related to perceptions of reality. This paper highlights the need to shift the
focus from how events are represented to how relationships among people who
have differing versions of reality are affected. What must be protected is not the
predominance of a particular message or source, but rather the capacity of people
to distinguish between political controversies and personal ties. That said, the
externalization of conflicts from within a structure of personal relationships seems
too ambitious as a goal. Given the current reality of an information environment
that integrates general news with personal interaction, such a separation would
be artificial. Therefore, the question is how to make personal relationships less
vulnerable in the face of the digitally mediated convergence of everyday life and
conflicts. In general terms, lessening the salience of a social categorization that
relies on the position of individuals with regard to a conflict could help achieve
this goal. This suggests a need for a set of defense mechanisms that would protect
horizontal connections between those who have different opinions on a particular
controversial issue.
First, if there are online debates that pose a threat to horizontal connec-
tions, one way to avoid disconnection is to introduce a third actor able to show
those engaged in the debate that there is a way to deal with controversy without
destroying social ties. A conceptual foundation for this type of practice exists in
the field of discursive psychology, and specifically in the concept of “narrative
mediation.” Kure highlights that the externalization of conflict requires us to
develop other possible modes of relationship—new discursive fields—between the
individual and the conflict or the individual and other individuals. In this context,
the role of a narrative mediator is to “constitute a new discursive background that
does not fit into the events of the dispute and opens for less polarizing and mar-
ginalizing positioning practices.”14
A second approach argues that the protection of horizontal connections can
rely on a platform’s technical features. Unfriending or blocking can be affordances
of disconnection offered by social-networking websites, which simplify disconnec-
tion by offering a symbolic gesture that signifies the breaking of a tie between
two individuals. Offline practices of social disconnection that communicate a
symbolic meaning of unfriending, such as a refusal to shake hands when in a situ-
ation of physical proximity, seem more complicated and difficult than a simple
click. Features affording new forms of digital support for the protection of social
ties could help mitigate the simplification of digitally mediated disconnection.
For instance, a digital “yellow card” issued by one user to another could alert participants that an online discussion is entering a phase where it may threaten the social connection between them, and raise the question of whether continuing the debate is worth paying the price of “disconnection.”
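As a thought experiment, such a “yellow card” could rest on a simple escalation heuristic. The snippet below is a minimal, assumption-laden sketch: the hostility wordlist and the threshold are invented for illustration, and any real platform feature would need far more nuanced signals than keyword matching.

```python
# Minimal sketch of a "yellow card" heuristic: warn thread participants
# once a run of hostile replies suggests the tie itself is at risk.
# The wordlist and threshold are illustrative assumptions only.
HOSTILE_MARKERS = {"traitor", "enemy", "zombie", "brainwashed"}
YELLOW_CARD_THRESHOLD = 2  # consecutive hostile replies before warning

def is_hostile(comment: str) -> bool:
    """Crude lexical check for hostile framing of the other participant."""
    words = {w.strip(".,!?").lower() for w in comment.split()}
    return bool(words & HOSTILE_MARKERS)

def yellow_card(thread) -> bool:
    """Return True once hostile replies reach the threshold in a row."""
    streak = 0
    for comment in thread:
        streak = streak + 1 if is_hostile(comment) else 0
        if streak >= YELLOW_CARD_THRESHOLD:
            return True
    return False

thread = [
    "Did you see this report?",
    "Only a brainwashed zombie would share that.",
    "You are a traitor for believing them.",
]
```

The design choice worth noting is that the warning targets the relationship rather than the content: it asks whether the debate is worth the tie, instead of adjudicating who is right.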
Finally, increasing awareness of the potential risks of online communication
can enhance people’s capacity to distinguish between political controversies and
personal ties. According to Beck, the rising number of risks in modern society is
linked to the increasing role of reflexivity in addressing and handling these risks.15
An increasing reflexivity regarding the role of social media means exposing how
state actors use online disinformation to serve their political interests. Making
users more aware of the nature of disconnective power would contribute to safe-
guarding their own social circles. Strengthening people’s personal sovereignty
regarding their own social worlds, while emphasizing cross-conflict social capital as
a potentially potent resource for conflict resolution, can help combat the manifes-
tation of state sovereignty as a vertical intervention in horizontal connections.
Dr. Gregory Asmolov is Leverhulme Early Career Fellow at the Russia Institute, King’s
College London. His work focuses on how information technologies, specifically social media
and crowdsourcing platforms, constitute the role of individual users and crowds in crisis situ-
ations. Gregory’s current project titled “Participatory Warfare: the Role of ICTs in Modern
Conflicts” explores how information and communication technologies change the lives of users
with respect to a conflict zone. Gregory holds a BA in Communication and International
Affairs from the Hebrew University in Jerusalem, an MA in Global Communication from
George Washington University, and a PhD in Media and Communications from the London
School of Economics and Political Science (LSE).
NOTES
1 Hugo Mercier and Dan Sperber, The Enigma of Reason (Cambridge, MA: Harvard University Press,
2017).
2 Irina Khaldarova and Mervi Pantti, “Fake News,” Journalism Practice 10, no. 7 (2016), 893.
3 W. Lance Bennett and Steven Livingston, “The Disinformation Order: Disruptive communication
and the decline of democratic institutions,” European Journal of Communication 33, no. 2 (2018), 124.
4 Edson C. Tandoc Jr., Wei Lim Zheng, and Richard Ling, “Defining ‘Fake News,’” Digital Journalism (2017), 11, https://www.tandfonline.com/doi/full/10.1080/21670811.2017.1360143?scroll=top&needAccess=true.
5 Maria Haigh, Thomas Haigh, and Nadine I. Kozak, “Stopping Fake News,” Journalism Studies (2017), 8, https://www.tandfonline.com/doi/abs/10.1080/1461670X.2017.1316681.
6 Khaldarova and Pantti (2016), 892.
7 Tandoc, Zheng, and Ling (2017), 12.
8 Vian Bakir and Andrew McStay, “Fake News and the Economy of Emotions,” Digital Journalism 6,
no. 2 (2017), 8.
9 All quotes collected by the author.
10 Nicholas A. John and Shira Dvir-Gvirsman, “‘I Don’t Like You Any More’: Facebook Unfriending by Israelis During the Israel-Gaza Conflict of 2014,” Journal of Communication 65 (2015), 953.
11 Ibid., 955.
12 Ben Light, Disconnecting with Social Networking Sites (Basingstoke, England: Palgrave Macmillan,
2014), 955.
13 Alexander V. Shkurko, “Cognitive Mechanisms of Ingroup/Outgroup Distinction,” Journal for the Theory of Social Behaviour 45, no. 2 (2014), 188-189.
14 Nikolaj Kure, “Narrative mediation and discursive positioning in organisational conflicts,”
Explorations: An E-Journal of Narrative Practice, no. 2 (2010), 25.
15 Ulrich Beck, “The Reinvention of Politics: Towards a Theory of Reflexive Modernization,” Ulrich
Beck, Anthony Giddens, and Scott Lash, eds., Reflexive Modernization: Politics, Tradition and Aesthetics in
the Modern Social Order (Stanford, California: Stanford University Press, 1994), 1-55.
... Disinformation thus removes the accuracy of public debates and affects the quality of communication between policy makers and citizens, eroding trust and deliberative capability. Research on social media unfriending shows that disinformation campaigns may target connectors and erode trust between different groups in society (Asmolov 2018). Disinformation messaging often tends to be emotionally charged, as frustration and anger increase people's propensity to buy into the messaging and to unknowingly diffuse false content through social media (Horner et al. 2021;Chuai and Zhao 2022). ...
Article
Full-text available
The aim of this study is to explore civic literacy as an approach to counter disinformation in democracies. From group interviews, we elicit, categorize, and analyze diverse perspectives on disinformation in Sweden, previously upheld as a country with high civic literacy levels. We focus on people’s understandings of disinformation, their assessment of their own abilities to discern disinformation, and their ideas about how increased resilience to disinformation could be achieved. Our findings, based on input from 73 interviewees across Sweden, suggest that shared basic knowledge on disinformation is lacking. Moreover, there is a related weak understanding of what constitutes authentic information. Those with low awareness operate on a logic of beliefs, implying that measures to improve factuality and objectivity could not even be aspired for. Still, there are also constituents showing advanced understandings. The majority of respondents call for new measures to strengthen citizen knowledge and skills and generate many proposals to that end. Our results indicate that citizen competence needs to increase considerably to keep up with the rapidly evolving disinformation environment. A concerted drive to boost citizen knowledge and skills, tailored to different constituencies, is needed for the democratic system to work as intended.
... Moreover, it is known that misinformation weakens social bonds and divides people into increasingly isolated online political communities. This situation indicates that in Turkey, fake social media posts have turned into radicalized expressions based on either incomplete or intentionally misleading information (Asmolov, 2018). Examined in the context of the Russia-Ukraine war, fake social media posts in Turkey have evolved into radical expressions viewing the war from a perspective of "taking sides." ...
Article
Full-text available
This study aims to analyze the fake posts circulated on Turkish social media during the Russia–Ukraine war. With advancing technology, social media platforms have a profound impact on the way we perceive and interpret events and make us question the accuracy of information generated about international events such as wars. While the Russia–Ukraine war constitutes an important turning point in international relations, the reflection of these events on social media is also seen in fake posts. In this context, the main purpose of this study is to identify the common themes of fake social media posts and to reveal the general context of these posts on social media. In addition, the study aims to analyze the fake content circulating on Turkish social media and to reveal the emerging polarized discourses through the identified themes. The research revolves around five main themes that feed polarization: war reporting, ideological misrepresentation, humor, hate speech, and conspiracy theories. The findings show that fake content is particularly concentrated around ideological polarization and antagonisms. It was also found that misinformation and decontextualized humor blurred the true context of the war and that fake content combined with hate speech and conspiracy theories distorted the context of the war.
... More specifically, we must consider the role of the end recipient of the fake or misleading content, the audience, and their contribution to spreading said content (Tandoc et al. 2017). Instead of acting as passive recipients, audiences play an important role in disseminating content related to information disorders, to such an extent that some authors defend a certain participatory nature of disinformation (Asmolov 2018;Wanless and Berk 2017). ...
Book
Full-text available
The emergence of generative artificial intelligence applications has triggered great interest in all areas of media studies. Its presence seems definitive and, although its rapid evolution makes it difficult to anticipate the way in which it will affect the media, it seems clear that it will transversally affect all its areas, from the production to the distribution of contents. The application of AI is perceived, on the one hand, as an opportunity for the media to reduce production costs that facilitate their viability or as tools that free journalists from more mechanical tasks. But, on the other hand, there is also a perceived risk of increasing the precariousness of journalists' work and reducing the number of jobs. For audiences, having this new technology at hand is accompanied by new opportunities, such as greater personalization of content, but also threats, such as the presence of biases, the increase in existing gaps or an increase in the risk of misinformation.
... In their study of memory narratives about commemorations of the October 1917 revolution in Russian online publics, Litvinenko and Zavadski (2 020) came to the conclusion that the Russian state promoted different narratives on the same topic for various target groups, and the resulting patchwork of narratives with `memories on demand' (Litvinenko andZavadski, 2020, p. 1657) fostered societal fragmentation. Asmolov (2018) explored the practice of unfriending on social networks as an effect of disinformation campaigns of the Russian state, and suggested that disruption of horizontal ties -disconnection -was one of the main goals of propaganda, helping the state construct the image of an external enemy. ...
Article
Full-text available
The ongoing Russo-Ukrainian War is generating an astonishing amount of information, much of which is being transmitted through social media, allowing events to be observed and followed almost in real time. The impression is that in the initial weeks and months in the 2022 phase of this war, media and social media was much less filtered than it is now. This paper documents a curious aspect of this war observed at that time, namely the open and public display of photos and identity documents of some Russian and Ukrainian combatants, deceased or captured , on select websites or social media platforms. Focusing primarily on April 2022 cases, this paper also reflects on the possible purpose or value of such public exposure, including privacy-related issues, and its role in a parallel information and psychological warfare.
Chapter
Full-text available
Given growing concern about the rapid spread of the novel coronavirus, nations worldwide have witnessed an unprecedented impact on their educational systems. This chapter examines the concerted efforts and measures educational boards have taken to run the academic calendar in a completely virtual manner. There is an urgent need to train and equip teachers and students to use information and communication technology (ICT) and its assorted tools, with all their intricacies and exploding potential, and this calls for expert tutelage in online teaching and learning for both staff and students. A common misconception holds that the teaching skills, infrastructural facilities, student competencies, and educational pedagogies needed for a physical classroom are sufficient to conduct online classes successfully. This misconception needs to be reconsidered in light of the fact that online tutelage differs fundamentally from physical classroom teaching: the two operate at different levels, the physical world and the virtual world, which cannot be navigated by the same route. Beyond these fundamental concerns, the discussion also addresses mental preparation, staff readiness, confidence in learning the new pedagogy, student accessibility, and motivation to embrace ICT-integrated learning. Because online tutelage is multidimensional, it demands a separate array of knowledge and skills from teachers, acquired through expert training that upskills their teaching abilities. This study focuses on familiarizing staff with the new online mode, having them trained by professional ICT experts, and overcoming emotional barriers.
The shift brings unremitting challenges: poor network coverage, limited mobile data, inability to buy new devices, and the health and mental strains on students and teachers racing to complete the syllabus within a stipulated time frame. The study proposes plausible solutions to mitigate the looming fear of online teaching and to make it a routine part of instruction. In sum, the teaching fraternity faces a herculean task in stabilizing around the newly embraced norm: continuing to disseminate information in a very new way, becoming adroit with technology, and mastering technology rather than being mastered by it. Online teaching tools do provide a plethora of opportunities to be innovative, creative, productive, and resourceful.
Social networks have emerged as a new large-scale field of operations, in which many actors exploit their immediacy to influence others and manipulate perceptions of particular topics. This article presents an up-to-date literature review of the main social networks used by the People's Republic of China, both domestically and internationally, to carry out propaganda activities. To frame the question, it provides a historical context for the country that acts as the trigger of its modus operandi. The research finds that China employs three types of disinformation campaigns, applying cognitive-domain operations to reinforce a positive Chinese narrative, with a focus on the "thought management" of the next generation; these campaigns are carried out through a complex network of institutions dependent on the Chinese Communist Party and, in turn, on the military.
Many democratic nations are experiencing increased levels of false information circulating through social media and political websites that mimic journalism formats. In many cases, this disinformation is associated with the efforts of movements and parties on the radical right to mobilize supporters against centre parties and the mainstream press that carries their messages. The spread of disinformation can be traced to growing legitimacy problems in many democracies. Declining citizen confidence in institutions undermines the credibility of official information in the news and opens publics to alternative information sources. Those sources are often associated with both nationalist (primarily radical right) and foreign (commonly Russian) strategies to undermine institutional legitimacy and destabilize centre parties, governments and elections. The Brexit campaign in the United Kingdom and the election of Donald Trump in the United States are among the most prominent examples of disinformation campaigns intended to disrupt normal democratic order, but many other nations display signs of disinformation and democratic disruption. The origins of these problems and their implications for political communication research are explored.
This paper examines the 2016 US presidential election campaign to identify problems with, causes of, and solutions to the contemporary fake news phenomenon. To achieve this, we employ textual analysis and feedback from engagement, meetings, and panels with technologists, journalists, editors, non-profits, public relations firms, analytics firms, and academics during the globally leading technology conference, South by Southwest, in March 2017. We further argue that what is most significant about the contemporary fake news furore is what it portends: the use of personally and emotionally targeted news produced by algo-journalism and what we term "empathic media". In assessing solutions to this democratically problematic situation, we recommend that greater attention be paid to the role of digital advertising in causing, and combating, both the contemporary fake news phenomenon and the near-horizon variant of empathically optimised automated fake news.
When faced with a state-sponsored fake news campaign propagated over social media, in a process we dub “peer-to-peer propaganda,” a group of volunteer Ukrainian journalistic activists turned fact checking into a counter-propaganda weapon. We document the history of StopFake, describe its work practices, and situate them within the literatures on fact checking and online news practices. Our study of its work practices shows that StopFake employs the online media monitoring characteristic of modern journalism, but rather than imitating new stories it applies media literacy techniques to screen out fake news and inhibit its spread. StopFake evaluates news stories for signs of falsified evidence, such as manipulated or misrepresented images and quotes, whereas traditional fact-checking sites evaluate nuanced political claims but assume the accuracy of reporting. Drawing on work from science studies, we argue that attention of this kind to social processes demonstrates that scholars can acknowledge that narratives are socially constructed without having to treat all narratives as interchangeable.
The crisis in Ukraine has accentuated the position of Russian television as the government's strongest asset in its information warfare. The internet, however, allows other players to challenge the Kremlin's narrative by providing counter-narratives and debunking distorted information and fake images. Accounting for the new media ecology through which strategic narratives are created and interpreted, this article first scrutinizes the narratives of allegedly fake news on Channel One, treating the fabricated stories as extreme projections of Russia's strategic narratives, and the attempts of the Ukrainian fact-checking website Stopfake.org to counter the Russian narrative by refuting misinformation and exposing misleading images about Ukraine. Second, it analyses how Twitter users judged the veracity of these news stories and contributed to the perpetuation of strategic narratives.
This article explores Facebook unfriending during the Israel–Gaza conflict of 2014. We suggest that politically motivated unfriending is a new kind of political gesture. We present an analysis of a survey of 1,013 Jewish Israeli Facebook users. A total of 16% of users unfriended or unfollowed a Facebook friend during the fighting. Unfriending was more prevalent among more ideologically extreme and more politically active Facebook users. Weak ties were most likely to be broken, and respondents mostly unfriended people because they took offense at what they had posted or disagreed with it. Although social network sites may expose people to diverse opinions, precisely by virtue of the many weak ties users have on them, our findings show these ties to be susceptible to dissolution.
People use social categories to perceive and interact with the social world. Different categorizations often share similar cognitive, affective, and behavioral features, which suggests a hypothesis of common representational forms of social categorization. Studies of social categorization often use the terms "ingroup" and "outgroup" without a clear conceptualization of those terms. I argue that the ingroup/outgroup distinction should be treated as an elementary, relational, ego-centric form of social categorization based on specific cognitive mechanisms. Such an abstract relational form should produce specific effects irrespective of the nature of a particular social category. The article discusses the theoretical grounds for this hypothesis as well as empirical evidence from behavioral and brain research. It is argued that what is commonly termed "ingroup" and "outgroup" can be produced by distinct cognitive operations based on similarity assessment and coalitional computation.
Nikolaj Kure, "Narrative mediation and discursive positioning in organisational conflicts," Explorations: An E-Journal of Narrative Practice, no. 2 (2010), 25.