Disinformation is Everywhere. Why Should We Change Our Perspective on This Phenomenon?
Katarzyna Bąkowicz
SWPS University Warsaw, Poland
kbakowicz@swps.edu.pl
ORCID: 0000-0001-6365-2696
Abstract: Disinformation is a complex phenomenon, although we associate it mainly with politics and the media. Its tools and consequences can be seen in various areas of social life. Understanding the nature of disinformation, its mechanisms, and the possible ways of counteracting it is necessary to grasp the phenomenon fully. Campaigns targeting minority groups, manipulation in business, falsified health messages, tools of political propaganda, and pseudoscience are examples of disinformation that reveal it in a broad perspective, and only such a perspective makes it possible to develop immunity to fake news.
Keywords: disinformation, misinformation, fake news, manipulation, resilience
1. Introduction and methodology
Most of us associate disinformation only with the media and politics, and rightly so: there are numerous examples of fake news and manipulation in these areas. However, we need to broaden our perspective and recognize that disinformation is a more complex phenomenon that can also be found in unexpected places. This broader perspective can help us understand disinformation and build resilience against it. When we recognize the connections between disinformation and stereotypes, understand the tools used by its creators, and are aware of its consequences and manipulation techniques, we can change our beliefs and behaviors. We must be vigilant for disinformation everywhere, but we need not be afraid of it if we are aware of it and know how to neutralize it.
In this article, I will try to provide a different perspective on disinformation. I will start by discussing stereotypes and exclusion, and then examine disinformation in the health sector, the media, and politics, as well as its serious consequences in business and the prevalence of fake news in science. By highlighting the various contexts in which disinformation can occur, we can start to consider strategies to counteract it. My goal is to show why a broader perspective is essential to understand disinformation and to build a resilient attitude towards it. If we want to fight fake news and manipulated information, we need a deep understanding of the problem in order to find a remedy for it. The sources I analyzed are international studies and reports on disinformation as well as the literature on the subject.
I believe that the solution lies in fostering a responsible communication system that promotes critical thinking,
encourages breaking out of filter bubbles, and advocates humility in the face of ignorance. These approaches
can help all of us in the fight against disinformation.
2. Information disorder
Information disorder is the overarching concept that encompasses the phenomenon of disinformation, which is
characterized by chaos and the difficulty in distinguishing between real and manipulated content.
Disinformation, misinformation, and malinformation are three perspectives from which we can understand the
impact of information disorder. Disinformation, in particular, is the concept most commonly associated with
deliberate misleading of message recipients. It can be motivated by various factors, such as financial gain,
political influence, societal polarization, or simply the creation or deepening of information chaos.
Disinformation is consciously distributed for specific purposes defined by the sender, often using channels that
accelerate its dissemination and expand its reach to a wide audience. This is why the internet, especially social
media, is a common medium for disseminating disinformation, as it can rapidly spread information without
proper verification. Unlike conveying a message through lies, disinformation involves deliberate concealment or
distortion of facts to create false beliefs in recipients. It is a strategic and calculated action aimed at achieving
specific effects, involving all components of the information process. Understanding the intentionality behind
disinformation is crucial to comprehending its nature and impact (Gans, 2004; Wardle, 2020).
The second type of information disorder is misinformation, which occurs when the sender unintentionally
provides the recipient with manipulated information. Often, the sender shares this information in good faith,
such as when they learn something important and want to share it with friends or family. They may write a post
on a social media platform and publish it, resulting in several hundred people receiving the information in a
short amount of time. However, the author often does not verify the truthfulness of the information before
publishing it, and if the message is sensational or emotionally charged, it is likely to spread widely on the web,
misleading recipients. Even though the content is shared in good faith, it can still cause social harm (Wardle,
2021).
It may seem that fact-checking is the solution to this problem. However, the third type of information disorder,
malinformation, demonstrates that fact-checking alone is insufficient. Malinformation involves true information
that is transmitted with the intention of causing harm, such as by evoking specific, usually negative emotions.
As a result, the recipient receives a factual message, but their emotional response to it may differ from what it would have been had the information been presented in a neutral or appropriate context. Malinformation is often used in political
struggles to discredit a social group or individual, portraying them in a negative light or undermining their social
standing. In theory, the information may be true, but in practice, the effects are similar to disinformation. This
is a dangerous phenomenon because it may not be detected by systems that identify disinformation content on
social media. Content that is factually accurate and verifiable may not violate community standards, and therefore cannot be removed on that basis. The impact of malinformation is serious and often more
dangerous than the previous two types of information disorder, as it is difficult to track and eliminate. The
criteria for detecting context manipulation are not as transparent as identifying false information in content,
and the social impact is significant as it is aimed at harming specific members of society (Wardle, 2020).
The basic tool of disinformation is fake news, which encompasses the space between truth and lies. Importantly,
this term refers to information that is manipulated and falsified, but not entirely false, as it always contains some
element of truth (Bąkowicz, 2023). It is defined as the act of conveying a message by concealing information to
lead someone to form a false belief about a given topic (Elliott, Culver, 1992), and as a phenomenon that
contributes to political and social trends in 21st century societies (McNair, 2018), involving distorted signals
unrelated to the truth or part of psychological warfare based on manipulating public opinion (Aldwairi, Alwahedi, 2018).
All of these phenomena can occur due to the existence of information bubbles. This phenomenon consists in the preference to surround oneself with people who share similar beliefs, while rejecting those who hold different views and content that does not align with one's worldview. As a result, recipients of information avoid
conflicting messages, intellectually isolating themselves from their environment. This leads to a self-
perpetuating cycle of opinions, where people are locked into echo chambers, repeating the same content and
surrounding themselves with information that confirms their preexisting beliefs. Over time, they may perceive
everyone as saying and thinking the same thing, further reinforcing their beliefs and contributing to polarization
and social divisions (Michelucci, 2013). This echo chamber effect fosters extremism, as individuals within these
bubbles become less critical and more unified in their beliefs, even if they are based on false information
(Szpunar, 2014). Consequently, individuals may avoid expressing differing opinions within their social group for
fear of ostracization or exclusion, leading to a phenomenon called the spiral of silence, where individuals prefer
to remain silent to maintain their perceived social safety (Noelle-Neumann, 1974).
The complexity of information disorder shows how complicated the modern information ecosystem is. Manipulated or falsified content is easy to encounter and can cause damage in many areas of social life.
3. Areas of disinformation
3.1 Minorities
More than half of the surveyed Poles have encountered the phenomenon of disinformation, nearly a quarter say that they come into contact with manipulation every day or several times a week, and half point to politicians as the creators of disinformation, which is the area we most often associate the phenomenon with (Badanie NASK). However, politics is not the only area in which disinformation operates: distortions, manipulations, and half-truths can be found in many other spheres of social life. And although they are not always visible, their consequences are clearly felt.
The first area marked by disinformation is minority communities, such as LGBT+ people, migrants, Roma, and Jews. There is no community that does not oppress and discriminate against its minorities. Here, disinformation correlates with the phenomenon of stereotyping, which divides people into those who belong to a given social group and those who should be excluded from it as not conforming to its idealized version (Girard, 1987). Organized disinformation campaigns against LGBT+ people
continue. According to European Union research, representatives of this minority are among the main targets
of disinformation attacks, and pro-Kremlin organizations are responsible for their creation and implementation.
Most often, disinformation messages contain a narrative that portrays LGBT+ as a new form of colonization of
the West, a threat to the traditional family model, the safety of children and youth, and the natural order related
to procreation. This is particularly visible in the countries of Eastern and Central Europe, where public debate
creates fertile ground for the implementation of such messages. As a result, 30 percent of people from this
minority admit that they have experienced harassing comments, threats, acts of aggression, or other similar
incidents. The authorities are often an active party in such activities, for example by creating so-called LGBT-free zones, as is the case in Poland, Bulgaria, Finland, Greece or Spain (Strand et al., 2021).
Disinformation publications also heavily impact migrant communities. According to the EUvsDisinfo study, the amount of manipulated information in this thematic area is twice as high as in other areas. The basic narrative
is that migrants pose a threat to European culture and identity, supported by stories of various European cities
allegedly abandoning Christmas traditions or children being forced to pray to Allah in schools. The narrative of
Islamization is prominent in many publications, suggesting that Muslims will soon outnumber Europeans,
depriving the natives of jobs, housing, and social privileges. Another narrative portrays migrants as a criminal or
terrorist threat, with a particular emphasis on rape. While messages about migrants as an economic threat are
weaker and less radical than in previous years, they are still present. In Poland and Italy, migration is also
presented as an element exacerbating tensions between EU countries. This contributes to an increase in
aggressive behavior towards members of these minorities and intensifies their isolation; in order to feel safe, they often close themselves off in ghettos instead of integrating with representatives of other
European cultures (The impact of disinformation…, 2021).
Manipulated information messages also target people of non-white skin color. Despite efforts to combat racism,
its manifestations are still evident, especially in areas such as education, employment, housing, healthcare, and
public services. Media and social messages about individuals with non-white skin color often perpetuate
negative beliefs and behaviors towards them. They are more harshly judged by the courts, receive higher
penalties for crimes more frequently, and are often associated with criminal behavior and aggression (Champa,
2021). In news coverage, they are portrayed as more dangerous when they become the subjects of daily news,
often overrepresented as criminals and underrepresented as victims or contributors. This trend is also observed
in music videos and cultural products targeted at young people. As a result of such messages, stereotypes are
reinforced, and exclusionary beliefs are intensified, leading to a diminished quality of life for individuals
belonging to these groups (Media portrayals...).
Furthermore, representatives of the Roma and Jewish populations face a significant disinformation narrative.
Messages about them are rooted in contempt, attributing to them low social status, dishonesty, a lack of ethics, or socially unacceptable behavior. Anti-Roma narratives are included in the political programs of populist nationalist parties, such as those in Hungary, Slovakia, and the Czech Republic. Similarly, anti-Jewish narratives are propagated by politicians in Poland, such as members of the Konfederacja party. As a consequence of disinformation, acts of social lynching occur, such as the 2019 attacks on two Roma communities in the suburbs of Paris (Banaji, Bhat, 2019).
Disinformation messages rooted in stereotypes contribute to exclusion. Despite discussions about equality and its importance in modern societies, these are unfortunately not isolated incidents but organized campaigns targeting specific groups.
3.2 Health
Health is another critical sphere of our lives that is impacted by disinformation. During the COVID-19 pandemic,
we faced not only disinformation but also an infodemic, where an overwhelming amount of false information
posed a threat to the stability of the information ecosystem. Speculations about the origin and sources of the
virus emerged, accompanied by various conspiracy theories, ranging from claims that the virus was planned by
the Rockefeller Foundation, to suggestions that it was a means to solve overpopulation, or that COVID-19
vaccines contained electronic tracking chips. Tragically, some individuals died after consuming chlorine or
refusing treatment based on unfounded beliefs in self-healing through willpower. The consequences were not
limited to health alone, as research indicates changes in trust towards government, particularly among the
younger generation during the pandemic (Trust in government..., 2020). Skepticism about vaccines also
increased due to media reports about adverse effects. People began to fear vaccines more than the potential
complications of the infection itself (Belanger, 2020).
The disinformation surrounding the Blue Whale Game, which was ultimately revealed to be a global hoax, had
serious consequences. In 2017, when countries around the world were grappling with the health and well-being of young people, the story of a mysterious game that allegedly drove teenagers to suicide, a narrative first published by a Russian journalist, gained widespread attention. The tasks attributed to game administrators, which were never
substantiated, circulated on the internet and served as a classic example of a disinformation narrative amplified
by the media. According to the narrative, young people were supposed to find their "guardian" by using a specific
hashtag, who would then guide them through a series of tasks culminating in suicide. At that time, questioning the existence of the game required great courage, as belief in it was widespread and rarely challenged. Tragically, the consequences were all too real, as teenagers in Poland and around the world
engaged in self-mutilation and attempted suicide, with some cases resulting in tragic loss of life (Bąkowicz,
2020).
Similar consequences are faced by victims of disinformation about vitamin C and its alleged ability to cure cancer. In Poland, Jerzy Zięba is a prominent figure behind this false narrative of a miracle cure, offering ethically
questionable treatments for seriously ill patients based on falsified data and research. He capitalizes on the
social trend of declining trust in the medical community (Poland ranks among the lowest in Europe in this regard)
and combines it with a well-crafted conspiracy theory centered on a global population reduction plan (Pluta, Pseudonauka...). Similar patterns can be observed at CHIPSA, the Centro Hospitalario Internacional del Pacifico
in Mexico, which claims to offer integrative cancer treatment. Unfortunately, this type of disinformation is highly
dangerous, as evidence has shown that some medical services offered through manipulated messages have
resulted in negative health effects, permanent damage, or even death for patients (Ohlheiser, 2021).
When discussing the areas impacted by disinformation in the health sector, it is important to highlight the beauty
sector, which boasts a market value of $250 billion and is rife with half-truths and misleading content. A
prevalent form of disinformation in this sector revolves around the cult of beauty, where idealized bodies with
smoothed and slimmed appearances are depicted through graphic programs. Pseudo-experts or even actors
posing as doctors often recommend various supplements, perpetuating such disinformation. This has
detrimental effects on trust in doctors and science as a whole, leading to a rejection of expert knowledge in
favor of destructive disinformation messages that harm society at large (de Regt et al., 2019).
3.3 Business
Unfortunately, the realm where disinformation has been proliferating rapidly in recent years is the business
sector. Its impact is particularly significant in the stock market, where disinformation messages can influence
share prices and, consequently, the overall value of a company. Such actions are feared by nearly 60 percent of
large companies in the United States (Global fraud, 2020).
Businesses can play two roles in disinformation: that of creator and that of victim. In the former case, we can observe what is known as "washing", which involves exploiting social phenomena to enhance a company's image.
Companies create disinformation messages that purportedly inform the public about non-business activities for
a specific group or area, while the reality may be different. Often, disinformation in the business sector pertains
to environmental issues, known as greenwashing, where false claims are made about an entity's commitment
to environmental protection, when in fact the opposite may be true. Market research clearly demonstrates how
greenwashing impacts consumer behavior and leads to erroneous decisions. The scale of this phenomenon is significant: a study conducted by the consulting agency TerraChoice found that, out of over 1,000 products promoted as environmentally friendly in North American supermarkets, only one was truly free from greenwashing (TerraChoice, 2007).
Pinkwashing, also referred to interchangeably as rainbowwashing, involves ostensible activities that claim to support non-heteronormative individuals. In reality, however, it is a strategy used to shift the focus away from overt discrimination, exclusion, or violence, hidden behind the guise of inclusive rhetoric. The concept of LGBT+
pinkwashing was coined in 2011, in reference to the Israeli government's public relations activities related to
homonationalism and the exploitation of sexual minorities to justify racism and xenophobia (Abdelmoez et al.,
2022).
Similar to pinkwashing, sportwashing is a set of practices employed by individuals, companies, or countries to
repair or enhance their reputation using sports. The most common form of sportwashing is organizing sporting
events, ranging from local tournaments to world championships, or owning a sports team or club, such as Paris
Saint-Germain, which is owned by a subsidiary of a Qatari sovereign wealth fund and has faced criticism for
unethical actions (Fruh et al., 2022).
With the outbreak of the war in Ukraine, a new phenomenon called warwashing emerged, which involves feigned
aid activities and a disparity between declarations and actual aid provided to war victims. This may include
changing logos to Ukrainian colors or claiming withdrawal from the Russian market despite lacking actual plans
to do so (Korcz, Podus, 2022).
The business environment is often targeted by disinformation propagated by trolls, profiteers, and foreign
entities. Trolls are individuals or groups that engage in anti-social behavior online, particularly in places where
discussions take place. They launch disinformation attacks in the form of entertainment or hate speech
publications, with the aim of destabilizing discussions, misleading other participants, and tarnishing the
reputation of companies. Trolls generate up to 12 times more content, often using colloquial language and
profanity, and they respond to almost every comment that references their statements (Musiał, 2017).
Profiteers, on the other hand, profit from the distribution of disinformation through business speculation. Their
actions can weaken companies and their position in the industry by spreading messages about alleged crises or
the unprofitability of entire sectors. These messages can take the form of single statements or organized
campaigns against competing entities. In the most dangerous form, disinformation is propagated by foreign
entities, where the level of organization is even higher and involves the participation of governments and
authorities in disinformation campaigns. Such organized groups have been identified in 48 countries, most
notably in countries that claim global dominance. This type of disinformation is particularly dangerous as it is
challenging to track and carries serious risks for businesses, including potential collapse (Nemr, Gangware,
2019).
3.4 Politics
The connection between politics and disinformation goes back centuries. One of the earliest known examples of disinformation in a political campaign dates back to the conflict between Octavian and Mark Antony, in which Octavian spread disinformation about his rival by means of messages minted on coins.
In modern times, Donald Trump was the first prominent politician to widely use the terms "disinformation" and
"fake news." He first used these terms in a Twitter post in December 2016, and by the time of the first
presidential debate, he had repeated them 1,906 more times. Data from The Washington Post's Fact Checker database revealed that Trump made an average of 23 false or misleading claims per day, creating what the
media referred to as a "tsunami of untruth" that influenced political discourse. In 2019, Trump even claimed
credit for coining the term "fake news." Trump's extensive use of disinformation during his election campaign,
particularly on Facebook and Twitter, contributed to the impact of these platforms on the political beliefs of
their users (Woodward, 2020). The disinformation narrative amplified by Fox News further reinforced false beliefs, leading to the attack on the U.S. Capitol in January 2021, which resulted in the loss of five lives.
Brexit was a significant consequence of political disinformation, with the narrative of Euroscepticism playing a
crucial role on the Twitter platform. An analysis of 7.5 million tweets revealed that users supporting the United
Kingdom's departure from the European Union were not only more numerous, but also more active compared
to other users. They posted more tweets containing anti-EU content, shared links to press publications that
aligned with their beliefs, and engaged with like-minded individuals. Bots also played a role in amplifying this
narrative, with less than 1 percent of accounts generating over a third of the content. More than 13,000 automated accounts were engaged, to varying degrees, in disseminating such content. The main figures in this campaign were Boris
Johnson and Nigel Farage, who created and spread messages across the network (Hoeller, 2021).
The tragic effects of political disinformation can be observed in the events in Ukraine. There are four main trends
of disinformation in this context: government propaganda, conspiracy theories, context manipulation, and
emotional manipulation. Pro-Kremlin messages distort reality and aim to portray Russia as a victim of Western
aggression, such as the alleged threat of NATO enlargement. State-owned media only disseminate information
that aligns with the political interests of Russia's ruling groups. Paid trolls also play a role, writing thousands of
comments online and spreading the pro-Russian narrative across various social media platforms (Paz, 2022).
The Ukrainian side also utilizes disinformation as a weapon of war: President Zelensky has 1.4 million followers on Telegram, and the content he shares with Ukrainian society is aimed at boosting morale, often adopting a heroic tone (Demagog, 2022).
All members of society are affected by political disinformation, as it undermines freedom of thought, human rights, the right to privacy, the right to democratic participation, and various economic, social, and cultural rights.
Moreover, it diminishes indicators of democratic quality by eroding trust in the independence of democratic
institutions, disrupting elections, and undermining confidence in their fairness. Additionally, it contributes to
social polarization, exacerbating existing divisions (Eurobarometer, 2022).
3.5 Media
The media hold a significant, often dominant, position in the social structure. As their main role is to provide information, they shape attitudes and influence opinions and decisions. Despite the ethos of journalistic integrity, media outlets are also commercial enterprises in which issues such as profits and ownership relations
come into play. Moreover, the phenomenon of convergence, which has blurred genres and information
distribution channels, has made the media system increasingly complex. Consequently, it is not difficult for
disinformation to infiltrate, often taking the form of manipulating quotes or changing context, which has an
immediate effect on the perception of content. For instance, a notable example is the case of Anthony Fauci,
the White House's chief medical adviser, whose interview excerpt in The New York Times was manipulated to
create the impression that vaccines are ineffective. However, reading the entire text reveals that the doctor was referring to the ineffectiveness of vaccines administered with too long an interval between doses. Media disinformation can also be created through false narratives built on distorted facts, which are challenging to detect and rectify because they require thorough analysis and fact-checking, which recipients of the content usually do not undertake. For example, in the debate about so-called pushbacks, in which migrants who have crossed onto the Polish side of the border are forced back to Belarus, the head of the National Security Bureau, in an interview with Radio Plus, justified the unauthorized conduct of Border Guard officers by citing an inapplicable judgment of the European Court of Human Rights. Another form of media disinformation is the generation of false messages, which often occurs on internet portals through the hacking of pages or profiles, leading to the creation of misleading narratives (Digital Poland, 2022).
In addition, disinformation is facilitated by clickbait, which is increasingly employed in the media, particularly in
mainstream outlets. Clickbait is a short, attention-grabbing form of journalistic material that focuses on generating popularity for the content rather than delivering quality information. While clickbait is often viewed as a tool for selling content rather than intentionally deceiving or manipulating the audience, it is hard to ignore the possibility that this marketing activity contributes to the dissemination of fake news (Alves et al., 2016). The publication of clickbait can itself be considered fake news, as it misleads the audience through headlines that are inconsistent with the actual content.
The consequences of media disinformation are numerous and grave. First, it further diminishes the already low level of social trust. In Poland, only 20 percent of the media are perceived as independent, leading a growing number of people, especially the young, to rely increasingly on social media. The changing nature of journalism itself does not help either, particularly the emergence of media workers who perform tasks similar to those of journalists but are not bound by their ethical codes and standards. This affects both the volume of disinformation in the media and the industry as a whole.
3.6 Science
Unfortunately, science is not immune to misinformation, despite being expected to uphold the highest standards
as a model for all other publications and content. One form of misinformation in science is pseudoscience, which
relies on simplistic reasoning and selectively accepts only favorable results while disregarding data that do not
support the original thesis. It often refers to concepts lacking scientific justification or combines them with
scientific definitions and publications from unknown or unrecognized sources, relying on isolated and ambiguous
experiments (Żukowska et al., 2021).
Pseudoscience is closely related to fake science, which distorts reality by presenting facts as non-facts and
presenting pseudo-evidence as credible evidence. Fake science is the result of intentional attempts to introduce
false or unconfirmed disinformation claims into the scientific discourse, often perpetuated by scientists
themselves. This can create a false belief about the truth or probability of certain claims, leading other
researchers to rely on them as a basis for their own studies and analyses. Fake science comprises theories and
claims that may appear similar to scientific theories in terms of writing, methodology, and argumentation, but
lack empirical confirmation. It can be seen as an imitation of science. Unlike pseudoscience, fake science employs
a reproducible methodology and is based on true claims (May, 2019).
Research conducted in 2021 by Else and Van Noorden indicates that fake science is not limited to isolated cases,
but often occurs as an institutionalized form of forgery, aided by predatory journals that prioritize profits from
publications over the dissemination of high-quality content. For example, researchers from China have been
found to create low-quality content in order to receive undeserved accolades. In 2020, over 1,400 suspect scientific articles submitted by Chinese researchers were identified in biomedical journals. Although nearly 400 of these articles were retracted after their misinformation was exposed, around 1,000 of them still remain in
circulation (Turek, 2022). A similar situation has occurred with publications from Italian researcher Alfredo Fusco
and his co-authors in the field of oncology, where between 2013 and 2018, 22 articles were retracted and 10
corrections were published. Nevertheless, eight of the retracted articles have received 71 citations on Google
Scholar since their retraction (Bucci, 2019).
It is not difficult to highlight the negative consequences of disinformation in science. In addition to the loss of
trust in representatives of this field, building on falsified or manipulated content carries a high risk. Even if the content is only partially false, the social effects remain negative. Science should remain a bastion of truth
and reliability, serving as a reference point for creating a reality based on facts.
4. Tools to promote resilience
Viewing the phenomenon of disinformation across a broad spectrum allows for the development of a resilient attitude towards manipulated content, which is necessary to navigate the information ecosystem. This should
begin with cooperation and cohesion among different social groups, sectors, and governments. The voices of
entities such as the European Union or the United Nations are increasingly encouraging agreements and joint
activities in this field. Only through mutual support and the creation of solutions to improve the fight against
disinformation can we have a chance of success in the face of its rapid proliferation. Building a culture of
communication and improving trust in the media and academia should be prioritized in actions against
disinformation (Baptista, 2021).
Improving media literacy is of great importance in addressing this issue. The ability to think critically, evaluate,
use, and create information is a key skill in the 21st century that allows individuals to navigate the information
environment and make informed choices. It also promotes responsible participation in political processes and
democratic elections, free from interference and manipulation (Countering disinformation..., 2022). This is
exemplified by programs implemented in Europe and around the world that focus on building mental resilience
to manipulation, starting with awareness of these phenomena. Familiarity with the terminology associated with
techniques used to falsify reality allows for approaching information with distance and healthy skepticism
(Teperik et al., 2022). When we are aware that disinformation can be found in every aspect of social life, we are better
prepared for its occurrence and less susceptible to its harmful effects. However, if we are unaware that
disinformation can be related to stereotyping, or can affect business or be present in science, there is a high
probability that we may not respond to it properly when it arises. Given the rapid and accelerating pace of technological progress, we should anticipate even more manipulation in the years to come.
An attitude of responsibility is also crucial. We must recognize that disinformation is a concern for all members
of modern societies, and we should feel accountable for it. What we read and what we share can either support
a discourse based on truth or deviate from it. Therefore, it is important to break out of information bubbles and
our own comfort zones, which may give us a false sense of security but ultimately distort our perception of
reality. Such bubbles can contribute to the disappearance or significant reduction of awareness of the phenomenon of disinformation, making them fertile ground for its spread. Similarly, the fear of expressing our
own views and a conformist attitude that amplifies the voice of the majority can also support manipulated or
polarizing content.
Building resilience to disinformation is not only necessary to address this phenomenon, but also to ensure the
quality of the information that surrounds us. The examples described in this article illustrate the wide reach of
disinformation and the serious consequences it can cause. Neutralizing these consequences is often challenging,
which is why prevention, an approach that aims to mitigate this harmful phenomenon, becomes so crucial.
5. Discussion
Disinformation is a complex phenomenon that should be viewed across a broad spectrum. The fact that it can be
pervasive highlights the dynamic and heterogeneous nature of the issue we are grappling with. In order to
effectively counteract it, whether by building resilience or neutralizing its consequences, it is crucial to
understand its complexity. Disinformation evolves, much like the world around us, and therefore requires
constant vigilance and analysis of signals and phenomena. In-depth discussions are also important, aimed at
understanding the processes that contribute to the spread and strengthening of disinformation in the social
space. Viewing disinformation across the spectrum is a crucial step in building a high-quality information
ecosystem, and should be a top priority for all of us.
References
Abdelmoez, J.W. , Rosenberg, T., D’Urso, S., Winget, A. R. (2022), Deviants, Queers or Scissoring Sisters of Men? Translating
and Locating Queer and Trans Feminisms in the Contemporary Arabic-Speaking World, The Palgrave Handbook of
Queer and Trans Feminisms in Contemporary Performance, Cham: Springer International Publishing, DOI:
10.1007/978-3-030-69555-2_16.
Aldwairi, M., Alwahedi, A. (2018), Detecting Fake News in Social Media Networks, Procedia Computer Science, vol. 141.
Alves, L., Antunes, N., Agrici, O., Sousa, C.M.R., Ramos, C.M.Q. (2016), Clickbait: you won't believe what happens next!,
https://www.researchgate.net/publication/311930296_Click_Bait_You_Won%27t_Believe_What_Happens_Next
(accessed on 21.04.2023)
Badanie NASK, https://www.nask.pl/pl/aktualnosci/2249,Badania-NASK-ponad-polowa-polskich-internautow-styka-sie-z-
manipulacja-i-dezinfo.html (accessed on 21.04.2023)
Banaji, S., Bhat, R. (2019), WhatsApp Vigilantes: An Exploration of Citizen Reception and Circulation of WhatsApp
Misinformation Linked to Mob Violence in India, London School of Economics and Political Science (LSE)
Baptista, K. (2021), Building Resilience Against Disinformation, https://dai-global-digital.com/building-resilience-against-
disinformation.html (accessed on 25.04.2023)
Belanger, J. (2020) What Motivates COVID Rule Breakers? https://www.scientificamerican.com/article/what-motivates-
covid-rule-breakers/ (accessed on 21.04.2023)
Bąkowicz, K. (2023), Dezinformacja. Instrukcja obsługi, CeDeWu, Warszawa.
Bąkowicz, K. (2020), Fake news. Produkt medialny czasów postprawdy, Aspra, Warszawa.
Bucci, E.M. (2019), On zombie papers, https://www.nature.com/articles/s41419-019-1450-3 (accessed on 24.04.2023)
Champa, J. (2021), Misinformation in the Media and its Influence on Racism
https://stars.library.ucf.edu/cgi/viewcontent.cgi?article=1968&context=honorstheses (accessed on 21.04.2023)
Countering disinformation and building societal resilience (2022), https://www.eeas.europa.eu/eeas/countering-
disinformation-and-building-societal-resilience_en (accessed on 25.04.2023)
Demagog (2022), Dezinformacja wokół wojny w Ukrainie. Światowe narracje i trendy, https://demagog.org.pl/analizy_i_raporty/dezinformacja-wokol-wojny-w-ukrainie-swiatowe-narracje-i-trendy/ (accessed on 23.04.2023).
de Regt, A., Montecchi, M., Ferguson, S.L. (2019), A false image of health: how fake news and pseudo-facts spread in the
health and beauty industry
https://www.researchgate.net/publication/335447053_A_false_image_of_health_how_fake_news_and_pseudo-
facts_spread_in_the_health_and_beauty_industry (accessed on 21.04.2023)
Konkret 24, Fałszywe przekazy: jak powstają i jak je rozpoznawać, Raport Digital Poland,
https://digitalpoland.org/publikacje/pobierz?id=4f2e2116-82a6-47b5-a984-801b5e704b56 (accessed on 19.01.2023).
Elliott, D., Culver, C. (1992), Defining and Analyzing Journalistic Deception, in: Journal of Mass Media Ethics, vol. 7, Thomson Reuters.
Fruh, K., Archer, A., Wojtowicz, J. (2022), Sportwashing: Complicity and Corruption,
https://dukespace.lib.duke.edu/dspace/bitstream/handle/10161/25569/Sportswashing%20Complicity%20and%20Co
rruption.pdf?sequence=2&isAllowed=y (accessed on 22.04.2023).
Gans, H. (2004), Deciding what’s news: a study of CBS Evening News, NBC Nightly News, Newsweek and Time, Illinois.
Girard, R. (1987), Kozioł ofiarny, Łódź.
Global fraud and risk report 2019/20, https://www.kroll.com/en/insights/publications/global-fraud-and-risk-report-2019
(accessed on 21.04.2023)
Hoeller, M. (2021), The human component in social media and fake news: the performance of UK opinion leaders on Twitter
during the Brexit campaign, https://www.tandfonline.com/doi/full/10.1080/13825577.2021.1918842 (accessed on
22.04.2023)
Digital Poland (2022), Młodzi Polacy bezradni wobec dezinformacyjnego chaosu, https://digitalpoland.prowly.com/178556-mlodzi-polacy-bezradni-wobec-dezinformacyjnego-chaosu (accessed on 23.04.2023)
Korcz, M., Podus, S. (2022), Czym jest warwashing, https://www.ican.pl/b/czym-jest-warwashing/PTuQgQXvw (accessed on
22.04.2023)
May, A. (2019), Fake Physics, Spoofs, Hoaxes and Fictitious Science, Springer.
McNair, B. (2018), Fake News: Falsehood, Fabrication and Fantasy in Journalism, New York.
Media portrayals and black male outcomes, https://opportunityagenda.org/messaging_reports/media-representations-
black-men-boys/media-portrayals-black-men/ (accessed on 21.04.2023)
Michelucci, P. (2013), Handbook of human computation, New York.
Musiał, E. (2017), Trolling jako przykład zagrożeń informacyjnych w cyberprzestrzeni,
https://rep.up.krakow.pl/xmlui/handle/11716/2040 (accessed on 22.04.2023)
Nemr, C., Gangware, W. (2019), Weapons of mass distraction: Foreign State-Sponsored Disinformation in the Digital Age,
https://www.state.gov/wp-content/uploads/2019/05/Weapons-of-Mass-Distraction-Foreign-State-Sponsored-
Disinformation-in-the-Digital-Age.pdf (accessed on 22.04.2023)
Noelle-Neumann, E. (1974), The spiral of silence: A theory of public opinion, Journal of Communication, vol. 24, no. 3.
Ohlheiser, A. (2021), Facebook is bombarding cancer patients with ads for unproven treatments,
https://www.technologyreview.com/2022/06/27/1054784/facebook-meta-cancer-treatment-ads-misinformation/
(accessed on 21.04.2023)
Paz, E. (2022), Strategic disinformation: Russia, Ukraine, and crisis communication in the digital era,
https://www.researchgate.net/publication/362022036_Strategic_disinformation_Russia_Ukraine_and_crisis_commu
nication_in_the_digital_era (accessed on 23.04.2023).
Pluta, E., Pseudonauka przypadek znachora Jerzego Zięby, https://web.swps.pl/strefa-psyche/blog/18467-pseudonauka-
dlaczego-w-nia-wierzymy-przypadek-znachora-jerzego-zieby (accessed on 21.04.2023)
Reuters Institute Digital News Report 2022, https://reutersinstitute.politics.ox.ac.uk/digital-news-report/2022 (accessed on
24.04.2023)
Standard Eurobarometr 96 (2022), Opinia publiczna w Unii Europejskiej,
https://poland.representation.ec.europa.eu/news/eurobarometr-96-demokracja-zagrozona-przez-dezinformacje-
2022-04-08_pl (accessed on 23.04.2023)
Strand, C., Svensson, J., Blomeyer, R., Sanz, M. (2021), Disinformation campaigns about LGBTI+ people in the EU and foreign influence, https://www.europarl.europa.eu/thinktank/pl/document/EXPO_BRI(2021)653644 (accessed on
21.04.2023)
Szpunar, M. (2014), Internet - nowa sfera publiczna czy kamera pogłosowa?, in: M. Adamik-Szysiak (ed.), Media i polityka. Relacje i współzależności, Lublin.
Teperik, D., Denisa-Liepniece, S., Bankauskaite, D., Kullamaa, K. (2022), Resilience against disinformation,
https://icds.ee/wp-
content/uploads/dlm_uploads/2022/10/ICDS_Report_Resilience_Against_Disinformation_Teperik_et_al_October_20
22.pdf (accessed on 25.04.2023)
TerraChoice (2007), The Six Sins of Greenwashing: A Study of Environmental Claims in North American Consumer Markets,
https://sustainability.usask.ca/documents/Six_Sins_of_Greenwashing_nov2007.pdf (accessed on 21.04.2023)
The impact of disinformation campaigns about migrants and minority groups in the EU,
https://www.europarl.europa.eu/RegData/etudes/IDAN/2021/653641/EXPO_IDA(2021)653641_EN.pdf (accessed on
21.04.2023)
Trust in government and others during the COVID-19 pandemic, https://cls.ucl.ac.uk/wp-content/uploads/2020/10/Trust-
in-government-and-others-during-the-COVID-19-pandemic--initial-findings-from-COVID-19-survey.pdf (accessed on
21.04.2023)
Turek, D. (2022), Fake science - fałszywa nauka. Poważne zagrożenie czy kolejny pseudoproblem?,
https://www.researchgate.net/publication/358646980_Fake_science_-
falszywa_nauka_Powazne_zagrozenie_czy_kolejny_pseudoproblem (accessed on 24.04.2023)
Wardle, C. (2020), Understanding information disorder,
https://firstdraftnews.org/long-form-article/understanding-information-disorder/ (accessed on 20.04.2023)
Wardle, C. (2021), The science of misinformation, https://www.sciline.org/social-sciences/misinformation/ (accessed on
20.04.2023)
War through Symbols: The Conflicts of the Late Republic Viewed Through Coinage,
https://storymaps.arcgis.com/stories/03206d2d092f4aa6bbc5a8827bf6d6a9 (accessed on 21.04.2023).
Woodward, A. (2020), Fake news: A guide to Trump’s favourite phrase – and the dangers it obscures,
https://www.independent.co.uk/news/world/americas/us-election/trump-fake-news-counter-history-b732873.html
(accessed on 22.04.2023).
Żukowska, J., Mikołajewska, A., Staniszewska, K. (2021), Problematyka dezinformacji naukowej, Edukacja Ekonomistów i
Menedżerów, nr 61 (3), lipiec-wrzesień.