International Journal of Communication 15(2021), 1200–1219 1932–8036/20210005
Copyright © 2021 (Dani Madrid-Morales, Herman Wasserman, Gregory Gondwe, Khulekani Ndlovu, Etse
Sikanku, Melissa Tully, Emeka Umejei, and Chikezie Uzuegbunam). Licensed under the Creative Commons
Attribution Non-commercial No Derivatives (by-nc-nd). Available at http://ijoc.org.
Motivations for Sharing Misinformation:
A Comparative Study in Six Sub-Saharan African Countries
DANI MADRID-MORALES
University of Houston, USA
HERMAN WASSERMAN
University of Cape Town, South Africa
GREGORY GONDWE
University of Colorado Boulder, USA
KHULEKANI NDLOVU
University of Cape Town, South Africa
ETSE SIKANKU
Ghana Institute of Journalism, Ghana
MELISSA TULLY
University of Iowa, USA
EMEKA UMEJEI
American University Nigeria, Nigeria
CHIKEZIE UZUEGBUNAM
University of Cape Town, South Africa
In most African countries, “fake news,” politically motivated disinformation, and
misinformation in the media were common occurrences before these became a
preoccupation in the Global North. However, with a fast-growing population of mobile
users, and the popularization of apps such as WhatsApp, misinformation has become much
more pervasive across the continent. Researchers have shown that perceived exposure to
false information is high in some African countries, and yet citizens often share made-up
news intentionally. This article explores the motivations and contributing factors for
sharing misinformation in six sub-Saharan African countries. Our analysis of 12 focus
groups with university students reveals two common motivations: civic duty and fun. The
sharing of political (dis)information was uneven, but common among students with high
levels of self-reported political engagement. We also present an array of cues used to
determine credibility, which often determines the shareability of information.
Cross-national differences are also discussed.

Dani Madrid-Morales: dmmorale@Central.uh.edu
Herman Wasserman: herman.wasserman@uct.ac.za
Gregory Gondwe: gregory.gondwe@colorado.edu
Khulekani Ndlovu: ndlkhu005@myuct.ac.za
Etse Sikanku: etse.sikanku@gmail.com
Melissa Tully: melissa-tully@uiowa.edu
Emeka Umejei: emeka.umejei@aun.edu.ng
Chikezie Uzuegbunam: chikezieuzuegbunam@gmail.com
Date submitted: 2020-03-06
Keywords: disinformation, “fake news,” social media, information sharing, sub-Saharan
Africa, focus groups
The notion of “fake news” and the related concepts of “misinformation” and “disinformation”¹ rapidly
became areas of scholarly inquiry after the 2016 U.S. presidential election, covering issues ranging from
election manipulation through the media to the implications for mainstream news practices (Tandoc, 2019).
Though the case of the United States is most often discussed, multiple countries around the world have been
grappling with different expressions of “fake news” for some time. Several of these are in the Global South:
Chile (Valenzuela, Halpern, Katz, & Miranda, 2019), Nigeria (Okoro & Emmanuel, 2018), India (Udupa &
McDowell, 2017), and South Africa (Roper, 2019). However, scholarship in the area still reflects a lack of
geographical diversity, and consequently, responses by scholars and policy makers continue to display a limited
purview of the phenomenon. As in the case of journalism studies more generally, research on social media,
including its use for the spread of misinformation, has “often failed to include adequate diversity on matters of
geography, culture, and language as well as race, class, and gender” (Lewis & Molyneux, 2018, p. 19).
Though “fake news” is sometimes presented as a novel scholarly topic in recent literature, false
news as a phenomenon in Africa and elsewhere predates the era of social media (Mäkinen & Kuira, 2008).
Journalists in these regions have always had to learn to treat journalism as a contested area, vulnerable to
manipulation by governments and powerful social elites (Mutsvairo & Bebawi, 2019). Recent discourses on
“fake news” have, however, given authorities a new opportunity to restrict freedom of expression on social
media, particularly on Facebook and WhatsApp (Dwyer & Molony, 2019). These platforms have become
prime sources of viral content, some of which comes from reputable sources, whereas some could easily be
characterized as either mis- or disinformation. The pervasiveness of this type of content is such that
perceived exposure to made-up political news stories in countries such as Kenya, Nigeria, and South Africa
has been found to be higher than in the United States (Wasserman & Madrid-Morales, 2019), the country
that has attracted the most research to date.
¹ For the remainder of the article, we use the terms “misinformation” and “fake news” interchangeably to refer
to all the expressions and formats in which made-up and inaccurate information has been found to be common.

The growth of misinformation around the world has been described by the United Nations Educational,
Scientific and Cultural Organization (UNESCO) as an “emerging global problem” (UNESCO, 2018, p. 7). The
phenomenon has been exacerbated by new ways of accessing news—namely, websites and social media
platforms. This is true of contexts as diverse as the United States (Bigman, Smith, Williamson, Planey, & Smith,
2019), Britain (Chadwick & Vaccari, 2019), Singapore (Tandoc, Lim, & Ling, 2020), Kenya (Wahutu, 2019),
and South Africa (Roper, 2019). This changing pattern of news consumption is often interlinked with a sharp
decline in trust in mainstream news sources. A report by the London School of Economics and Political Science’s
Truth, Trust and Technology (T3) Commission described the “erosion of trust” as one of the “five evils” arising
from the spread of disinformation, which poses a threat to the ability of individuals to make informed decisions
(London School of Economics and Political Science, 2020, p. 11). This means that not only is determining the
veracity of information online of increasing importance, but also knowing the factors that shape the sharing of
such information is a crucial first step toward improving the quality of online discourse. Although geographically
limited, previous research has identified a handful of factors that contribute to misinformation being shared
online (e.g., Bigman et al., 2019; Duffy, Tandoc, & Ling, 2019; Sterrett et al., 2019). This article assesses how
applicable these are to the sub-Saharan African context by analyzing data from 12 focus groups with university
students in six countries (Ghana, Kenya, Nigeria, South Africa, Zambia, and Zimbabwe).
Satire, Rumors, and Misinformation in Sub-Saharan Africa
Although current manifestations of “fake news” in Africa call for comparative analyses with similar
practices elsewhere in the world (Wasserman, 2020), the phenomenon also has a much longer history that
is tied to rumors and satire. Satire has in fact been an alternative channel of information when trust in the
mainstream media—often either owned by the state or oriented toward social elites—is low (Moehler &
Singh, 2011). In Africa, satire can be used to express social reality in contexts where other forms of
journalism might be suppressed (Mano, 2007). Examples of such satirical news content include the Kenyan
comedic program Redykulass and puppet show XYZ (Ogola, 2010). Drawing on the Zimbabwean experience,
Willems (2011) argues that satirical forms such as cartoons and comic strips not only challenge those in
power and reframe officially sanctioned constructions of reality but also self-reflexively help readers to laugh
at their circumstances and powerlessness, and in so doing, help them cope with their everyday experiences.
Social media such as Facebook and Twitter have become especially productive sites for the circulation of
activist and protest messages, as well as satirical memes, jokes, videos, and similar artifacts used to
challenge political power in African contexts (Mare, 2020; Tully & Ekdale, 2014; Uzuegbunam, 2020).
Therefore, an analysis of misinformation in Africa, as elsewhere, stands to benefit from a historical
and cultural perspective on the way media technologies operate within African societies. Nyamnjoh (2005)
has argued for an approach that is attentive to the continuities between older, indigenous forms of
communication, and newer technologies, as Africans are “daily modernising the indigenous and indigenising
the modern with novel outcomes” (p. 4). Oral traditions, which include satire, gossip, and jokes, underpin
the creative adaptations of technology by African media users. Nyamnjoh (2005) sees such creativity as
“not only informed by cultures amenable to conviviality, interdependence and negotiation, but also by
histories of deprivation, debasement and cosmopolitanism” (p. 4). African values of “solidarity,
interconnectedness and interdependence” (Nyamnjoh, 2005, p. 16) have shaped the way African people
interact with digital media since its inception, and it should be assumed that these values and practices will
also inform the way people engage with misinformation on such platforms.
Media technologies, especially mobile phones, have long been instrumental in the spread of rumor
and misinformation. One example is the 2007–8 elections in Kenya, where rumors about election rigging,
ethnic hate speech, and calls for violence spread via SMS technology (Goldstein & Rotich, 2010; Mäkinen &
Kuira, 2008). Rumors and falsehoods being spread through these circuits of informal exchanges of
information may result from the lack of trust in the mainstream news media but could also be read as a
result of political disillusionment. This would explain, argues Shoki (2020), conspiracy theories such as the
one circulating in South Africa about Nelson Mandela having died in 1985, not in 2013, and suggesting that
an imposter negotiated the terms of the democratic transition, which proved to be unfavorable to the Black
majority. Such rumors may arise as a result of feelings of disempowerment and serve as an alternative
narrative to better explain the causes of current circumstances.
Long before the emergence of extreme speech on social media platforms and WhatsApp, older
information and communication technologies (ICTs), such as mobile phones, websites, and blogs, were seen
as sites of contestation between different political and social forces in African countries (Goldstein & Rotich,
2010). More recently, mobile phones—this time especially used to interact with social media—have proved
to be an important factor in African elections, as seen in Sierra Leone (Dwyer & Molony, 2019) and Nigeria
(Orji, 2019). WhatsApp, especially, facilitated the spread of misinformation, and it became difficult to trust
political information received on this platform during election times (Dwyer & Molony, 2019, p. 119). Beyond
electoral periods, the prevalence of misinformation on social media platforms has also had a limiting impact
on the exercise of journalism, and media organizations have had to play the additional role of educating
audiences about the dangers of misinformation (Electoral Commission of South Africa, 2019).
Information Sharing on Social Media
The sharing of news on social media platforms, particularly when it helps spread misinformation,
has raised concerns that these practices could negatively reshape online culture and limit the ability of online
media to contribute to the democratic process (Chadwick & Vaccari, 2019, p. 7). This is in line with findings
by Wagner and Boczkowski (2019), who argue that the consumption of “fake news” is linked to a general
distrust and cynicism about the credibility of the whole news ecosystem. More broadly, the perception that
information cannot be trusted can lead to the development of what researchers at the London School of
Economics and Political Science (2020) label the “five evils” of misinformation (p. 11). The first is confusion:
Citizens are unsure whom and what to believe among an abundance of sources and information. The second
is cynicism: Misinformation has further eroded public confidence in mainstream news sources and has
contributed to the “fomenting of social antagonism.” The third is fragmentation: Citizens are being divided
into “truth publics” with parallel realities and narratives online. The fourth evil is irresponsibility: An increase
in information generated outside of news organizations with ethical codes has caused a lack of transparency
and accountability. The fifth is apathy: Declining trust in political information may make citizens less likely
to participate in political processes.
In an established democracy such as the United Kingdom, in 2019, more than half of social media
users (57.7%) reported that they had recently come across news on these platforms of which the veracity
was in doubt. What is more, a high percentage (42.8%) admitted to having shared false or inaccurate news,
of which 17.3% said they thought the news was false at the time of sharing it (Chadwick & Vaccari, 2019).
Wasserman and Madrid-Morales (2019) found similar results in a study of three African countries. To the
question, “Have you ever shared a political news story online that you thought at the time was made up?”,
29% of Kenyans, 18% of Nigerians, and 25% of South Africans answered “yes.” Based on the distinction
between knowing at the time of sharing that news is false and only finding out later that it was untrue,
Chadwick and Vaccari (2019, p. 14) distinguish between the concepts of “misinformation” (“unintentional
behavior that inadvertently misleads”) and “disinformation” (“intentional behavior that purposely
misleads”). However, as the authors point out, very little is currently known about the reasons and
motivations prompting people to share news online.
One factor to consider is the social identity of the user. In a study with university students in the
United States, Bigman and colleagues (2019) found that race is a strong predictor. Black students reported
“both seeing and posting more content about race on social media” (p. 14). They see their study as providing
evidence that “selective sharing is likely to result in racially differentiated retransmission of news about
disparate racial impact” (p. 14). Chadwick and Vaccari (2019) found that users who willingly and/or knowingly
shared false information on social media platforms were “likely to be male, younger, and more interested in
politics” (p. 5). Not only social position but also political orientation was found to play a role in the likelihood
of British social media users sharing false information. Supporters of the Conservative Party and those with
right-wing leanings were found to be more likely to share inaccurate or false news. This corresponds with the
findings by Guess, Nagler, and Tucker (2019) that during the 2016 U.S. presidential election, conservatives
and those who are extremely conservative were more likely to share “fake news” on Facebook. Unlike in
Chadwick and Vaccari’s study, gender was not found to be a strong predictor in the United States. However,
age was, as those over 65 were the most likely to share links to false news (Guess et al., 2019). More recent
findings from the United States (Jamieson & Albarracín, 2020) seem to confirm that there is an association
between the consumption of conservative media and belief in conspiracy theories.
When asked to reflect on the reasons why they share news on social media, the top three reasons
provided by British respondents (Chadwick & Vaccari, 2019) were the following: “To express my feelings”
(65.5%); “To inform others” (also 65.5%); and “To find out other people’s opinions” (51.1%). These reasons
display an orientation toward civic participation or purpose. Duffy and associates (2019) explored the social
utility of sharing “fake news” in Singapore and draw comparisons between the sociality of “fake news” and
rumor; both are used to “cope with uncertainty, build relationships, and for self-enhancement” (p. 3). The
main types of news stories that are shared, the authors argue, are those that have a high informational
utility (“news you can use”), which resonate with their own lives and have a high emotional impact (p. 5).
They encourage an understanding of sharing practices that looks beyond the political implications of sharing
“fake news” to the interpersonal and social uses for sharer and recipient. Sharing news is seen as
contributing to social cohesion. Users are motivated by the emotional impact the news is seen to have, the
relevance it might have for the receiver, and the sender’s intention to “provide advice or warning” (Duffy et
al., 2019, p. 10). Sharing “fake news,” the authors argue, can therefore be seen as a sign of trust between
sender and recipient. This suggests that “what is shared—and reciprocated—is more than just news or
information; it is also a marker of trust, fellow-feeling and mutuality” (Duffy et al., 2019, p. 10).
Rumor can, however, also be detrimental to the social fabric in advanced democracies, as Petersen,
Osmundsen, and Arceneaux (2018) show. The authors found that in Denmark and the United States, when
hostile rumors are shared, the aim is to “coordinate the attention and action of the audience with the goal
of mobilizing against the target group and signal their willingness to engage in conflict escalation (i.e.,
helping push the collective over the tipping point for collective action)” (p. 4). The motivations behind
sharing this kind of false rumor online can be partisan in nature—to mobilize against a political opponent—
or to rail against the whole political system (p. 6). Using experimental and observational data, the authors
conclude that, at least in the two countries they studied, the overriding psychological motivation
underpinning the sharing of false news is the latter: a “need for chaos.”
The applicability to the African context of the motivations outlined above has not been thoroughly
explored. The only study to have investigated audiences’ engagement with “fake news” in an African context
found three reasons that help explain the sharing of “fake news” in Kenya and Nigeria (Chakrabarti, Rooney,
& Kweon, 2018). First, there is the desire to be “in the know” socially, so sharing “fake news” becomes a
form of social currency. This may not be unique to Kenya and Nigeria; the long-standing use of humor in
African societies, which has been noted to play a politically progressive role on the continent, may amplify
the social capital obtained through sharing satirical information. Second, there is a sense of civic duty that
might lead social media users to share warnings of impending disasters or crises. Even if the information
turns out to be false, the potential harm that could result from not informing others may be seen as
outweighing the dangers of spreading false information. And, third, there is the sense that information is
democratic and needs to be passed on. Users may take the popularity or virality of a shared piece of
information as an indication of its veracity (Chakrabarti et al., 2018, p. 44). This motivation may be
especially relevant in African countries where the state exercises a great deal of control or ownership over
the media, which may lead to a decline in trust in mainstream media (Wasserman & Madrid-Morales, 2019).
In this article, as we seek to explore the motivations for African audiences’ consumption and sharing
of false information online, these social, cultural, political, and economic factors should be borne in mind. A
technologically determinist approach that foregrounds the platform on which misinformation is shared
should be avoided in favor of an audience-centered, contextually informed understanding of the motivations
for sharing misinformation. Against this background, we address the following research questions:
RQ1: How often and where do university students encounter misinformation online?
RQ2: How do students decide what information to share on social media, and to what extent do type of
content and source affect shareability across countries?
Method
This article uses data collected between August 2019 and January 2020 in 12 focus groups with
university students in six sub-Saharan African countries. Discussions lasted between 50 and 90 minutes and
were conducted in English. In contrast to surveys, in-depth interviews, or experiments, all of which have
been used in previous studies examining misinformation sharing practices (Chadwick & Vaccari, 2019;
Valenzuela et al., 2019), focus groups offer “richer, more complex and more nuanced information”
(Kamberelis & Dimitriadis, 2013, p. 40). The method is deemed appropriate as it could make an important
contribution to an exploratory study such as this one, given the lack of previous research on sharing practices
among young people in the six countries we studied.
Sampling
Two focus groups (FGs), one with undergraduate and one with graduate students, were convened
in each country. Ninety-four participants joined the 12 discussions, in groups ranging from five
(postgraduate FG in Nigeria and Kenya) to 15 (undergraduate FG in Zimbabwe) participants. Our sample
skews toward undergraduates (64%), men (52%), and students in the social sciences (69%). To recruit
participants, we used a combination of convenience and snowball sampling. Focus group moderators
requested assistance from undergraduate students in their classes and seminars to enlist other students in
different departments. We tried to assemble a representative and balanced sample in terms of disciplines,
so not all students who volunteered were invited to attend. A consequence of the recruitment process was
that some participants knew each other, but in most cases, they were strangers. Aside from education level,
no other traits were used to enforce sampling quotas. During the recruitment, students were told they would
attend a group discussion about news consumption in general to minimize the risk of priming. Participation
was voluntary. In Kenya, students received 500 Kenyan shillings (approximately US$5) at the end of the
focus group discussion. In Ghana, students were offered food and drinks (equaling approximately 30
Ghanaian cedis, or US$5). Others participated without remuneration. This difference in incentives resulted
from the different funding sources available in each country. We observed no differences in participation
quality across groups attributable to remuneration, as indicated by the length and depth of conversation in
the transcripts. The study procedures received approval from relevant review boards for the
protection of human subjects in research.
In selecting countries, we tried to reflect various political and media systems. South Africa has a high
level of media freedom; self-regulation; an open, participatory media culture; and an established digital media
sphere. Kenya is an East African country with a vibrant independent press as well as a strong presence of
international media, and it has a vocal, active community of social media users. In West Africa, we selected
Nigeria, Africa’s most populated country, which has a strong private media sector as well as a dynamic online
community, and Ghana, which is not only one of the continent’s most stable democracies but also has a free
and diverse media system. Zambia, located in Southern Africa, has a functioning democracy, but has seen
several regressive episodes in terms of media freedoms in recent years. Finally, Zimbabwe has a repressive
media environment and high levels of state ownership and interference in the media. At the same time, there
are also several examples of how Zimbabwean citizens have used alternative channels, including social media
platforms such as Facebook, to undermine authoritarian control of the media. These differences are reflected
in commonly used indexes that measure democracy and press freedom in the world, such as The Economist
Intelligence Unit’s Democracy Index (2020) and Reporters Without Borders’ (2020) Press Freedom Index. We
summarize the categories and scores for the six selected countries in Table 1.
Table 1. Location, Political, and Press Freedom Indexes of Selected Countries (2019).

Country         The Economist Democracy Index*   RSF Press Freedom Index^         Region
Ghana           Flawed democracy (6.63)          Satisfactory situation (20.81)   West Africa
Kenya           Hybrid regime (5.18)             Noticeable problems (32.44)      East Africa
Nigeria         Hybrid regime (4.12)             Difficult situation (36.50)      West Africa
South Africa    Flawed democracy (7.24)          Satisfactory situation (22.19)   Southern Africa
Zambia          Hybrid regime (5.09)             Difficult situation (36.38)      Southern Africa
Zimbabwe        Authoritarian (3.16)             Difficult situation (42.23)      Southern Africa

Source: *The Economist Intelligence Unit (2020); ^Reporters Without Borders (2020).
Research Design
A common interview guide for all countries was designed by two of the authors, partly based on a
study by Duffy et al. (2019). The suitability of the questionnaire, stimuli, and structure was pretested with
a group of graduate university students in South Africa. The guide was structured around four sections,
each containing key questions asked in all focus groups, and additional questions to be asked at the
moderators’ discretion. In the first block, we asked participants about their media consumption practices (e.g., “How
many of you have a Twitter account? Tell me what you use it for.”). Next, and after having seen the first
stimulus (see Figure 1), participants were asked about their news-sharing practices, and motivations for
sharing (e.g., “Would you consider sharing these posts? Why or why not?”).
Figure 1. Stimulus #1 presented to all focus-group participants.
In the third block, and after having been presented with a country-specific stimulus in the form of a (real)
made-up news story (see Figure 2), we asked them about practices of sharing political information (e.g.,
“Can you recall sharing a story that you later found out was not fully accurate?”). Next, groups were asked
about their attitudes toward misinformation (e.g., “How much of a problem do you think misinformation and
fake news are where you live?”), and how they personally cope with it (e.g., “What do you usually do when
somebody shares news that you know is made up?”).
Figure 2. Country-specific stimuli #2 presented to participants.
In the first stimulus, which was shown to all focus groups, participants were presented with one image
depicting a Facebook post and one showing a tweet. Each social media post reproduced a “real” example of
misinformation: one was about plastic rice from China, and the other was about harmful mobile phone
radiation. These cases were selected from a database kept by Africa Check. The second stimulus consisted of
a screenshot of a “real” false news story about politics published online. Though a different item was chosen
for each country, all of them came from equivalent news sources, and included multiple cues that could help
audiences flag them as fabricated news—for example, outlandish claim(s), thin sourcing, no clear authorship,
poor editing, manipulated images, unclear timing of events. To guarantee uniformity across countries, the lead
author preselected two or three stories based on three criteria: (a) the story referenced a divisive local political
actor; (b) it was published on a local news website; and (c) it included several of the cues identified above.
Preselected stories were shared with facilitators for feedback. When multiple stories were considered a good
fit, the one that appeared to have been circulated most widely was retained. For example, the story shown to
Kenyan participants, titled “Raila is harvesting, now feted as ‘Person of the Year in Africa,’” was originally
published on a now defunct blog called “Daily Active Kenya,” had no clear byline, appeared next to a list of
several dubious headlines, and was poorly edited. As part of the debriefing process at the end of each
discussion, facilitators briefed participants about the inaccuracy and origin of the stimuli.
Data Analysis
All discussions were audio recorded and transcribed verbatim. The analysis of the data was conducted
using NVivo, a software package used for computer-assisted qualitative text analysis. One of the authors first
coded each transcript using a list of themes compiled from the questionnaire and from discussion summaries
prepared by each facilitator. During this process, as new themes emerged, they were incorporated into the
list. All transcripts were coded a second time to search for instances of the new themes. After this, the input
of another author was sought to confirm the validity of the coding, and to identify areas of discrepancy. Once
these were resolved, the list of themes (eight) and codes (68) was organized around the proposed research
questions. A copy of the list of themes and codes is available from the authors on request. During the analysis
of the data, and whenever possible, comparisons were drawn among countries, with special attention being
paid to differences that could be linked to the country selection criteria presented earlier.
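For readers who prefer a computational illustration of the cross-country comparison step, the following is a minimal Python sketch. It assumes a hypothetical flat export of coded segments (with columns country, theme, and code); it is not the authors’ actual NVivo workflow, only an illustration of tallying coded themes per country.

```python
# Minimal sketch: tally coded segments by country and theme from a
# hypothetical CSV export of the qualitative coding (assumed columns:
# country, theme, code). Not the authors' NVivo workflow.
import csv
from collections import Counter, defaultdict

def theme_counts_by_country(path):
    """Return {country: Counter({theme: n_segments})} from a coded-segment CSV."""
    counts = defaultdict(Counter)
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            counts[row["country"]][row["theme"]] += 1
    return counts

if __name__ == "__main__":
    # Print the three most frequent themes per country, the kind of summary
    # that could support the cross-country comparisons described above.
    for country, themes in sorted(theme_counts_by_country("coded_segments.csv").items()):
        top = ", ".join(f"{theme} ({n})" for theme, n in themes.most_common(3))
        print(f"{country}: {top}")
```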
Findings
In the next two sections, we report the results of the data analysis. The first section addresses RQ1
(frequency of perceived exposure to misinformation online, and sources that students associate with
inaccurate information), and the second section provides an answer to RQ2 (motivations for sharing [false]
news on social media, and determinants of shareability).
Perceived Prevalence of Misinformation Online
As previous research has suggested, for university students in all six countries, misinformation
appears to be a common occurrence, particularly on certain social media platforms, WhatsApp and Facebook
being mentioned most often. In other words, the answer to the first part of RQ1 is that students, regardless
of country and demographic, believe they are exposed to misinformation very frequently. References to the
ubiquity of misinformation emerged spontaneously in most of the discussions (except for Ghana), before
facilitators introduced the stimuli, or asked questions on the issue. For example, describing their daily news
consumption, a South African undergraduate student said, “I consume a lot of social media. So, I used to
get a lot of trash and a lot of clickbait stuff. I really had to tailor my social media to include lots of trustworthy
kind of sources.” Similarly, a graduate student in Zimbabwe noted, “I wanted to say WhatsApp is mainly
used to [get news] nowadays, but sometimes it transmits fake news. So, you have to verify because you’ll
spread something that is not true.” Relatedly, an undergraduate in Zambia claimed that his preferred news
sources are Mwebantu Media and Prime TV (two private media companies), because they “are the closest
to telling the truth; and, as you know, nowadays, it is hard to tell what media is telling the truth.” Macro-
level country characteristics (i.e., media freedom, political system) did not appear to affect perceived
exposure to misinformation among students.
Our data suggest that the answer to the second part of RQ1 is that students feel they are exposed
to misinformation the most on social media. Although a few participants said that legacy media (radio, TV,
and for a tiny minority, newspapers) remain their prime sources of news, the vast majority referred to the
digital platforms WhatsApp, Facebook, Twitter, and Google as their preferred sources. Of these, Twitter
seemed much less popular in Zambia and Zimbabwe, and among graduate students in general, the usage
of other platforms for information seeking (e.g., Instagram, Reddit, Snapchat, YouTube) was much less
widespread. Students reported the highest levels of misinformation on WhatsApp, a messaging platform
that allows group conversations and makes it relatively easy to forward information from other sources to
family, friends, and colleagues. To some, social media is the preferred choice to keep informed because
news stories are “curated” by friends and family members, which, some say, adds an extra layer of trust.
In other cases, the reasons are more pragmatic. Many mobile phone data bundles come with unlimited
access to WhatsApp. Also, some students do not have regular electricity at home, so battery-powered mobile
phones are the only way news can be accessed.
Discussions about inaccurate information on WhatsApp in Kenya, South Africa, and Nigeria
quickly led to claims that “older people” (e.g., parents and grandparents) share a lot of unverified
information because they lack an understanding of how social media works and tend to trust content
without verifying it.
For the older generation, social media has taken them by storm. They are excited about
it. I don’t think they realize what is fake and what is not. . . . My dad gets more likes on
his Facebook posts than I would ever get. But if he shares it, one person shares it to three
more, then it becomes such a big thing. (graduate student, Kenya)
You have a lot of people who are used to going to newspapers. When they’re told the
news, they automatically take it as fact, because that’s the way the world worked before.
In this age, where information is decentralized and anybody can say something is true,
they still have those old habits of those days. Older generations, who are new to social
media will start accepting fake news. (undergraduate student, Kenya)
In contrast to “older” generations, students feel they are much less vulnerable to misinformation. While
generational differences were not mentioned in the other three countries, in Zambia, some undergraduates
studying in the capital, Lusaka, did allude to geographical differences:
Not everyone can tell whether the story is true or not. In my village in Zambezi [in West
Zambia], people easily believe such stories without questioning beyond. People like us, who
can tell if the story is true or not, should not be in the forefront of making such stories viral.
While discussions in some Global North contexts on news, veracity, and trust have been linked to
belligerent antimedia narratives, we did not find evidence of such narratives in our sample. Instead,
participants often referred to one media house or another as their preferred source for trusted news. In
most cases, these go-to sources are the websites/mobile apps of privately owned domestic media (e.g., The
Nation in Kenya, Mail & Guardian in South Africa, and Channels TV in Nigeria). Government-controlled TV
stations were mentioned by no more than one or two students in South Africa (SABC), Ghana (GBC), and
Zambia (ZNBC). While some students did say that they follow blogs and independent commentators, these
sources did not seem to be widely used. International media, such as CNN, The New York Times, or the BBC,
which might be the target of attacks by politicians and citizens elsewhere, were described as sources whose
information can be trusted. In the words of a Zimbabwean undergraduate student, “online, there are some
sources that normally lie, but there are some authentic sources like the BBC.”
The overwhelming majority of students regard issues related to misinformation online as a major
source of concern, particularly as it relates to the political process. A student in South Africa brought up the
example of the 2019 elections in South Africa and associated the rise of Freedom Front Plus, a right-wing
White nationalist party, with misinformation online. In Nigeria, a postgraduate student recalled that
President Buhari once said “fake news is as worse as genocide,” a claim we could not verify, to echo what
other participants appeared to be most concerned about: the dangerous connection between politically
motivated misinformation and the fueling of ethnic tensions in Nigeria. This seemed to be a strong reason
most Nigerian students—all undergraduates and postgraduates but one—supported the enactment of
legislation to limit “fake news,” even if it may result in the restriction of personal freedoms. In other
countries, the division around support/opposition to tougher legislation appeared along generational lines
(undergraduates were against regulation, and postgraduates in favor of regulation). The only country in
which misinformation was not unanimously seen as a “big problem” was Zambia, where several participants
highlighted what they saw as the “positive” side of “fake news”:
“Fake news” or misinformation are neither good nor bad. They serve two purposes. The
first one is negative: misleading the community and causing divisions. However, “fake
news” also plays a positive role. First, it makes boring news interesting. It is entertaining
and makes everyone, regardless of their education level, engage in conversations. Second,
“fake news” triggers the truth. By presenting a different point of view, it forces people
to question more, thereby making the media clarify their subliminal messages. (graduate
student, Zambia)
In summary, across all six countries, we found that students believe misinformation is widespread,
particularly on social media, but not on mainstream media. Except for some Zambian students, the majority
saw the high prevalence of disinformation as a significant problem politically (e.g., it might lead to violence
and conflict) and socially (e.g., some citizens, such as the elderly and those living in rural areas, might be
prone to believing hoaxes).
Determining the Shareability of Information Online
With RQ2, we sought to explain under which circumstances students become willing contributors
to the dissemination of inaccurate information. There are three takeaways from our analysis: (1) various
cues are used to determine the veracity of information; (2) motivations to share a news item, even if known
to be inaccurate, are dependent on the topic; and (3) the political use of humor appears to be central when
examining how young people interact with politically motivated (mis)information. Next, we present evidence
to support each of these claims.
(1) In all countries, except for Zambia and Zimbabwe, students often used cues to determine what
content to share, even if this might sometimes lead to them sharing inaccurate information. This became
apparent when we presented them with a tweet from “UserABC” (not real), who had a blue check next to
their name, a sign that the account has been verified. For some, the blue check meant the story, a hoax
about plastic rice produced in China, was potentially true. Using this as a definite cue, some said they would
share it immediately; others said they would further investigate whether it was legitimate. A large majority,
however, remained adamant that it was clearly a fake story and would not merit their attention. Other cues
that were mentioned included the number of followers, the lack of likes, comments, retweets and other
metrics, the writing style, and the use of excessive punctuation. Cues used to justify the decision not to
share stimulus #2 included that the source did not seem legitimate, the layout of the website was “off,” the
editing was poor, and they had no recollection of the same story being published in mainstream media. The
skillful recognition of these cues would seem to indicate that some university students are quite media
literate. However, the existence of cues indicating that a story might not be true did not always act as a
deterrent for sharing a story, as we discuss next.
(2) A comparison of reactions to the first and second stimulus reveals that the topic of a story
affects its shareability. We found that stories about health and food (as in the case of stimulus #1), as well
as posts/tweets about scams, safety, and terrorism were evaluated differently than news about politics
(stimulus #2). Those who said they would share the former mostly invoked one reason to do so: to create
awareness. This motivation appears to resonate with findings reported in the literature on sharing practices
in other countries (Chadwick & Vaccari, 2019; Duffy et al., 2019). In the typology proposed by Chakrabarti
and colleagues (2018), this would fall between an act of civic duty and a sense that information is democratic
and should be passed on. For many students, the notion of civic duty applied, regardless of the veracity of
the story, as the following exchange among undergraduate students in Zambia shows:
Student A: This is a matter of life, and it should be taken seriously. Who knows what the
Chinese want? Maybe they want to kill all of us, and take our country. Just like everyone
else, I would also share the news.
Student B: I would wait before I choose to believe this story. Three bowls making one
plastic bag? I doubt it. But it’s on Twitter, so it could be true. But can we find such a story
on BBC or Al-Jazeera?
Student C: Why would you want to wait until you verify? How long will that take you? You
will end up killing people by not sharing. I think there’s no harm in sharing. If it’s not true,
it will harm no one. If it’s true, then it will save some lives. How many fake news stories
do people, even us here, share on Facebook? Sometimes we even know about them being
a lie, yet we go on to share.
The same motivation, a sense of civic duty combined with a “just in case” attitude, applies to other stories
students said they would share, such as terror-related incidents for Kenyan participants, or, for Nigerians,
news about anti-African xenophobic attacks in South Africa.
There were not many students who thought they would share the second stimulus, and the
motivations outlined above do not seem to apply in the case of a political story. First, most of the participants
expressed their lack of interest in politics, which seemed to make them less likely to react to the stimulus
(e.g., a Ghanaian graduate student explained, “I will first of all read it, but then I wouldn’t share because
I’m not interested in politics so that’s why I wouldn’t even want to share”; and a Kenyan graduate student
stated, “I wouldn’t even care. For me, anything with politics, hands off”). However, in each country there
were students who described themselves as politically aware and engaged. These students said they would
share the news story because it aligned with their political views, or because it would spark some debate.
The intensity and length of the discussion around the shareability of the second stimulus differed across
countries. Kenyan and South African students in our sample appeared politically apathetic and did not find
the story we showed worth sharing. Students in our Ghanaian, Nigerian, and Zambian focus groups appeared
to be much more politically engaged, and that seemed to make them more likely to share the story. Overall,
the Zimbabwean focus group discussion turned out to be the most nuanced. Three positions on the issue
could be seen: (a) those who would not share the story; (b) those who, being politically engaged, would
post it (e.g., an undergraduate student from Zimbabwe said, “I joined a political WhatsApp group. I would
send to that because the people in that group would be interested in that”); and (c) those who did not see
sharing the story as a problem in terms of spreading misinformation per se, but as a form of losing social
capital. These quotes summarize the debate well: “I would get negative comments because people would
trust me as a source of that news” (undergraduate student, Zimbabwe); and, “I don’t share such political
stories because of some intimidation that you [see] on social media. Some would say your WhatsApp is
being followed. Sometimes you feel afraid to send such stories” (undergraduate student, Zimbabwe).
Finally, (3) we found humor, and the use of parody, to be a factor influencing the sharing of political
(mis)information. These are some of the initial responses to the second stimulus:
You may say it’s news from NewsDay [a reputable Zimbabwean newspaper], but overall,
I’ll treat it as a joke. When I’m forwarding it, I’m not forwarding news but a joke. As long
as it’s a joke, I don’t need to verify. (graduate student, Zimbabwe)
I will laugh. I feel like writing several laughing emojis before I say anything serious,
because the picture alone is already funny. This can be a meme or something before I
now start to criticize what the president is doing. (undergraduate student, Nigeria)
I’ll share this on WhatsApp, on a political party page. Then I’d try and take a few jabs at
Mahama [former Ghanaian President], and then make some funny comments about his
choosing of running mate. (graduate student, Ghana)
Humor, gossip, and satire seem to provide a refuge for media users overwhelmed with serious or depressing
news. And yet, with many saying they would post fake and fabricated stories about politicians to poke fun
at those in power, we found no references to the idea that sharing misinformation is caused by a desire to
create chaos (Petersen et al., 2018). Rather, the sharing practices students reported seem to point to the
importance of conviviality and community, as found in more traditional networks of orality, or the use of
earlier forms of ICTs to resist government control/abuse. As a Zambian graduate student put it,
This stems from our history, where the media was mostly used for entertainment, and the
elites used it for news. From colonial times to the most recent years, the media was for
status. Reading a newspaper was a sign of education and wealth. Most people listened to
radios, and usually for entertainment purposes. So, when social media was introduced,
the mentality did not change, except that this time the people were allowed to create
content. However, the content they create does not really reflect who they are, except
that they do it for entertainment.
Discussion and Conclusion
Building on previous research on the prevalence of misinformation in sub-Saharan Africa, this study
provides more depth to empirical data that showed online users on the continent are oftentimes contributors
to the spread of inaccurate information (Wasserman & Madrid-Morales, 2019). The first finding of the study
indicates that misinformation is experienced as a common occurrence, and is seen as a cause for concern.
As noted in the literature, age was a factor in the sharing of misinformation: Students apportioned blame
for this to an older generation of media users. Although this is just the perception of a young demographic
rather than a finding based on cross-generational sampling, this perception resonates with Guess and
colleagues’ (2019) study of the use of misinformation during the 2016 U.S. election, which found older
Americans to be more likely to share “fake news.”
A second finding was that young media consumers are discerning users who rely on various cues
to evaluate the veracity of information. Although our respondents were well-versed in the affordances of
social media, they did not report a general mistrust of established news media and indicated that established
sources would serve as benchmarks for evaluating the veracity of information. In this regard, our findings
deviated somewhat from some views expressed in the literature about the Global North (London School of
Economics and Political Science, 2020; Wagner & Boczkowski, 2019), which indicate that the use and
distribution of misinformation stem from a cynicism toward the media as a whole. In verbalizing sharing
practices, students in all six countries often spoke of the current media environment as one in which
discerning what is true and what is not is increasingly difficult. While students do not seem to distrust all
media or see state-owned media as the epitome of false information, there seems to be no single source
that is trusted by all (or most). Although media literacy as a concept was not familiar to most students, they
described behaviors and practices that could be viewed as applying media literacy skills (e.g., seeking out
additional sources, and verifying claims found on social media). Importantly, additional research needs to
determine what students in sub-Saharan Africa think media literacy is and what it ought to be. Little research
has examined this issue in this context, and grounded empirical research is needed.
Only a handful of the motivations for sharing misinformation found in the literature (Chadwick &
Vaccari, 2019; Chakrabarti et al., 2018; Duffy et al., 2019; Guess et al., 2019; Petersen et al., 2018) could
be matched to those provided by our respondents, and different motivations applied to politically and
nonpolitically motivated content. In all six countries, the sharing of health-related (mis)information (also
news about terrorism, political violence, and scams) was attributed to a sense of civic duty. This confirms
the social utility of information sharing noted in the literature. As previously found in Singapore (Tandoc et
al., 2020), and in Nigeria and Kenya (Chakrabarti et al., 2018), our respondents indicated the need to warn
others as a likely motivation for sharing.
Political motivations have often been highlighted as a reason for sharing misinformation, whether to
mobilize against a target group or to rail against the whole system (Petersen et al., 2018). Political orientation
has also been noted as a motivating factor for sharing misinformation (Jamieson & Albarracín, 2020). In our
study, the sharing of political news stories revealed differences across countries. A country’s political culture
and media system seemed to be linked to the way users interact with false information. In Zimbabwe, where
press freedom is weak and authoritarianism is still a reality, the sharing of political (mis)information was
presented as a courageous act, even if done in WhatsApp groups, where encryption is sophisticated. At the
other end, in South Africa and Kenya, both of which have a vibrant media sector and a (flawed) but functioning
democracy, students appeared to be the least motivated to share political news. South African and Kenyan
students seemed to be much less politically engaged than those in countries where participants said they would
share not only the stimulus we presented but also other similar stories about politics. Some participants said
they would do so because it could help them advance their political motives, while others suggested that their
goal when sharing political (mis)information is ridiculing those in power.
It is especially this latter practice that points to a gap in the most recent literature on
misinformation. The political use of humor we see in sharing practices differs in nature from the orchestrated
political campaigns described in the literature on misinformation elsewhere. Our findings seem better aligned
with the extensive literature on the political uses of satire in scholarship on African media and political
communication, especially in contexts where news media is repressed by the state or captured by elites.
This finding also emphasizes the need to better root studies on politically motivated disinformation within
the sub-Saharan African context. Though the boundaries between satire used for political ends and malicious
or misleading information may be nebulous, the long social history of such practices in Africa makes this an
important factor to consider. Given the entrenched role of satirical and humorous content in informal
networks of media use in Africa (Nyamnjoh, 2005; Willems, 2011), and the progressive uses to which these
types of intentionally false—albeit not misleading—content have been put, media users on the continent
might be less resistant to sharing information that they know is untrue.
References
Bigman, C. A., Smith, M. A., Williamson, L. D., Planey, A. M., & Smith, S. M. (2019). Selective sharing on
social media: Examining the effects of disparate racial impact frames on intentions to retransmit
news stories among US college students. New Media & Society, 21(11/12), 2691–2709.
doi:10.1177/1461444819856574
Chadwick, A., & Vaccari, C. (2019). News sharing on UK social media: Misinformation, disinformation, and
correction (03C1). Online Civic Culture Centre. Retrieved from
https://www.lboro.ac.uk/research/online-civic-culture-centre/news-events/articles/o3c-1-survey-report-news-sharing-misinformation/
Chakrabarti, S., Rooney, C., & Kweon, M. (2018). Verification, duty, credibility: Fake news and ordinary
citizens in Kenya and Nigeria. BBC. Retrieved from http://downloads.bbc.co.uk/mediacentre/bbc-fake-news-research-paper-nigeria-kenya.pdf
Duffy, A., Tandoc, E., & Ling, R. (2019). Too good to be true, too good not to share: The social utility of
fake news. Information, Communication & Society. Advanced online publication.
doi:10.1080/1369118X.2019.1623904
Dwyer, M., & Molony, T. (Eds.). (2019). Social media and politics in Africa: Democracy, censorship and
security. London, UK: Zed.
The Economist Intelligence Unit. (2020). Democracy index 2019: A year of democratic setbacks and
popular protest. Retrieved from https://www.eiu.com/topic/democracy-index
Electoral Commission of South Africa. (2019). Electoral Commission launches online reporting platform for
digital disinformation. Retrieved from https://www.elections.org.za/ieconline/Report-digital-disinformation
Goldstein, J., & Rotich, J. (2010). Digitally networked technology in Kenya’s 2007–08 post-election crisis.
In S. Ekine (Ed.), SMS uprising: Mobile activism in Africa (pp. 124–137). Cape Town, South
Africa: Pambazuka.
Guess, A., Nagler, J., & Tucker, J. (2019). Less than you think: Prevalence and predictors of fake news
dissemination on Facebook. Science Advances, 5(1), eaau4586. doi:10.1126/sciadv.aau4586
Jamieson, K. H., & Albarracín, D. (2020). The relation between media consumption and misinformation at
the outset of the SARS-CoV-2 pandemic in the US. Harvard Kennedy School Misinformation
Review, 1(2), 1‒22. doi:10.37016/mr-2020-012
Kamberelis, G., & Dimitriadis, G. (2013). Focus groups: From structured interviews to collective
conversations. London, UK: Routledge. doi:10.4324/9780203590447
Lewis, S. C., & Molyneux, L. (2018). A decade of research on social media and journalism: Assumptions,
blind spots, and a way forward. Media and Communication, 6(4), 11–23.
doi:10.17645/mac.v6i4.1562
London School of Economics and Political Science. (2020). Tackling the information crisis: A policy
framework for media system resilience. Retrieved from http://www.lse.ac.uk/media-and-communications/assets/documents/research/T3-Report-Tackling-the-Information-Crisis-v6.pdf
Mäkinen, M., & Kuira, M. W. (2008). Social media and postelection crisis in Kenya. The International
Journal of Press/Politics, 13(3), 328–335. doi:10.1177/1940161208319409
Mano, W. (2007). Popular music as journalism in Zimbabwe. Journalism Studies, 8(1), 61–78.
doi:10.1080/14616700601056858
Mare, A. (2020). Popular communication in Africa: An empirical and theoretical exposition. Annals of the
International Communication Association, 44(1), 81–99. doi:10.1080/23808985.2019.1623060
Moehler, D. C., & Singh, N. (2011). Whose news do you trust? Explaining trust in private versus public
media in Africa. Political Research Quarterly, 64(2), 276–292. doi:10.1177/1065912909349624
Mutsvairo, B., & Bebawi, S. (2019). Journalism educators, regulatory realities, and pedagogical
predicaments of the “fake news” era: A comparative perspective on the Middle East and Africa.
Journalism & Mass Communication Educator, 74(2), 143–157. doi:10.1177/1077695819833552
Nyamnjoh, F. B. (2005). Africa’s media, democracy, and the politics of belonging. London, UK: Zed.
Ogola, G. O. (2010). “If you rattle a snake, be prepared to be bitten”: Popular culture, politics and the
Kenyan news media. In H. Wasserman (Ed.), Popular media, democracy and development in
Africa (pp. 123–136). London, UK: Routledge.
Okoro, N., & Emmanuel, N. O. (2018). Beyond misinformation: Survival alternatives for Nigerian media in
the “post-truth” era. African Journalism Studies, 39(4), 67–90.
doi:10.1080/23743670.2018.1551810
Orji, N. (2019). Social media and elections in Nigeria: Digital influence on election observation,
campaigns, and administration. In M. Dwyer & T. Molony (Eds.), Social media and politics in
Africa: Democracy, censorship and security (pp. 152–172). London, UK: Zed.
Petersen, M. B., Osmundsen, M., & Arceneaux, K. (2018). The “need for chaos” and the sharing of hostile
political rumors in advanced democracies. PsyArXiv Preprints. doi:10.31234/osf.io/6m4ts
Reporters Without Borders. (2020). 2020 world press freedom index. Retrieved from
https://rsf.org/en/ranking
Roper, C. (2019). South Africa. In N. Newman, R. Fletcher, A. Kalogeropoulos, & R. K. Nielsen (Eds.),
Reuters Institute digital news report 2019 (pp. 148–149). Retrieved from
https://reutersinstitute.politics.ox.ac.uk/sites/default/files/2019-06/DNR_2019_FINAL_0.pdf
Shoki, W. (2020, January 15). On conspiracy theories. Retrieved from
https://africasacountry.com/2020/01/on-conspiracy-theories
Sterrett, D., Malato, D., Benz, J., Kantor, L., Tompson, T., Rosenstiel, T., . . . & Loker, K. (2019). Who
shared it? Deciding what news to trust on social media. Digital Journalism, 7(6), 783–801.
doi:10.1080/21670811.2019.1623702
Tandoc, E. C. (2019). The facts of fake news: A research review. Sociology Compass, 13(9), e12724.
doi:10.1111/soc4.12724
Tandoc, E. C., Lim, D., & Ling, R. (2020). Diffusion of disinformation: How social media users respond to
fake news and why. Journalism, 21(3), 381–398. doi:10.1177/1464884919868325
Tully, M., & Ekdale, B. (2014). Sites of playful engagement: Twitter hashtags as spaces of leisure and
development in Kenya. Information Technologies & International Development, 10(3), 67–82.
doi:10.1057/9781137404299_6
Udupa, S., & McDowell, S. D. (Eds.). (2017). Media as politics in South Asia. London, UK: Routledge.
doi:10.4324/9781315267159
UNESCO. (2018). Journalism, “fake news” and disinformation. Paris, France: Author.
Uzuegbunam, C. E. (2020). A critical analysis of transgressive user-generated images and memes and
their portrayal of dominant political discourses during Nigeria’s 2015 general elections. In M. N.
Ndlela & W. Mano (Eds.), Social media and elections in Africa (Vol. 2, pp. 223–243). Cham,
Switzerland: Palgrave Macmillan. doi:10.1007/978-3-030-32682-1_12
Valenzuela, S., Halpern, D., Katz, J. E., & Miranda, J. P. (2019). The paradox of participation versus
misinformation: Social media, political engagement, and the spread of misinformation. Digital
Journalism, 7(6), 802–823. doi:10.1080/21670811.2019.1623701
Wagner, M. C., & Boczkowski, P. J. (2019). The reception of fake news: The interpretations and practices
that shape the consumption of perceived misinformation. Digital Journalism, 7(7), 870–885.
doi:10.1080/21670811.2019.1653208
Wahutu, J. S. (2019). Fake news and journalistic “rules of the game.” African Journalism Studies, 40(4),
13–26. doi:10.1080/23743670.2019.1628794
Wasserman, H. (2020). Fake news from Africa: Panics, politics and paradigms. Journalism, 21(1), 3–16.
doi:10.1177/1464884917746861
Wasserman, H., & Madrid-Morales, D. (2019). An exploratory study of “fake news” and media trust in
Kenya, Nigeria and South Africa. African Journalism Studies, 40(1), 107–123.
doi:10.1080/23743670.2019.1627230
Willems, W. (2011). Comic strips and “the crisis”: Postcolonial laughter and coping with everyday life in
Zimbabwe. Popular Communication: The International Journal of Media and Culture, 9(2), 126–
145. doi:10.1080/15405702.2011.562099