Filter Bubbles and Fake News
The results of the 2016 Brexit referendum in the U.K. and the presidential election in the U.S.
surprised pollsters and traditional media, and social media is now being blamed in part for
creating echo chambers that encouraged the spread of the fake news that influenced voters.
By Dominic DiFranzo and Kristine Gloria-Garcia
The EU referendum in the U.K. and the U.S. presidential election both shocked journalists,
pollsters, and citizens around the world. The outcomes—the U.K. voting to leave the EU and
Donald Trump winning the presidency—raise the question of how traditional media and polls
could have been so wrong in their predictions. While plenty of fingers have pointed to
outside interference, changing demographics, and economic concerns, one scapegoat—social
media—has received special attention. Some critics place the fault with Facebook, Google, and
other social media platforms for allowing the spread of “fake news,” pointing to, for example,
the creation and facilitation of echo chambers, where users are not exposed to outside opinions
and views. According to The New York Times, following Trump’s victory, executives at
Facebook began to privately discuss the role their company and platform had played in the
election. However, Facebook CEO Mark Zuckerberg downplayed the company’s role, saying,
“Voters make decisions based on their lived experience,” and that the theory that fake news
shared on Facebook “influenced the election in any way, is a pretty crazy idea.”
So we ask, did social media play a role in these election upsets? In this review, we examine
whether social media really played a role, if fake news and filter bubbles had an effect, and if so,
what can be done about it in the future.
Social Media Effect
Social media trends (in terms of, say, numbers of posts, shares, and likes) on the day of the
elections favored both Brexit and Trump, counter to the narrative of reputable polling and
traditional media. Trump had more followers than Clinton across social media platforms,
pushed his messages through social media rather than traditional media channels (and continues
to do so), and had higher engagement rates than his opponent. One of the most shared
posts on social media leading to the election was “Why I’m Voting For Donald Trump,” a blog
post by a well-known conservative female blogger. The trend was similar during the EU
referendum in the U.K. The official “Leave” campaign had more followers and engagement on
social media platforms, as well as more success in spreading pro-Leave hashtags and messages.
Moreover, according to Pew Research, 61% of millennials use Facebook as their primary source
for political news. Such news stories have been shown to have a direct effect on political
actions, attitudes, and outcomes. In 2012, a study reported in Nature described a randomized
controlled trial of political mobilization messages delivered to 61 million Facebook users during
the 2010 U.S. congressional elections. It found the messages directly influenced political
self-expression, information seeking, and real-world voting behavior. This is not surprising, as
the dominant funding strategy of most social media platforms rests on the assumption that
sponsored social media posts, or advertisements, can change the buying behavior of their users. In
another example, the past decade has seen a rise in political movements (such as the Arab Spring
and Black Lives Matter) that start on and are sustained through social media platforms like
Twitter and Facebook.
Hampton and Hargittai showed evidence that the demographic most disconnected from social
media and the web were also the most likely to be Trump supporters. Voters without a college
degree supported Trump by a nine-percentage-point margin, whereas in past elections they were
equally likely to support Democrats as Republicans. These gaps widen greatly when pollsters
look at white non-college-educated voters, who supported Trump by 39 percentage points, the
largest margin of support from this demographic since 1980. This group in particular, white
voters without a college degree, was also the most likely to lack Internet access and, among
those with access, the least likely to use social media.
Additionally, research from Pew Research shows that while millennials (people born 1977 to
1995) get their political news from social media platforms, most Americans still rely on their
local TV news stations and other traditional mass-media sources. These are the same
mass-media sources where Trump received significantly more attention and coverage compared to
Hillary Clinton. And while social-media-savvy millennials overwhelmingly supported
Clinton, voter turnout from this demographic was lower than in both the 2008 and 2012
presidential elections. Further data analysis shows Clinton supporters were most likely to be
engaged on social media platforms like Twitter and Reddit.
Why, then, did a first-pass look at social data trends show more support for Trump than for
Clinton if social media users seemed to support Clinton more? The answer is still being
investigated, but
one answer may be in the use of botnets. Two recent studies—one from researchers at the
University of Southern California and another from Oxford University, the University of
Washington, and Corvinus University of Budapest—both showed AI-controlled bots were
spreading pro-Trump content in overwhelming numbers. Kollanyi, Howard, and Woolley
from Oxford University estimate that one-third of pro-Trump tweets came from automated bots,
which they classified by how often these accounts tweet, at what time of day, and their relation
to other accounts. This created the illusion of more support for Trump on Twitter than there may
have been naturally [10,11]. Kollanyi, Howard, and Woolley noted similar automated
patterns on Twitter leading up to the EU referendum in the U.K., in which pro-Leave tweets
greatly outnumbered pro-Stay tweets.
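The behavioral signals these studies describe, posting rate, time-of-day regularity, and ties to other accounts, lend themselves to simple heuristics. Here is a minimal sketch in Python; the account fields and thresholds are invented for illustration and are not those used in the cited studies:

```python
from dataclasses import dataclass

@dataclass
class Account:
    tweets_per_day: float  # average posting rate
    active_hours: set      # hours of day (0-23) in which the account posts

def likely_bot(acct: Account,
               rate_threshold: float = 50.0,
               min_quiet_hours: int = 4) -> bool:
    """Flag accounts that post at superhuman rates or never 'sleep'.

    Both thresholds are illustrative; real detectors combine many more
    signals, including the account's position in the social graph.
    """
    never_sleeps = (24 - len(acct.active_hours)) < min_quiet_hours
    return acct.tweets_per_day > rate_threshold or never_sleeps

# A round-the-clock, high-volume account trips both heuristics;
# a daytime account with a modest rate trips neither.
bot = Account(tweets_per_day=400, active_hours=set(range(24)))
human = Account(tweets_per_day=12, active_hours=set(range(8, 23)))
print(likely_bot(bot), likely_bot(human))  # → True False
```

In practice such rules are only a first filter; services like BotOrNot, discussed later, layer classifiers over many such features.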
Another criticism of social media is that it constructs “filter bubbles,” digital echo chambers
where users see content and posts that agree only with their preexisting beliefs. While there
is an active dialogue as to whether filter bubbles exist, here we highlight work that explores
whether they contributed to the 2016 election results.
In 2015, Facebook funded a study that showed that while Facebook’s own newsfeed algorithm
might favor posts that support a user’s political beliefs, the related filter-bubble effect is due to
the user’s network and past engagement behavior (such as a clicking only on certain news
stories); that is, it is not the fault of the newsfeed algorithm but the choices of users themselves.
They also found that this favoritism effect is small overall. The study showed users are only 6%
less likely to see a post that conflicts with their political views than they would be with an
unfiltered newsfeed.
Personal recommendation systems, or systems that learn and react to individual users, have been
claimed to be one cause of filter bubbles. Other studies have shown that personalized
recommendations can actually expose users to content they might not have found otherwise
and that personalized recommendations are not used as extensively as once thought. A 2011
national survey by Pew Research found Facebook use is actually correlated with knowing and
interacting with a greater variety of people from different backgrounds and demographics. This
correlation persists even after controlling for the demographic characteristics of Facebook users
compared to the U.S. population as a whole. In a sense, social media may actually be bursting filter
bubbles. This same survey showed that people who are offline are more likely to be socially
isolated and have less diverse social relationships, thereby being exposed to less-diverse ideas
and viewpoints.
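The mechanism at issue in this debate, past engagement steering future rankings, can be sketched in a few lines of Python. The stories and the scoring rule below are invented for illustration; no real platform is this simple:

```python
from collections import Counter

def rank_feed(stories, click_history):
    """Rank candidate stories by the user's past-click topic affinity.

    A toy model of engagement-driven personalization: stories on topics
    the user has clicked before float to the top of the feed.
    """
    affinity = Counter(click_history)
    return sorted(stories, key=lambda s: affinity[s["topic"]], reverse=True)

stories = [
    {"title": "Budget vote passes", "topic": "politics"},
    {"title": "New exoplanet found", "topic": "science"},
    {"title": "Cup final recap", "topic": "sports"},
]

# Simulate a user who always clicks the top story in the feed.
history = ["politics"]
for _ in range(5):
    feed = rank_feed(stories, history)
    history.append(feed[0]["topic"])  # click the top item

print(history)  # the loop only ever reinforces the initial preference
```

Even this toy loop never surfaces the other topics again after the first click, which is exactly the feedback dynamic the filter-bubble critique targets; the empirical question is how strongly real systems, with far richer signals, behave this way.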
While these studies are compelling, evidence of filter bubbles and their effect on users
continues to grow. For example, several scholars have criticized the 2015 Facebook newsfeed
study cited earlier. Specifically, Zeynep Tufekci rebutted many of the study’s findings and
methods, arguing that it underplayed its most important conclusion: that the newsfeed algorithm
decides the placement of posts and that this placement greatly influences what users
click and read. Tufekci also highlighted that the sampling was not random and thus cannot be
generalized across all Facebook users. Even if one takes the Facebook study at face value, it still
shows this filter bubble effect is real and the algorithm actively suppresses posts that conflict
with a user’s political viewpoint. Other recent studies (such as Del Vicario et al. on the
sharing of scientific and conspiracy stories on Facebook) found evidence of the formation of
echo chambers that cause “confusion about causation, and thus encourage speculation, rumors, and mistrust.”
On the theme of “speculation, rumors, and mistrust,” fake news is another issue that has plagued
social media platforms during, as well as after, the U.S. elections. Fake news is a recent popular
and purposefully ambiguous term for false news stories that are packaged and published as if
they were genuine. The ambiguity of this term—an inherent property of what it tries to label—
makes its use attractive across the political spectrum, where information that conflicts with an
ideology can be labeled “fake.” The New York Times published an article chronicling the spread
of fake news on social media, reporting that one such fake story was shared at least 16,000 times
on Twitter and more than 350,000 times on Facebook. According to BuzzFeed, in the months
before the U.S. elections, fake news stories on Facebook actually outperformed real news from
mainstream news outlets. BuzzFeed said these fake news stories overwhelmingly favored
Trump. For example, a fake news story reporting that Pope Francis endorsed Trump was shared
more than one million times on social media feeds. Pope Francis, an advocate for refugees, made
no such endorsement. Not only were these fake news stories shared on social media platforms,
they were also shared by Trump and members of his campaign.
Fake news stories have a real effect offline as well. A shooting took place in a Washington, D.C.,
pizzeria, Comet Ping Pong, after fake news stories and conspiracy theories spread about it being
part of a child-trafficking ring. Army Lt. Gen. Michael Flynn, Trump’s current National
Security Adviser, shared fake news stories related to this so-called “Pizzagate” scandal more than
16 times, according to a Politico review of his Twitter posts.
Although fake news may be a problem (though not unique to social media), its dissemination
may not break out of the filter bubble at its point of origin. Several studies have shown the
spread of fake news resembles an epidemic more than the spread of real news does and that such
stories usually stay within the same communities; that is, these stories tend not to reach or convince
outsiders. Likewise, Mark Zuckerberg said in a Facebook post following the U.S. election, “Of
all the content on Facebook, more than 99% of what people see is authentic. Only a very small
amount is fake news and hoaxes. Overall, this makes it extremely unlikely hoaxes changed the
outcome of this election in one direction or the other.” He did not provide any data or
evidence to back this claim.
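The epidemic analogy in this line of research can be made concrete with a textbook SIR (susceptible-infected-recovered) model. This is a generic sketch of the idea only; Jin et al. fit a richer compartmental model to real Twitter data, and the rates below are arbitrary:

```python
def simulate_sir(beta, gamma, s0=0.99, i0=0.01, steps=100):
    """Discrete-time SIR model of story spread.

    Susceptible users 'catch' the story (share it) at rate beta per
    contact, and actively-sharing users lose interest at rate gamma.
    Returns the fraction of the population the story ever reached.
    """
    s, i, r = s0, i0, 0.0
    for _ in range(steps):
        new_i = beta * s * i   # fresh shares this step
        new_r = gamma * i      # sharers who move on
        s, i, r = s - new_i, i + new_i - new_r, r + new_r
    return r + i  # everyone who ever shared the story

# A story whose transmission rate barely exceeds the loss-of-interest
# rate stays confined; one that spreads readily saturates the network.
print(round(simulate_sir(beta=0.1, gamma=0.1), 3))
print(round(simulate_sir(beta=0.4, gamma=0.1), 3))
```

The qualitative point carries over to the studies above: whether a story escapes its originating community depends on its effective transmission rate across community boundaries, not just on raw share counts inside the bubble.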
As the dust continues to settle from the elections, research into the role of social media and
digital media platforms as key influencers will continue. We acknowledge that analysts are
unlikely ever to reach a commonly accepted explanation for the election outcomes. However, it
is imperative to acknowledge the need to study such potential causes and effects. Having laid out
the potential research questions, we now turn to possible solutions.
Even if fake news and filter bubbles did affect the U.S. presidential election, social media
platforms like Facebook and Google are exploring ways to reduce such influences on their
platforms. In November 2016, both Google and Facebook announced
the banning of websites that publish fake news from their advertising networks, effectively
killing the revenue stream of these sites. Facebook has also created new tools to flag fake
content and is partnering with third-party fact-checking organizations like Snopes and PolitiFact.
Facebook is also developing better automatic fake-news-detection systems that will limit
the spread of such content.
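At its simplest, such a detector is a text classifier. The following is a toy Naive Bayes sketch over a four-headline corpus invented for illustration; it bears no relation to Facebook's actual systems, which are not public:

```python
import math
from collections import Counter

# Tiny labeled corpus (invented examples, for illustration only).
TRAIN = [
    ("pope endorses candidate shocking secret", 1),  # 1 = fake
    ("you wont believe this miracle cure", 1),
    ("senate passes budget bill after debate", 0),   # 0 = genuine
    ("central bank holds interest rates steady", 0),
]

def train_nb(examples):
    """Fit word-level Naive Bayes: per-class word counts for P(word | class)."""
    counts = {0: Counter(), 1: Counter()}
    for text, label in examples:
        counts[label].update(text.split())
    vocab = set(counts[0]) | set(counts[1])
    totals = {c: sum(counts[c].values()) for c in counts}
    return counts, totals, vocab

def score_fake(text, model):
    """Log-odds that the headline is fake, with add-one smoothing."""
    counts, totals, vocab = model
    log_odds = 0.0
    for w in text.split():
        p_fake = (counts[1][w] + 1) / (totals[1] + len(vocab))
        p_real = (counts[0][w] + 1) / (totals[0] + len(vocab))
        log_odds += math.log(p_fake / p_real)
    return log_odds

model = train_nb(TRAIN)
print(score_fake("shocking secret miracle", model) > 0)  # → True (leans fake)
print(score_fake("senate budget debate", model) > 0)     # → False (leans genuine)
```

Production systems add many non-textual signals (source reputation, sharing patterns, user reports), but the core idea, scoring content against labeled examples, is the same.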
Researchers and software developers have been looking into tools to help break out of filter
bubbles, including filtering algorithms and user interfaces that give users better control and
allow more diversity. Other tools (such as the browser plugin Ghostery and the search engine
DuckDuckGo) help anonymize users’ actions online, thus disabling the tracking that drives
personalization.
Bot and spam detection is another major area of research. Many social media platforms already
use a range of tools, from machine learning to social network analysis, to detect and stop bots.
Independent groups and researchers have also developed tools to detect bots; for example,
researchers at Indiana University have developed BotOrNot (http://truthy.indiana.edu/botornot/),
a service that allows anyone to check if a particular Twitter user is a bot.
In addition to technical enhancements and design choices, what other avenues, even public
policymaking, are available for combating these issues? This may be a particularly difficult
question in the U.S. due to free-speech protections under the First Amendment of the U.S.
Constitution. We already see legal and political tensions as Twitter implements internal policies
for flagging hate speech and closing specific accounts. Others have suggested renewed
investment in media- and civic-literacy initiatives to help users discern trustworthy, genuine
news sources.
These issues of fake news and filter bubbles are vague and nuanced, and they predate social media, with
no easy solution, but it is vital that researchers continue to explore and investigate them from
diverse technical and social perspectives. Their skills, knowledge, and voices are needed now
more than ever, and for the good of all, to address them.
References
1. Isaac, M. Facebook, in cross hairs after election, is said to question its influence. The New York Times (Nov. 12, 2016); https://www.nytimes.com/2016/11/14/technology/facebook-is-said-
2. Isaac, M. and Ember, S. For election day influence, Twitter ruled social media. The New York Times (Nov. 8, 2016); http://www.nytimes.com/2016/11/09/technology/for-election-day-chatter-
3. Kokalitcheva, K. Mark Zuckerberg says fake news on Facebook affecting the election is a ‘crazy idea.’ Fortune (Nov. 11, 2016); http://fortune.com/2016/11/11/facebook-election-fake-
4. El-Bermawy, M.M. Your filter bubble is destroying democracy. Wired (Nov. 18, 2016);
5. Sigdyal, P. and Wells, N. At least on Twitter, the ‘leave’ Britain vote has the edge: Twitter users scream ‘leave’ in Brexit vote, but ‘remain’ gains ground (June 23, 2016);
6. Mitchell, A., Gottfried, J., and Matsa, K.E. Facebook top source for political news among millennials. Pew Research Center (June 1, 2015);
7. Bond, R.M., Fariss, C.J., Jones, J.J., Kramer, A.D.I., Marlow, C., Settle, J.E., and Fowler, J.H. A 61-million-person experiment in social influence and political mobilization. Nature 489, 7415 (Sept. 2012), 295–298.
8. Taylor, D.G., Lewin, J.E., and Strutton, D. Friends, fans, and followers: Do ads work on social networks? Journal of Advertising Research 51, 1 (2011), 258–275.
9. Hampton, K. and Hargittai, E. Stop blaming Facebook for Trump’s election win. The Hill (Nov. 2016); http://thehill.com/blogs/pundits-blog/presidential-campaign/307438-stop-blaming-
10. Fields, J., Sengupta, S., White, J., Spetka, S. et al. Botnet campaign detection on Twitter. Master of Science thesis, Department of Computer Sciences, SUNY Polytechnic Institute, Utica, NY, 2016;
11. Kollanyi, B., Howard, P.N., and Woolley, S.C. Bots and automation over Twitter during the third U.S. presidential debate. Political Bots (Oct. 31, 2016); http://politicalbots.org/wp-
12. Pariser, E. The Filter Bubble: How the New Personalized Web Is Changing What We Read and How We Think. Penguin, New York, 2011.
13. Bakshy, E., Messing, S., and Adamic, L. Exposure to ideologically diverse news and opinion on Facebook. Science 348, 6239 (2015), 1130–1132.
14. Hosanagar, K., Fleder, D., Lee, D., and Buja, A. Will the global village fracture into tribes? Recommender systems and their effects on consumer fragmentation. Management Science 60, 4 (2014).
15. Weisberg, J. Bubble trouble: Is Web personalization turning us into solipsistic twits? Slate (June 10, 2011);
16. Hampton, K., Sessions Goulet, L., Rainie, L., and Purcell, K. Social networking sites and our lives. Pew Research Center (2011); http://www.pewinternet.org/2011/06/16/social-networking-sites-
17. Tufekci, Z. How Facebook’s algorithm suppresses content diversity (modestly) and how the newsfeed rules the clicks. Medium (2015); https://medium.com/message/how-facebook-s-
18. Del Vicario, M., Bessi, A., Zollo, F., Petroni, F., Scala, A., Caldarelli, G., Stanley, H.E., and Quattrociocchi, W. The spreading of misinformation online. Proceedings of the National Academy of Sciences 113, 3 (2016), 554–559.
19. Maheshwari, S. How fake news goes viral: A case study. The New York Times (Nov. 20, 2016);
20. Silverman, C. This analysis shows how viral fake election news stories outperformed real news on Facebook. BuzzFeed (Nov. 16, 2016); https://www.buzzfeed.com/craigsilverman/viral-
21. Kang, C. Fake news onslaught targets pizzeria as nest of child-trafficking. The New York Times (Nov. 21, 2016); http://www.nytimes.com/2016/11/21/technology/fact-check-this-
22. Bender, B. and Hanna, A. Flynn under fire for fake news. Politico (Dec. 5, 2016);
23. Jin, F., Dougherty, E., Saraf, P., Cao, Y., and Ramakrishnan, N. Epidemiological modeling of news and rumors on Twitter. In Proceedings of the Seventh Workshop on Social Network Mining and Analysis (2013). ACM Press, New York, Article No. 8.
24. Kottasova, I. Facebook and Google to stop ads from appearing on fake news sites. CNN (Nov. 15, 2016); http://money.cnn.com/2016/11/15/technology/facebook-google-fake-news-
25. Heath, A. Facebook is going to use Snopes and other fact-checkers to combat and bury ‘fake news.’ Business Insider (Dec. 15, 2016); http://www.businessinsider.com/facebook-will-fact-
26. Resnick, P., Kelly Garrett, R., Kriplean, T., Munson, S.A., and Jomini Stroud, N. Bursting your (filter) bubble: Strategies for promoting diverse exposure. In Proceedings of the 2013 Conference on Computer Supported Cooperative Work (San Antonio, TX, Feb. 23–27). ACM Press, New York, 2013, 95–100.
Dominic DiFranzo is a post-doctoral associate in the Social Media Lab at Cornell
University, Ithaca, NY; he holds a Ph.D. in computer science from Rensselaer
Polytechnic Institute, Troy, NY, and was a member of the Tetherless World Constellation.
Kristine Gloria-Garcia joined the Aspen Institute Communications and Society Program
as a project manager in September 2016; previously, she served as a visiting
researcher at the Internet Policy Research Initiative at MIT, Cambridge, MA, and as a
privacy research fellow at the Startup Policy Lab. She holds a Ph.D. in cognitive science
from Rensselaer Polytechnic Institute, Troy, NY, and a master’s in media studies from
the University of Texas at Austin.