The filter bubble: a perspective for information behaviour research

Sabina Cisek
Monika Krakowska
Jagiellonian University in Kraków
- Why do we discuss the concept and phenomenon of filter bubbles?
- Filter bubble and information behaviour: the current state of research
- The filter bubble: definition, genesis, characteristics and widening of the concept
- The psychological dimension
- The socio-cultural dimension
- The epistemological/methodological dimension
- How to burst your filter/epistemic bubble?
- Conclusions and ideas
- Selected literature
Our main objectives
- To discuss the concept of the filter bubble and its various dimensions: psychological, socio-cultural, epistemological.
- To show that this notion might be a fruitful methodological/theoretical framework for information behaviour research, to be used for:
  - disclosing unknown aspects of information behaviour,
  - organizing/seeing in a new way what we already know about human information behaviour.
We do not cover all possible aspects of filter bubble related issues. We ask questions rather than provide answers.
- It concerns billions of people and has not only cognitive/epistemic aspects but also moral, political, social ones …
- Fake news, manipulation
- Our own (bad) experiences with social media
- There is not much research on this problem, but we think it may be interesting and scholarly fruitful.
1.47 billion daily active users on Facebook on average for June 2018
2.23 billion monthly active users on Facebook as of June 30, 2018 (Facebook Stat, 2018)
It may be assumed that over 2 billion users of social media (Facebook) are exposed to personalised messages, manipulation and fake news.
Example from the scientific perspective: the danger of the anti-vaccination movement (Jaret, Peter (2016). The danger of the anti-vaccination movement).
Example of fake news: "Did this boy really sleep between his parents' graves?" In fact, it was Abdul Aziz Al-Otaibi's artistic project that went viral through many social media channels (Hooton, 2014). This also shows the political dimension of filter bubbles.
Scholarly publications on filter bubbles [1]
- Google Scholar: about 5,430 results
- Scopus (in Title-Abstract-Keywords): 21 results, from years 2011-2018
- Web of Science (all databases), in the Topic category: 17 results, from years 2011-2018
- Wiley Online Library: 40 results, from years 2011-2018
- LISTA (Library, Information Science and Technology Abstracts): 13 results, from years 2011-2018
Query: "filter bubble" AND internet, September 2018
Scholarly publications on filter bubbles [2]
- Google Scholar: about 135 results
- Scopus (in Title-Abstract-Keywords): 0 results
- Web of Science (all databases), in the Topic category: 0 results
- Wiley Online Library: 2 results, from years 2012-2018
- LISTA (Library, Information Science and Technology Abstracts): 1 result (Tran, Yerbury, 2015)
Query: "filter bubble" AND ("information behavior" OR "information behaviour"), September 2018
The concept and theory of the filter bubble and related ideas/notions have to be carefully analysed to become useful for information behaviour research, because in popular conversation they are frequently approached emotionally, without deeper scholarly reflection, especially in the political context.
What is a filter bubble in its narrow, "algorithmic" sense? [1]
The term "filter bubble" was coined by Eli Pariser in his book The filter bubble: what the internet is hiding from you (2011), cited 2,886 times (Google Scholar). But the problem itself had been discussed earlier (e.g. Sunstein, 2001; 2007).
What is a filter bubble in its narrow, "algorithmic" sense? [2]
A filter bubble is the intellectual isolation that can occur when websites make use of algorithms to selectively assume the information a user would want to see, and then give information to the user according to this assumption. Websites make these assumptions based on the information related to the user, such as former click behavior, browsing history, search history and location. For that reason, the websites are more likely to present only information that will abide by the user's past activity.
A filter bubble, therefore, can cause users to get significantly less contact with contradicting viewpoints, causing the user to become intellectually isolated. Personalized search results from Google and the personalized news stream from Facebook are two perfect examples of this phenomenon.
How are filter bubbles created?
- Websites (search engines, social media) need satisfied clients.
- Users want "nice", easy-to-get content (Principle of Least Effort).
- Features of the contemporary information ecosystem: enormous chaos, information overload, manipulation and fake content, rapid changes of content.
- Reinforcement: a feedback loop between users and personalising algorithms.
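The reinforcement loop sketched above can be illustrated with a toy simulation (a minimal sketch; the topic names, feed size and click model are our illustrative assumptions, not data from the presentation): a naive ranker favours whatever the user clicked before, and a least-effort user clicks only what the feed offers, so most topics never surface.

```python
import random

# Toy model of the user-algorithm feedback loop. All names and
# numbers here are illustrative assumptions, not empirical data.
random.seed(42)

TOPICS = ["politics", "science", "sport", "culture", "tech"]

def personalised_feed(click_history, k=3):
    """A naive personalising ranker: order topics by past click
    counts (ties broken by list order) and show only the top k."""
    counts = {t: click_history.count(t) for t in TOPICS}
    return sorted(TOPICS, key=lambda t: -counts[t])[:k]

def simulate(rounds=50):
    """The user follows the Principle of Least Effort: each round
    they click a random item from whatever the feed offers, and
    the feed adapts to that click."""
    history = [random.choice(TOPICS)]      # a single initial click
    for _ in range(rounds):
        feed = personalised_feed(history)
        history.append(random.choice(feed))
    return history

history = simulate()
# Topics outside the feed never receive clicks, so they never rise
# in the ranking: the user's world stays confined to the first feed.
print(sorted(set(history)), "out of", len(TOPICS), "topics ever seen")
```

With five available topics and a feed of three, the simulated user never encounters more than three topics however long the loop runs; the invisibility of the missing topics is exactly the narrow, "algorithmic" filter bubble.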
The three main questions
1) Do filter bubbles actually exist? If yes, are they important? Do personalising algorithms really have deleterious effects?
2) The prevailing opinion is that filter bubbles are bad. But why is getting personalised, tailored information considered wrong?
3) Is it a completely new problem/situation?
1) Voices against the existence/importance of filter bubbles or harmful effects of personalisation by algorithms
"Within the population under study here, individuals' choices (…) more than algorithms (…) limit exposure to attitude-challenging content in the context of Facebook. Despite the differences in what individuals consume across ideological lines, our work suggests that individuals are exposed to more cross-cutting discourse in social media (…)" (Bakshy, Messing, Adamic, 2015).
"We distinguish between self-selected personalisation, where people actively choose which content they see, and pre-selected personalisation, where algorithms personalise content for users without any deliberate user choice. (…) We conclude that in spite of the serious concerns voiced at present, there is no empirical evidence that warrants any strong worries about filter bubbles" (Borgesius et al., 2016).
"We conducted two exploratory studies to test the effect of both implicit and explicit personalization on the content and source diversity of Google News. Except for small effects of implicit personalization on content diversity, we found no support for the filter-bubble hypothesis" (Haim, Graefe, Brosius, 2018).
2) Why are filter bubbles (as "products" of personalisation) harmful?
- Algorithms are content/information gatekeepers and censors. They hinder:
  - access to content: data, documents, news, resources,
  - awareness that there are different/other opinions and worldviews on known issues,
  - awareness of the mere existence of some issues, problems, questions.
- Personalisation is done by algorithms, not by experts. Algorithms are not based on ethical principles.
- Filter bubbles are invisible (implicit personalisation). Users do not know that the information they get is personalised. They may assume it is complete and objective.
- Filter bubbles are involuntary.
- Filter bubbles contribute to the creation of echo chambers (and those cause political and social problems).
3) Is this a completely new situation?
People have always experienced filter/epistemic bubbles, and there have always been information gatekeepers: families (parents!), political powers, religions, social groups … And in many cases those bubbles have been invisible and involuntary.
In addition, we need to filter/select content to get relevant, reliable, useful information.
Widening of the filter bubble concept [1]
Three "layers" of filter bubbles: users lack knowledge about
- the existence, way of working, or effects of filtering algorithms and personalisation (Facebook, Google, …),
- more advanced search techniques within major search engines,
- the mere existence of (1) other search services and (2) the Deep Web.
Widening of the filter bubble concept [2]
Filter bubbles and epistemic bubbles: information behaviour/seeking is not only technology-driven and affected solely by cognitive factors. It also has biological, political, psychological, socio-cultural, and time-space determinants.
- Filter bubble (Pariser, 2011): cognitive/epistemic content, as well as attitudes, emotions, people and social relations, filtered by the websites' ways of working, in particular by personalising algorithms.
- Epistemic bubble (Nguyen, 2018): content filtered by biology, culture, history (authorities, paradigms, social groups, traditions …).
The formation of filter/information bubbles may be related to the emotional interactions of internal cognitive structures with the real world = a relationship with mental models (Craik's concept).
Emotions and motivations are incentives, but also regulators of mental processes that influence the assessment of an event, situation, context or object, and thus imply (or not) favourable, indifferent or unfavourable behaviour (Lazarus and James-Lange theories of emotions).
In biology, psychology and the cognitive sciences, all human reactions to the outside world refer to the evolutionary mechanism of coping with and adapting to ecosystems. This mechanism is a combination of the endocrine system, the nervous system and an individual phenotype that also affects the construction of the so-called emotional brain (LeDoux's theory).
Codification of aspirations to regulate and control reactions and actions in accordance with human intention = filter bubbles deepen the effect of self-satisfaction.
The filter/information bubble as a reaction to excessive psychological costs, the need for self-protection and low self-efficacy.
Evolutionary information behaviour model
Negative aspects
- Creating a misleading and erroneous image of reality and of the individual mental model = closure in a limited, hermetic circle of information, opinions, views, worldviews, limiting the acquisition of knowledge
- Confirmation bias and cognitive bias formation
- Promoting intellectual and emotional laziness = because the bubble does not expose you to cognitive dissonance and intolerance of uncertainty
- Not developing collective emotions = because it is "better" to experience affective emotions (safe and often positive) of "being in the same boat"
Positive aspects
The filter bubble:
- aims to hedge against information chaos and overload (safety),
- has an impact on emotional well-being and the reduction of excessive psychological costs by constructing a subjective information space.
According to social psychology, positive cognitive stimulation of users, an affirmative mood and beneficial affects increase the expressiveness of cognitive and informational activities and processes, and have an impact on the growth of creativity, the understanding of relationships, the hierarchizing of tasks, individual involvement and informational behaviour.
- Echo chambers = diffused or homogeneous user groups duplicate and strengthen beliefs and knowledge within the group; there is a reluctance to consider alternatives to preferred views (tunnel vision)
- The strength of weak ties theory (Granovetter) in information diffusion
- Group polarization = the tendency to form very extreme positions and to adopt extreme attitudes within the group
- The concept of persuasive argumentation = users seek support and argumentation for the opinions they hold, which causes polarization; and the concept of social comparisons = adaptation of individual views to the extreme, and image building
- Can hinder rather than facilitate "adding diversity to our lives"
- Small worlds theory and life in the round theory (Chatman) = striving for participation in normative collectives due to their attractiveness and benefits, and intensification of beliefs
- Can intensify the fear of rejection
- Loss of the ability to discuss
- May lead to groupthink = a psychological phenomenon wherein groups of people experience a temporary loss of the ability to think in a rational, moral and realistic manner
- Petrification of attitudes and behaviours
- Is not conducive to learning something new
- Shapes opinions and attitudes of individuals and groups
- Positive collective reinforcement (support of information processes, development of interpersonal relations: support, help, etc.)
- Social constructivism assumes that the user perceives and understands reality subjectively, through its permanent interpretation and interaction with others
Epistemological/methodological filter bubbles
- domains (Hjørland)
- paradigms (Kuhn)
- research frameworks
- research traditions
as filtering entities.
How do they filter scholarly content and values? How does that influence researchers'/scholars'/students' information behaviour?
Epistemological/methodological filter bubbles
Negative aspects:
- circular reasoning
- a limited repertoire of research methods and techniques
- imposing the researcher's worldview on the subjects, not seeing the worlds of others
- unwitting acceptance of philosophical (axiological, epistemological, ontological) assumptions
Positive aspects:
- ability to extend existing knowledge
- a common "starting point"
- facilitated communication
- social acceptance/inclusion
The classic, JTB definition of knowledge:
A subject S knows that a proposition P is true if and
only if:
P is true, and
S believes that P is true, and
S is justified in believing that P is true
The problem resulting from
epistemic/filter bubbles is here.
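The three JTB conditions can be written compactly in standard epistemic-logic shorthand (our notation, not from the slides), which makes it easy to point at where the bubble problem sits:

```latex
K_S(P) \;\Longleftrightarrow\;
\underbrace{P}_{\text{truth}} \;\wedge\;
\underbrace{B_S(P)}_{\text{belief}} \;\wedge\;
\underbrace{J_S(P)}_{\text{justification}}
```

On this reading, filter/epistemic bubbles attack the third conjunct: a user inside an invisible bubble may even hold true beliefs, yet lack justification for them, because the evidence reaching them has been silently pre-selected (cf. Miller and Record, 2013, in the literature list).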
How to escape from academic/research filter bubbles?
- Critical self-reflection, metacognition
- Conferences "breaking out" of one's own scholarly field
- Thorough critical literature review or systematic review
Does anyone really have to burst their filter bubbles?
No and yes. But there are groups that certainly should:
- librarians and information specialists (professional responsibility)
- scholars/scientists/researchers (epistemic responsibility)
General advice
- Realize that filter/epistemic bubbles exist
- Develop critical thinking
- Develop information literacy (broadly understood)
A few pieces of advice for internet searching (1)
- Actively seek information rather than passively consume what algorithms have chosen for you.
- Benefit from the different search tools offered by Google (or other major search engines): Boolean operators, commands, phrase search, advanced search, etc.
- Employ various search engines/tools, databases, portals etc. and compare results.
A few pieces of advice for internet searching (2)
- Use search engines that do not track users and as a result do not personalise, e.g. DuckDuckGo, Qwant.
- Use software that helps to get out of your filter bubble, e.g. Escape Your Bubble (Chrome extension), FlipFeed (Twitter), Pop Your Bubble (Facebook).
- Remember there is the Deep Web.
Information behaviour concepts, models and questions that may be seen from the perspective of filter/epistemic bubbles:
- evaluation of information quality
- information barriers
- information needs
- information sharing
- perception of relevance
- principle of least effort
- stopping behaviour
- thoroughness of information seeking and filtering
- time spent on information seeking
Filter bubbles and various types of information behaviour
- Information acquisition
  - Active acquisition = seeking (searching, browsing, and monitoring)
  - Passive acquisition
  - Serendipity, chance encounters
  - When others share information with you
- Information evaluation and selection
- Avoiding or destroying information
- Personal and group information management
- Features of the contemporary information ecosystem: enormous chaos, information overload, manipulation and fake content, rapid changes
- Different strategies users (individuals and groups) apply to manage their dynamic, overwhelming and often uncertain information environment
- Offer of major content/search providers: filtering and personalising algorithms → filter bubbles
- Communities of practice, domain-specific behaviour, gatekeeping, good enough/satisficing, information horizons, paradigms, principle of least effort, sense-making, small worlds, stopping behaviour → epistemic bubbles
Concept map of potential research: filter bubbles and information behaviour
It is most probable that every human being has one information space, built up from multiple layers and many epistemic/filter/information bubbles.
Filter bubbles show distorted elements and not the context; they strengthen the sense of unreality and the misperception of everything that surrounds the users.
It is natural that, with the growth of information resources and the development of new technologies, technological, cultural and community systems will increasingly filter content/information.
The filter bubble approach may become fruitful in information behaviour research if the original, "algorithmic" concept is widened.
"Fish don't know they are in the water and people don't know they are in a filter bubble unless they take the effort to leave the capsule, if anyone dares." (FS Farnam Street Media Inc., 2018)
Selected literature (1)
Arfini, Selene; Bertolotti, Tommaso; Magnani, Lorenzo (2018). The diffusion of ignorance in on-line communities.
International Journal of Technoethics, Vol. 10, No. 1, p. 37-50.
Bakshy, Eytan; Messing, Solomon; Adamic, Lada A. (2015). Exposure to ideologically diverse news and opinion on Facebook,
Science, Vol. 348, No. 6239, p. 1130-1132.
Borgesius, Frederik J. Zuiderveen et al. (2016). Should we worry about filter bubbles? Internet Policy Review, Vol. 5,
Calero Valdez, Andre; Ziefle, Martina (2018). Human factors in the age of algorithms. Understanding the human-in-the-loop
using agent-based modeling. Lecture Notes in Computer Science, Vol. 10914 LNCS, p. 357-371.
Chatman, Elfrieda A. (1991). Life in a small world: applicability of gratification theory to information-seeking behavior. Journal
of the American Society for Information Science, Vol. 42, p. 438-449.
Erdelez, Sanda; Jahnke, Isa (2018). Personalized systems and illusion of serendipity: a sociotechnical lens.
Framework for information literacy for higher education (2016). Association of College and Research Libraries, American
Library Association.
FS Farnam Street Media Inc. (2018). How filter bubbles distort reality: everything you need to know.
Haim, Mario; Graefe, Andreas; Brosius, Hans-Bernd (2018). Burst of the filter bubble? Effects of personalization on the
diversity of Google News. Digital Journalism, Vol. 6, No. 3.
Holone, Harald (2016). The filter bubble and its effect on online personal health information. Croatian Medical Journal, Vol.
57, No. 3, p. 298-301.
Johnson-Laird, Philip; Goodwin, Geoffrey; Khemlani, Sangeet S. (2017). Mental models and Reasoning.
Selected literature (2)
Lazarus, Richard (1982). Thoughts on the relation between cognition and emotion. American Psychologist, Vol. 37, No. 9, p.
Miller, Boaz; Record, Isaac (2013). Justified belief in a digital age: on the epistemic implications of secret internet
technologies. Episteme, Vol. 10, No. 2, p. 117-134.
Nguyen, C. Thi (2018). Echo chambers and epistemic bubbles. Episteme, p. 1-21.
Nickerson, Raymond S. (1998). Confirmation bias: a ubiquitous phenomenon in many guises. Review of General Psychology,
Vol. 2, No. 2, p. 175-220.
Pariser, Eli (2015). Did Facebook’s big new study kill my filter bubble thesis?
Pariser, Eli (2011). The filter bubble: what the internet is hiding from you. New York: The Penguin Press.
Salehi, Sara; Du, Jia Tina; Ashman, Helen (2018). Use of Web search engines and personalisation in information searching for
educational purposes. Information Research, Vol. 23, No. 2, paper 788.
Sonnenwald, Diane H.; Iivonen, Mirja (1999). An integrated human information behavior research framework for information
studies. Library and Information Science Research, Vol. 21, No. 4, p. 429-457.
Spink, Amanda; Currier, James (2006). Towards an evolutionary perspective for human information behaviour. Journal of
Documentation, Vol. 62, No. 2, p. 171-193.
Sunstein, Cass R. (2001). Republic.com. Princeton: Princeton University Press.
Sunstein, Cass R. (2007). Republic.com 2.0. Princeton: Princeton University Press.
Tran, Theresa; Yerbury, Hilary (2015). New perspectives on personalised search results: expertise and institutionalisation.
Australian Academic and Research Libraries, Vol. 46, No. 4, p. 275-288.