The filter bubble: a perspective for
information behaviour research
Sabina Cisek
Monika Krakowska
Jagiellonian University in Kraków
1
Contents
•Why do we discuss the concept and phenomenon of the filter bubble?
•Filter bubble and information behaviour: the current state of
research
•The filter bubble – definition, genesis, characteristics and
widening of the concept
•The psychological dimension
•The socio-cultural dimension
•The epistemological/methodological dimension
•How to burst your filter/epistemic bubble?
•Conclusions and ideas
•Selected literature
2
Our main objectives
•To discuss the concept of the filter bubble and its various dimensions – psychological, socio-cultural, epistemological
•To show that this notion might be a fruitful methodological/theoretical framework for information behaviour research, to be used for:
–disclosing unknown aspects of information behaviour
–organizing/seeing – in a new way – what we already know about human information behaviour
3
We do not cover all possible aspects of filter bubble related issues.
We ask questions rather than provide answers.
WHY DO WE DISCUSS THE CONCEPT AND PHENOMENON OF THE FILTER BUBBLE?
4
•It concerns billions of people – it has not only cognitive/epistemic aspects but also moral, political and social ones.
•Fake news and manipulation
•Our own (bad) experiences with social media
•There is not much research on this problem – but we think it may be interesting and academically fruitful.
5
•1.47 billion daily active users on Facebook on average for June 2018
•2.23 billion monthly active users on Facebook as of June 30, 2018 (Facebook Stat, 2018)
It may be assumed that over 2 billion users of social media (Facebook) are exposed to personalised messages, manipulation and fake news.
6
https://wearesocial.com/blog/2018/01/global-digital-report-2018
Fake news
Did this boy really sleep between his parents' graves?
In fact, the photograph was part of Abdul Aziz Al-Otaibi's artistic project, and it went viral through many social media channels (Hooton, 2014).
This also shows the political dimension of filter bubbles.
8
FILTER BUBBLE AND INFORMATION
BEHAVIOUR: THE CURRENT STATE
OF RESEARCH
9
Scholarly publications on filter bubbles [1]
•Google Scholar – about 5430 results
•Scopus (in Title-Abstract-Keywords) – 21 results, from years
2012-2018
•Web of Science (all databases) in the Topic category – 17
results, from years 2011-2018
•Wiley Online Library – 40 results, from years 2011-2018
•LISTA Library, Information Science and Technology Abstracts –
13 results, from years 2011-2018
10
Query „filter bubble” AND internet, September 2018
Scholarly publications on filter bubbles [2]
•Google Scholar – about 135 results
•Scopus (in Title-Abstract-Keywords) – 0 results
•Web of Science (all databases) in the Topic category – 0
results
•Wiley Online Library – 2 results, from years 2012-2018
•LISTA Library, Information Science and Technology
Abstracts – 1 result (Tran, Yerbury, 2015)
11
Query „filter bubble” AND („information behavior”
OR „information behaviour”), September 2018
THE FILTER BUBBLE – DEFINITION,
GENESIS, CHARACTERISTICS AND
WIDENING OF THE CONCEPT
12
The concept and theory of the filter bubble and related ideas/notions have to be carefully analysed to become useful for information behaviour research, because – in popular conversation – they are frequently approached emotionally, without deeper scholarly reflection, especially in the political context.
13
14
Filter bubbles and related notions: echo chambers, cognitive invisibility, selective exposure, partial information blindness, the Daily Me, information cocoons, bias.
What is a filter bubble – in its narrow,
„algorithmic” sense? [1]
•The term „filter bubble” was coined
by Eli Pariser in his book The filter
bubble: what the internet is hiding
from you (2011), cited 2886 times
(Google Scholar)
•But – the problem itself had been
discussed earlier (e.g. Sunstein,
2001; 2007)
15
What is a filter bubble – in its narrow,
„algorithmic” sense? [2]
„A filter bubble is the intellectual isolation that can occur when websites make
use of algorithms to selectively assume the information a user would want to
see, and then give information to the user according to this assumption.
Websites make these assumptions based on the information related to the user,
such as former click behavior, browsing history, search history and location. For
that reason, the websites are more likely to present only information that will
abide by the user's past activity.
A filter bubble, therefore, can cause users to get significantly less contact with
contradicting viewpoints, causing the user to become intellectually isolated.
Personalized search results from Google and personalized news stream from
Facebook are two perfect examples of this phenomenon.”
https://www.techopedia.com/definition/28556/filter-bubble
16
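To make the mechanism in this definition concrete, here is a minimal, hypothetical sketch in Python (ours, not any platform's actual code): candidate items whose topics match the user's past clicks are scored higher, so content outside those topics rarely reaches the top of the feed.

```python
from collections import Counter

def personalised_feed(candidates, click_history, k=5):
    """Toy personalisation: rank candidate items by how often their topic
    appears in the user's click history and return the top k.
    Topics the user has never clicked on sink to the bottom of the feed."""
    topic_counts = Counter(item["topic"] for item in click_history)
    ranked = sorted(candidates,
                    key=lambda item: topic_counts[item["topic"]],
                    reverse=True)
    return ranked[:k]

# A user who has mostly clicked on sport is shown mostly sport.
history = [{"topic": "sport"}] * 8 + [{"topic": "politics"}] * 2
candidates = [{"id": i, "topic": t} for i, t in
              enumerate(["sport", "politics", "science", "sport", "culture"])]
print(personalised_feed(candidates, history, k=3))
```

Real systems use far richer signals (click behaviour, browsing and search history, location), but the narrowing logic is the same.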
17
How are filter bubbles created?
Features of the contemporary information ecosystem: enormous chaos, information overload, manipulation and fake content, rapid changes.
•Websites (search engines, social media) need satisfied clients.
•Users want „nice”, easy-to-get content – the principle of least effort.
•Relevance → filtering algorithms → personalisation of content.
•Reinforcement: a feedback loop between users and the algorithm – see the toy simulation below.
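The reinforcement in the last point can be illustrated with a small, purely hypothetical simulation (a toy model, not any real platform's algorithm): each round the feed is drawn according to current topic weights, clicked topics are up-weighted, and the variety of topics shown gradually shrinks.

```python
import random

TOPICS = ["politics", "sport", "science", "culture"]

def simulate_feedback_loop(rounds=15, feed_size=10, seed=1):
    """Toy user-algorithm feedback loop: the user clicks 'sport' more often,
    clicks increase that topic's weight, and later feeds grow less diverse."""
    random.seed(seed)
    weights = {t: 1.0 for t in TOPICS}
    for r in range(1, rounds + 1):
        feed = random.choices(TOPICS, weights=[weights[t] for t in TOPICS],
                              k=feed_size)
        clicked = [t for t in feed
                   if random.random() < (0.8 if t == "sport" else 0.2)]
        for t in clicked:                  # reinforcement step
            weights[t] += 1.0
        print(f"round {r:2d}: distinct topics shown = {len(set(feed))}")

simulate_feedback_loop()
```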
The three main questions
1) Do filter bubbles actually exist? If yes – are
they important? Do personalising algorithms
really have deleterious effects?
2) The prevailing opinion is that filter bubbles
are bad. But – why is getting personalised,
tailored information considered wrong?
3) Is it a completely new problem/situation?
18
1) Voices against the existence/importance of filter bubbles
or harmful effects of personalisation by algorithms
19
„Within the population under study here,
individual choices (…) more than algorithms (…)
limit exposure to attitude-challenging content in
the context of Facebook. Despite the differences in
what individuals consume across ideological lines,
our work suggests that individuals are exposed to
more cross-cutting discourse in social media (…)”
(Bakshy, Messing, Adamic, 2015).
„We distinguish between self
selected personalisation, where
people actively choose which
content they see, and pre-
selected personalisation, where
algorithms personalise content
for users without any deliberate
user choice. (…) We conclude
that – in spite of the serious
concerns voiced – at present,
there is no empirical evidence
that warrants any strong worries
about filter bubbles” (Borgesius
et al. 2016).
„We conducted two exploratory studies to test the
effect of both implicit and explicit personalization
on the content and source diversity of Google
News. Except for small effects of implicit
personalization on content diversity, we found no
support for the filter-bubble hypothesis” (Haim,
Graefe, Brosius, 2018).
2) Why are filter bubbles (as „products” of
personalisation) harmful?
•Algorithms are content/information gatekeepers – censors. They hinder:
–access to content – data, documents, news, resources,
–awareness that there are different/other opinions, worldviews on known issues,
–awareness of the mere existence of some issues, problems, questions.
•Personalisation is done by algorithms, not by experts. Algorithms do not operate on the basis of ethical principles.
•Filter bubbles are invisible (implicit personalisation). Users do not know that
information they get is personalised. They may assume it is complete and
neutral/objective.
•Filter bubbles are involuntary.
•Filter bubbles contribute to the creation of echo chambers (and those cause political and social problems).
20
3) Is this a completely new situation?
•People have always experienced filter/epistemic bubbles, and there have always been information gatekeepers – families (parents!), political powers, religions, social groups, etc. And – in many cases – those bubbles have been invisible and involuntary.
•In addition – we need to filter/select content – to
get relevant, reliable, useful information.
21
Widening of the filter bubble concept [1]
•Three „layers” of filter bubbles – users lack
knowledge about
–the existence, way of working, or effects of filtering
algorithms and personalisation (Facebook, Google,
Twitter),
–more advanced search techniques within major
websites,
–the mere existence of (1) other search services and (2)
the Deep Web.
22
Widening of the filter bubble concept [2]
•Filter bubbles and epistemic bubbles
•Information behaviour/seeking is not only technology-driven, nor affected solely by cognitive factors. It also has biological, political, psychological, socio-cultural, and time-space determinants.
23
24
•Filter bubbles (Pariser, 2011):
–cognitive/epistemic content filtered by the websites' ways of working, in particular by personalising algorithms
–attitudes, emotions, people and social relations filtered by the websites' ways of working, in particular by personalising algorithms
•Epistemic bubbles (Nguyen, 2018):
–cognitive/epistemic content filtered by biology, culture and history (authorities, paradigms, social groups, traditions)
THE PSYCHOLOGICAL DIMENSION
25
26
•In biology, psychology and the cognitive sciences, all human reactions to the outside world refer to the evolutionary mechanism of coping with and adapting to ecosystems.
•This mechanism is a combination of the endocrine system, the nervous system and an individual phenotype, which also affects the construction of the so-called emotional brain (LeDoux's theory).
•The formation of filter/information bubbles may be related to the emotional interactions of internal cognitive structures with the real world = a relationship with mental models (Craik's concept).
•Emotions and motivations are incentives, but also regulators of mental processes; they influence the assessment of an event, situation, context or object and thus imply (or not) favourable, indifferent or unfavourable behaviour (the Lazarus and James-Lange theories of emotions).
•Codification of aspirations to regulate and control reactions and actions in accordance with human intention = filter bubbles deepen the effect of self-satisfaction.
•The filter/information bubble as a reaction to excessive psychological costs, the need for self-protection and low self-efficacy.
27
[Diagram] Evolutionary information behaviour model: information channels, mastery of life, the role of optimal foraging, ELIS (everyday life information seeking).
Negative aspects
•Creating a misleading and erroneous image of reality and an individual mental model = closure in a limited, hermetic circle of information, opinions, views and worldviews, limiting the acquisition of knowledge
•Confirmation bias and cognitive bias formation
•Promoting intellectual and emotional laziness = the bubble does not expose users to cognitive dissonance or to uncertainty they cannot tolerate
•Not developing collective emotions = it is easier to experience the safe and often positive affective emotions of „being in the same boat”
28
Positive aspects
•The filter bubble
–aims to hedge against information chaos and overload (a sense of safety)
–has an impact on emotional well-being and the reduction of excessive psychological costs by constructing a subjective information space
29
According to social psychology, positive cognitive stimulation of users, an affirmative mood and beneficial affects increase the expressiveness of cognitive and informational activities and processes, and foster creativity, understanding of relationships, prioritisation of tasks, individual involvement and the individual's information behaviour.
THE SOCIO-CULTURAL DIMENSION
30
31
THE SOCIO-CULTURAL AND RELATIONSHIP FILTER BUBBLE
•Echo chambers = diffused or homogeneous user groups duplicate and strengthen beliefs and knowledge within the group; there is a reluctance to consider alternatives to preferred views (tunnel vision)
•„The strength of weak ties” theory (Granovetter) in information diffusion
•Group polarization – the tendency to form very extreme positions and to adopt extreme attitudes in the group
•The concept of persuasive argumentation – users seek support and argumentation for the opinions they already hold, which causes polarization; the concept of social comparisons and the adaptation of individual views to the extreme; image building
•Can hinder rather than facilitate „adding diversity to our lives”
•Small worlds theory and life in the round theory (Chatman) = striving for participation in normative collectives due to their attractiveness and benefits, and the intensification of beliefs
32
THE SOCIO-CULTURAL FILTER BUBBLE
NEGATIVE ASPECTS
•Can intensify the fear of rejection
•Loss of the ability to discuss
•May lead to groupthink = a psychological phenomenon wherein groups of people experience a temporary loss of the ability to think in a rational, moral and realistic manner
•Petrification of attitudes and behaviours
•Is not conducive to learning something new
•Shapes opinions and attitudes of individuals and groups
POSITIVE ASPECTS
•Positive collective reinforcement (support of information processes, development of interpersonal relations – support, help, etc.)
•Social constructivism assumes that the user perceives and understands reality subjectively, through its permanent interpretation and interaction with others
THE EPISTEMOLOGICAL /
METHODOLOGICAL DIMENSION
33
•Epistemological/methodological filter bubbles
–domains (Hjørland)
–paradigms (Kuhn)
–research frameworks
–research traditions
as filtering entities.
•Questions:
–How do they filter scholarly content and values?
–How does that influence researchers/scholars/students’
information behaviour?
34
Epistemological/methodological filter bubbles
POSSIBLE PROBLEMS
•Circular reasoning
•A limited repertoire of
research methods and
questions
•Imposing the researcher’s
worldview on the subjects, not
seeing worlds of others
•Unwitting acceptance of
philosophical (axiological,
epistemological, ontological)
assumptions
POSITIVE ASPECTS
•Ability to extend existing
knowledge
•Common „starting point”
•Facilitated communication
•Social acceptance/inclusion
35
Justificatory status of beliefs/claims
The classic JTB (justified true belief) definition of knowledge:
A subject S knows that a proposition P is true if and only if:
•P is true, and
•S believes that P is true, and
•S is justified in believing that P is true.
The problem resulting from epistemic/filter bubbles lies here, in the justification condition.
36
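The same definition in compact form (a standard restatement, not taken from the slides, with K, B and J read as "knows", "believes" and "is justified in believing"):

```latex
% JTB: S knows that P iff P is true, S believes P, and S is justified in believing P
\[
  K_S(P) \;\Longleftrightarrow\; P \,\land\, B_S(P) \,\land\, J_S(P)
\]
```

Filter/epistemic bubbles bear on the third conjunct, J_S(P): whether the information environment still supplies adequate justification for what users come to believe.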
How to escape from academic/research
filter bubbles?
•Critical self-reflection and metacognition, and:
•Conferences – „breaking out” of one's own scholarly milieu
•Interdisciplinarity
•A thorough critical literature review or a systematic review
37
HOW TO BURST YOUR
FILTER/EPISTEMIC BUBBLE?
38
Does anyone really have to burst
their filter bubbles?
•No and yes
•But there are groups that certainly should:
–librarians and information specialists –
professional responsibility
–scholars/scientists/researchers – epistemic responsibility
39
General advice
•Realize that filter/epistemic bubbles exist
•Develop critical thinking
•Develop information literacy (broadly
understood)
40
A few pieces of advice for internet
searching (1)
•Actively seek information – rather than passively consuming what algorithms have chosen for you
•Benefit from the different search tools offered by Google (or other major search engines) – Boolean operators, commands, phrase search, advanced search, etc. (see the example queries below)
•Employ various search engines/tools, databases, portals, etc. and compare the results
41
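For illustration, a few example queries of the kind meant above (our own illustrative examples, not from the slides; exact operator support varies by search engine and database):

```
"filter bubble" AND ("information behaviour" OR "information behavior")   <- Boolean operators + phrase search (database syntax)
"filter bubble" site:informationr.net                                     <- Google: limit results to one website
"filter bubble" filetype:pdf                                              <- Google: restrict results to PDF documents
"filter bubble" -facebook                                                 <- Google: exclude a term
```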
A few pieces of advice for internet
searching (2)
•Use search engines that do not track users and – as a result – do not personalise, e.g. DuckDuckGo, Qwant, StartPage
•Use software that helps you get out of your filter bubble, e.g. Escape Your Bubble (Chrome extension), FlipFeed (Twitter), Pop Your Bubble (Facebook)
and
•Remember there is the Deep Web
42
CONCLUSIONS AND IDEAS
43
Information behaviour concepts, models and
questions that may be seen from the perspective of
filter/epistemic bubbles
•evaluation of information quality
•information barriers
•information needs
•information sharing
•perception of relevance
•principle of least effort
•stopping behavior
•thoroughness of information seeking and filtering
•time spent on information seeking
44
Filter bubbles and various types of
information behaviour
•Information acquisition
–Active acquisition = seeking (searching, browsing, and
monitoring)
–Passive acquisition
–Serendipity, chance encounters
–When others share information with you
•Information evaluation and selection
•Avoiding or destroying information
•Personal and group information management
45
46
Concept map of potential research: filter bubbles and information behaviour
•Features of the contemporary information ecosystem: enormous chaos, information overload, manipulation and fake content, rapid changes
•Offer of major content/search providers: filtering and personalizing algorithms → filter bubbles
•Different strategies users (individuals and groups) apply to manage their dynamic, overwhelming and often uncertain information environment
•Communities of practice, domain-specific behaviour, gatekeeping, good enough/satisficing, information horizons, paradigms, principle of least effort, sense-making, small worlds, stopping behaviour → epistemic bubbles
•It is most probable that every human being has one
information space, built up from multiple layers and many
epistemic/filter/information bubbles.
•Filter bubbles show distorted elements without their context; they strengthen a sense of unreality and the misperception of everything that surrounds the users.
•It is natural that, with the growth of information resources and the development of new technologies, technological, cultural and community systems will increasingly filter content/information.
•The filter bubble approach may become fruitful in
information behaviour research – if the original,
„algorithmic” concept is widened.
47
48
Fish don’t know they are in the
water and people don’t know
they are in a filter bubble unless
they take the effort to leave the
capsule — if anyone dare.
(FS Farnam Street Media Inc., 2018)
Selected literature (1)
•Arfini, Selene; Bertolotti, Tommaso; Magnani, Lorenzo (2018). The diffusion of ignorance in on-line communities.
International Journal of Technoethics, Vol. 10, No. 1, p. 37-50.
•Bakshy, Eytan; Messing, Solomon; Adamic, Lada A. (2015). Exposure to ideologically diverse news and opinion on Facebook,
Science, Vol. 348, No. 6239, p. 1130-1132.
•Borgesius, Frederik J. Zuiderveen et al. (2016). Should we worry about filter bubbles? Internet Policy Review, Vol. 5,
https://doi.org/10.14763/2016.1.401
•Calero Valdez, Andre; Ziefle, Martina (2018). Human factors in the age of algorithms. Understanding the human-in-the-loop
using agent-based modeling. Lecture Notes in Computer Science, Vol. 10914 LNCS, p. 357-371.
•Chatman, Elfrieda A. (1991). Life in a small world: applicability of gratification theory to information-seeking behavior. Journal
of the American Society for Information Science, Vol. 42, p. 438-449.
•Erdelez, Sanda; Jahnke, Isa (2018). Personalized systems and illusion of serendipity: a sociotechnical lens.
https://wepir.adaptcentre.ie/papers/WEPIR_2018_paper_6.pdf
•Framework for information literacy for higher education (2016). Association of College and Research Libraries, American
Library Association. http://www.ala.org/acrl/standards/ilframework
•FS Farnam Street Media Inc. (2018). How filter bubbles distort reality: everything you need to know.
https://fs.blog/2017/07/filter-bubbles/
•Haim, Mario; Graefe, Andreas; Brosius, Hans-Bernd (2018). Burst of the filter bubble? Effects of personalization on the
diversity of Google News. Digital Journalism, Vol. 6, No. 3. https://doi.org/10.1080/21670811.2017.1338145
•Holone, Harald (2016). The filter bubble and its effect on online personal health information. Croatian Medical Journal, Vol.
57, No. 3, p. 298-301. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4937233/
•Johnson-Laird, Philip; Goodwin, Geoffrey; Khemlani, Sangeet S. (2017). Mental models and reasoning.
http://mentalmodels.princeton.edu/papers/2017MMs&reasoning.pdf
49
Selected literature (2)
•Lazarus, Richard (1982). Thoughts on the relation between cognition and emotion. American Psychologist, Vol. 37, No. 9, p.
1019-1024.
•Miller, Boaz; Record, Isaac (2013). Justified belief in a digital age: on the epistemic implications of secret internet
technologies. Episteme, Vol. 10, No. 2, p. 117-134.
•Nguyen, C. Thi (2018). Echo chambers and epistemic bubbles. Episteme, p. 1-21. https://doi.org/10.1017/epi.2018.32
•Nickerson, Raymond S. (1998). Confirmation bias: a ubiquitous phenomenon in many guises. Review of General Psychology,
Vol. 2, No. 2, p. 175-220.
https://www.researchgate.net/publication/280685490_Confirmation_Bias_A_Ubiquitous_Phenomenon_in_Many_Guises
•Pariser, Eli (2015). Did Facebook’s big new study kill my filter bubble thesis? https://www.wired.com/2015/05/did-facebooks-
big-study-kill-my-filter-bubble-thesis/
•Pariser, Eli (2011). The filter bubble: what the internet is hiding from you. New York: The Penguin Press.
•Salehi, Sara; Du, Jia Tina; Ashman, Helen (2018). Use of Web search engines and personalisation in information searching for
educational purposes. Information Research, Vol. 23, No. 2, paper 788. http://www.informationr.net/ir/23-2/paper788.html
•Sonnenwald, Diane H.; Iivonen, Mirja (1999). An integrated human information behavior research framework for information
studies. Library and Information Science Research, Vol. 21, No. 4, p. 429-457.
•Spink, Amanda; Currier, James (2006). Towards an evolutionary perspective for human information behaviour. Journal of
Documentation, Vol. 62, No. 2, p. 171-193.
•Sunstein, Cass R. (2001). Republic.com. Princeton: Princeton University Press.
•Sunstein, Cass R. (2007). Republic.com 2.0. Princeton: Princeton University Press.
•The filter bubble. https://dontbubble.me/
•Tran, Theresa; Yerbury, Hilary (2015). New perspectives on personalised search results: expertise and institutionalisation.
Australian Academic and Research Libraries, Vol. 46, No. 4, p. 275-288.
50