‘I’ll Be Watching You’: A Social Cognitive Theory
Perspective of Citizens’ Relationship with Surveillance
Capitalism Practices
STEPHEN MCCARTHY, UNIVERSITY COLLEGE CORK, IRELAND.
WENDY ROWAN, UNIVERSITY COLLEGE CORK, IRELAND.
ZAINAB H. ALSHABEEB, UNIVERSITY COLLEGE CORK, IRELAND.
CAROLANNE MAHONY, UNIVERSITY COLLEGE CORK, IRELAND.
MICHAEL O’DRISCOLL, UNIVERSITY COLLEGE CORK, IRELAND.
1. INTRODUCTION
Modern IT organizations are increasingly leveraging data analytics to better understand the behavior
of diverse user groups online. This has been made possible by advances in machine learning, data
mining, and interactive visualizations for both recognizing patterns of user behavior and promoting
desired outcomes. There are several proposed benefits (i.e. ‘bright sides’) of such data analytic
practices for a range of stakeholders; these include the personalization of content through user
recommendation engines [1], new commercialization opportunities through targeted display
advertising [2], and improved customer retention across markets. For instance, the recommendation algorithms developed by Netflix have previously been valued at $1 billion per year, with 80% of streaming views attributed to these recommendations [3,4]. In particular, ‘Big Tech’ firms such as Facebook,
Apple, Amazon, and Google have been at the forefront of advances in data analytics, continuously
developing solutions that seek to deliver new forms of value [5].
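To make the mechanics of such personalization concrete, the sketch below illustrates item-based collaborative filtering, one common family of techniques behind recommendation engines. The rating matrix, function names, and values are hypothetical; this is a minimal illustration, not a description of any particular firm’s system.

```python
# Minimal sketch of item-based collaborative filtering, one common family
# of techniques behind recommendation engines. All data is hypothetical.
import numpy as np

# Rows are users, columns are items; 0 means "not yet rated/watched".
ratings = np.array([
    [5, 3, 0, 1],
    [4, 0, 0, 1],
    [1, 1, 0, 5],
    [0, 1, 5, 4],
], dtype=float)

def cosine_similarity(a, b):
    """Cosine similarity between two item rating vectors."""
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return a @ b / denom if denom else 0.0

def predict_score(user, item):
    """Predict a user's score for an unseen item as a similarity-weighted
    average over the items that user has already rated."""
    rated = np.nonzero(ratings[user])[0]
    sims = np.array([cosine_similarity(ratings[:, item], ratings[:, j])
                     for j in rated])
    if sims.sum() == 0:
        return 0.0
    return sims @ ratings[user, rated] / sims.sum()

# Recommend the unrated item with the highest predicted score for user 0.
unrated = np.nonzero(ratings[0] == 0)[0]
best = max(unrated, key=lambda j: predict_score(0, j))
print(f"Recommend item {best} to user 0 "
      f"(predicted score {predict_score(0, best):.2f})")
```

In a production setting this computation runs over millions of users and items, which is where the advances in machine learning and data mining noted above come into play.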
While customers have no doubt benefited from advanced data analytics in terms of convenience, other
stakeholders such as regulators, academics, and civil society have also pointed towards corresponding
‘dark sides’ of this technology [6,7]. Principal among these is the compromised data privacy of citizens under surveillance capitalism [8], whereby companies seek to maximize profit by collecting and analyzing large volumes of personal data to predict present and future user behavior. In this business model, data is continuously collected through users’ engagement with an online service and is then used to develop predictive products which are sold to business customers (e.g. advertisers). Despite the
introduction of regulatory mechanisms such as the GDPR in Europe, questions around the ethical use
of data analytics for this purpose remain. This also extends to the smart devices (e.g. digital assistants and smartphones) that data analytics relies on to capture large volumes of data. Lau et al.
[9] suggest that many non-users distrust smart speaker companies, citing privacy concerns around
data analytics and the ‘listening’ practices used for collecting information. In addition, concerns have
been raised around the market inequalities created by the gap between the advanced technologies at the disposal of Big Tech and those available to incumbent firms in other sectors.
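To illustrate in stylized form how behavioral data becomes a ‘predictive product’, the sketch below trains a simple classifier on hypothetical engagement features to estimate the probability of a future user action (here, an ad click). The features, synthetic data, and model choice are all assumptions made for illustration, not a description of any real firm’s pipeline.

```python
# Stylized illustration of a "predictive product": a classifier trained on
# behavioral signals to predict a future user action (an ad click).
# All features and data are hypothetical and synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical engagement features per user session:
# [minutes on site, pages viewed, past ad clicks]
X = rng.normal(loc=[5.0, 8.0, 0.5], scale=[2.0, 3.0, 0.5], size=(500, 3))
# Synthetic labels: heavier engagement makes a click more likely.
logits = 0.3 * X[:, 0] + 0.2 * X[:, 1] + 1.5 * X[:, 2] - 4.0
y = (rng.random(500) < 1 / (1 + np.exp(-logits))).astype(int)

model = LogisticRegression().fit(X, y)

# The "product" sold to advertisers is the predicted propensity,
# not the raw behavioral data itself.
new_session = np.array([[7.0, 12.0, 1.0]])
print(f"Predicted click probability: "
      f"{model.predict_proba(new_session)[0, 1]:.2f}")
```

The sketch makes the asymmetry visible: the user supplies the behavioral signal, while the prediction, and the revenue it generates, flows to the platform and its business customers.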
However, despite the pervasive impact these technologies have on citizens’ personal lives, the voice of citizens is often absent from conversations on the ethical use of data analytics and surveillance capitalism practices [9,10]. In this paper, we seek to gain insights into citizens’ perspectives on the data analytics practices utilized by Big Tech, directing attention towards the ‘dark sides’ associated
with these technologies. In doing so, we seek to answer the following research question: How do
citizens perceive the consequences of surveillance capitalism practices?
2. THEORETICAL BACKGROUND
Social Cognitive Theory (SCT) centers on the study of three interrelated areas: the individual, their
behavior, and the environment [11,12]. SCT teaches us that to understand our actions, we must view them through a triadic reciprocal model in which human action is a function of the interplay between
intrapersonal influences, individual behaviors, and environmental forces. No element can be taken in
isolation as each determines the other. It is a two-way regulatory system; the individual is both an
agent and object of control. Reinforcements and punishments are potentials in the environment, only
actualized by behavior patterns. Similarly, our behaviors can also influence the environment and
these determinants are intimately linked with our personal beliefs [13]. More recently, social cognitive theory has focused on human agency – the planning, intention, and execution of actions to influence
future events – where humans are agents of experience. Core features of human agency include self-
reactiveness, self-reflectiveness, and self-efficacy [14].
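Purely as a toy illustration of this triadic reciprocity (our own rendering, not part of Bandura’s formulation or the original study), the snippet below couples three state variables, person, behavior, and environment, so that each is updated as a function of the other two; the update weights are arbitrary.

```python
# Toy rendering of SCT's triadic reciprocal determinism: person (P),
# behavior (B), and environment (E) each evolve as a function of the
# other two. The weights are arbitrary and purely illustrative.
person, behavior, environment = 0.5, 0.2, 0.8

for step in range(5):
    # No element can be modeled in isolation: each determinant is
    # nudged toward a blend of the other two at every step.
    person = 0.6 * person + 0.2 * behavior + 0.2 * environment
    behavior = 0.2 * person + 0.6 * behavior + 0.2 * environment
    environment = 0.2 * person + 0.2 * behavior + 0.6 * environment
    print(f"t={step}: P={person:.2f}, B={behavior:.2f}, E={environment:.2f}")
```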
Self-reactiveness encompasses several subsidiary features such as self-monitoring and self-guided
performance standards [11,13]. Closely aligned to this is self-reflection on the adequacy of one’s thoughts and actions; self-reflection offers the opportunity to evaluate one’s motivation, values, and the meaning of pursuits. Moral agency involves consciously evaluating the rightness or wrongness of conduct against personal standards and situational/environmental conditions, and judging whether changes are required [11]. Lastly, SCT includes the self-efficacy mechanisms people use to understand their capability to control their agency. This mechanism centers on many decision-making factors, such as people’s perseverance when they face difficulties and their ability to cope with stress and taxing environmental demands [11,15].
3. RESEARCH DESIGN AND FINDINGS
To answer our research question, we draw on findings from focus groups involving over thirty
participants in Ireland. The focus groups (two synchronous four-hour sessions across two days) were conducted online due to national restrictions during the Covid-19 pandemic, and were
subdivided into seven sessions on dedicated topics, including “The Internet and Me”, “My Data, Your
Data, Our Data”, and “A Strong Digital Public Sphere”. Volunteers did not require expertise in the topic to register, and the final sample included representation across different demographics. Table 1 provides
a sample of findings based on concepts from SCT.
| Category | Finding | Illustrative Quotes |
| --- | --- | --- |
| Self-reactiveness | Awareness of privacy concerns arising from targeted advertising. | “I do not want to feel that as soon as I google something, that information is then held and passed on and pushed back at me in ways which I do not want. That's without even going into the more sinister uses.” |
| | | “I used to hear a lot of people years ago saying… ‘what does it matter if they know that I went to (get) another pizza’… and now people are really saying yes it actually does make a difference and it's completely invasive to your privacy. It's not okay to do this (more) than it'd be okay to… have an insurance guy or a car person walking into your home and having a look.” |
| Self-reflection | Trade-offs with values (e.g. conveniences and expectations of free/personalized content). | “Spotify know your taste in music or Netflix know what type of films or TV shows you like or YouTube so I suppose things like that are advantages and it's a trade-off because when you sign up to the service you kind of know that you're going to get more informed content, personalized for you. But to do that you have to give them access to your data.” |
| | The need for education to inform values. | “Young adults who don't understand the concept of (data) being not necessarily private… they're assuming something is private. They think they've done it in a very private manner and they're not necessarily quite clued into the full complexities of when it comes to putting anything online.” |
| Self-efficacy | Information overload designed to limit informed action. | “(Big Tech companies) are giving people too much information so that they can't find what they need. I mean have you ever tried to read the Google terms and conditions? It would take a lawyer 10 days or something to read through. And it's so much legalese, it's not in plain English.” |
| | Illusion of ability to control agency. | “There are so many data out there available so that it's basically impossible to control… you don't really (know) it's you, don't really know who's downloading it, and it's not possible to control it in any way.” |

Table 1: SCT Concepts and Sample Findings
4. CONCLUSION
Our proposed theoretical and practical contributions are as follows. First, we answer recent calls for research on citizens’ perceptions of data analytics and the ethical issues arising from the consequences of associated practices (e.g. profiling, privacy breaches, exclusion) [10,16,17]. More specifically, our
research aims to provide citizens with a platform to share their thoughts, feelings, and concerns
around the data analytics practices of Big Tech. This addresses prior concerns that the voice of
citizens is often missing from such conversations [9]. While surveillance capitalism practices were
originally designed to be undetectable [8], we find that citizens are to a degree cognizant of their
application. However, they also recognize trade-offs between privacy and convenience when using
digital platforms (i.e. ‘the customer is the product’). Our findings also point towards power imbalances
between citizens and Big Tech firms, such as the illusion of control and information overload to limit
informed action.
In terms of our second proposed contribution, we adapt SCT [11] to study citizens’ behavior in the
online environment, and their interactions with the surveillance capitalism practices of Big Tech. SCT
provides us with an understanding of the impact that the online environment and data analytics
practices have on citizens' behavior. We find that digital life can therefore influence and change
citizens’ individual behaviors, attitudes, and environment. However, users also have the agency to select between different moral motivations for action. Significantly, this includes impacts on the use of their data and their personal identity. Although Bandura’s ideas have been criticized for resting on informal theory rather than a computational approach [18], many of the tenets he expounded hold true today. Humans have evolved as a result of information
communication technologies, but the essential aspects of what it is to be human have not transformed: we still love, hate, and act intentionally and unintentionally, often without regard for the consequences of our actions.
REFERENCES
1. Yaman Kumar, Agniv Sharma, Abhigyan Khaund, Akash Kumar, Ponnurangam Kumaraguru,
Rajiv Ratn Shah, and Roger Zimmermann. 2018. IceBreaker: Solving Cold Start Problem for
Video Recommendation Engines. In 2018 IEEE International Symposium on Multimedia
(ISM), 217–222.
2. Hana Choi, Carl F Mela, Santiago R Balseiro, and Adam Leary. 2020. Online display
advertising markets: A literature review and future directions. Information Systems Research.
3. Nathan McAlone. 2016. Why Netflix thinks its personalized recommendation engine is worth
$1 billion per year. Business Insider. Retrieved from https://www.businessinsider.com/netflix-
recommendation-engine-worth-1-billion-per-year-2016-6?r=US&IR=T
4. Hind Benbya, Ning Nan, Hüseyin Tanriverdi, and Youngjin Yoo. 2020. Complexity and Information Systems Research in the Emerging Digital World. MIS Quarterly 44, 1: 1–18.
5. Rana Foroohar. 2019. Don’t be Evil: The Case Against Big Tech. Penguin UK.
6. Ofir Turel, Christian Matt, Manuel Trenz, Christy M K Cheung, John D’Arcy, Hamed Qahri-
Saremi, and Monideepa Tarafdar. 2019. Panel report: the dark side of the digitization of the
individual. Internet Research.
7. Monideepa Tarafdar, Ashish Gupta, and Ofir Turel. 2015. Special issue on ‘dark side of
information technology use’: an introduction and a framework for research. Information
Systems Journal 25, 3: 161–170.
8. Shoshana Zuboff. 2019. The Age of Surveillance Capitalism: The Fight for a Human Future at
the New Frontier of Power. Profile Books.
9. Josephine Lau, Benjamin Zimmerman, and Florian Schaub. 2018. Alexa, are you listening?
privacy perceptions, concerns and privacy-seeking behaviors with smart speakers. Proceedings
of the ACM on Human-Computer Interaction 2, CSCW: 1–31.
10. Ida Someh, Michael Davern, Christoph F Breidbach, and Graeme Shanks. 2019. Ethical issues
in big data analytics: A stakeholder perspective. Communications of the Association for
Information Systems 44, 1: 34.
11. Albert Bandura. 1997. Social cognitive theory of personality. Handbook of personality 50, 2:
154–196.
12. Albert Bandura. 1989. Human agency in social cognitive theory. American Psychologist 44, 9:
1175.
13. Albert Bandura. 1986. Social foundations of thought and action: A social-cognitive view.
Prentice-Hall Englewood Cliffs, NJ.
14. Albert Bandura. 2001. Social cognitive theory: An agentic perspective. Annual Review of Psychology 52, 1: 1–26.
15. Albert Bandura. 1991. Social cognitive theory of moral thought and action. Handbook of moral
behavior and development 1: 45–103.
16. Sue Newell and Marco Marabelli. 2015. Strategic opportunities (and challenges) of algorithmic
decision-making: A call for action on the long-term societal effects of ‘datification.’ The Journal
of Strategic Information Systems 24, 1: 3–14.
17. M Lynne Markus. 2015. New games, new rules, new scoreboards: the potential consequences of
big data. Journal of Information Technology 30, 1: 58–59.
18. William T Powers. 1978. Quantitative analysis of purposive systems: Some spadework at the
foundations of scientific psychology. Psychological Review 85, 5: 417–435.