
Empirical Research Methods in Usable Privacy and Security

Authors: Verena Distler, Matthias Fassl, Hana Habib, Katharina Krombholz, Gabriele Lenzini, Carine Lallemand, Vincent Koenig, and Lorrie Faith Cranor (CISPA Helmholtz Center for Information Security, and others)

Abstract

A variety of methods and techniques are used in usable privacy and security (UPS) to study users’ experiences and behaviors. When applying empirical methods, researchers in UPS face specific challenges, for instance, how to represent risk to research participants. This chapter provides an overview of the empirical research methods used in UPS and highlights associated opportunities and challenges. It also draws attention to important ethical considerations in UPS research with human participants and highlights possible biases in study design.
Human Factors in Privacy Research
Nina Gerber · Alina Stöver · Karola Marky (Editors)
Editors
Nina Gerber
Institute for Psychology
Technical University of Darmstadt
Darmstadt, Germany
Karola Marky
Institute for IT Security
Gottfried Wilhelm Leibniz University
Hannover, Germany
Alina Stöver
Institute for Psychology
Technical University of Darmstadt
Darmstadt, Germany
This work was supported by GRK2050 Privacy and Trust for Mobile Users
ISBN 978-3-031-28642-1 ISBN 978-3-031-28643-8 (eBook)
https://doi.org/10.1007/978-3-031-28643-8
© The Editor(s) (if applicable) and The Author(s) 2023. This book is an open access publication.
Open Access This book is licensed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license and indicate if changes were made.
The images or other third party material in this book are included in the book’s Creative Commons
license, unless indicated otherwise in a credit line to the material. If material is not included in the book’s
Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the
permitted use, you will need to obtain permission directly from the copyright holder.
The use of general descriptive names, registered names, trademarks, service marks, etc. in this publication
does not imply, even in the absence of a specific statement, that such names are exempt from the relevant
protective laws and regulations and therefore free for general use.
The publisher, the authors, and the editors are safe to assume that the advice and information in this book
are believed to be true and accurate at the date of publication. Neither the publisher nor the authors or
the editors give a warranty, expressed or implied, with respect to the material contained herein or for any
errors or omissions that may have been made. The publisher remains neutral with regard to jurisdictional
claims in published maps and institutional affiliations.
This Springer imprint is published by the registered company Springer Nature Switzerland AG
The registered company address is: Gewerbestrasse 11, 6330 Cham, Switzerland
Foreword
Every day, there is some new kind of privacy incident reported in the media. It might be a data breach, a smartphone app or new device that is collecting too much information, or a scandal about how personal data is being abused. The news articles about privacy are legion, ranging from the large scale, like the Cambridge Analytica scandal and the Equifax data breach, to the small scale, like people stalking their ex-partners using smart home technologies or a priest outed as gay after Grindr sold its users’ location data.
The thing is, it doesn’t have to be this way.
The good news is that things are starting to move in the right direction, slowly but surely. There are new laws that govern how companies and large organizations
must handle data. There are new kinds of technologies, tools, standards, and
guidelines for helping developers create privacy-sensitive apps. Lastly, and the focus
of this book, there are new kinds of human-centered methods and empirical results
to help researchers and practitioners design better user interfaces and systems.
This book is a treasure trove for researchers and practitioners interested in usable
privacy. If you are interested in designing and building interactive systems that
everyday people can use and would want to use, or want to know best practices
in evaluating these kinds of systems in an ethical manner, this book is for you.
From a theoretical perspective, this book offers a foundation about theories, both
philosophical and behavioral, that can help explain people’s attitudes and behaviors
towards privacy. For example, this book touches on Nissenbaum’s conceptualization
of contextual integrity, as well as how the Technology Acceptance Model might
influence people’s willingness to adopt new privacy enhancing technologies.
From a more pragmatic perspective, this book also offers a number of tools to
help with practical concerns, ranging from survey scales to assess people’s privacy
concerns to human-centered design processes, from designing effective privacy
notices to applying nudges to influence people’s behaviors. These chapters contain
especially useful overviews of a wide range of topics related to privacy, regardless
of whether your background is in computer science, psychology, or design.
This book also offers something unique for researchers and practitioners, namely a deep discussion of the challenges that corporations face in complying with laws and the conflicts they encounter when rolling out privacy measures.
There are some books you skim over and then put away, never to be looked at again. There are other books you keep on your shelf just to look smart (yes, admit it, you do it too). And then there are books like this one, with so much useful information that you will keep coming back to them time and time again.
October 2022 Jason Hong
Acknowledgements
This work has been co-funded by the Deutsche Forschungsgemeinschaft (DFG,
German Research Foundation, grant number 251805230/GRK 2050) and by the
German Federal Ministry of Education and Research and the Hessen State Ministry
for Higher Education, Research, Science and the Arts within their joint support of
the National Research Center for Applied Cybersecurity ATHENE.
We would also like to thank everyone who contributed to the creation of this
book—first and foremost, of course, the great authors of the chapters including
Jason Hong for contributing the foreword, as well as the thorough reviewers who
provided valuable feedback for the chapters, the coordination team of the RTG
“Privacy and Trust for Mobile Users” for their efforts to make funding for the
book possible, the Springer Nature team for their support, and last but not least
our families and friends, who ensured that we (and our children) did not starve during the final crunch.
About This Book
This book tackles the topic of human factors in privacy research from four different angles: theoretical, methodological, with reference to various application areas, and solution-oriented, by considering approaches for supporting user privacy.
Theory We start the book with the theoretical foundations of usable privacy
research. For this purpose, the chapter “Data Collection Is Not Mostly Harmless:
An Introduction to Privacy Theories and Basics” gives an introduction to the
most important concepts of the privacy topic. Subsequently, the chapter “From the
Privacy Calculus to Crossing the Rubicon: An Introduction to Theoretical Models of
User Privacy Behavior” shows how theoretical behavioral models from psychology, health research, or economics can be used to explain human privacy behavior.
Methodology After that, we approach the topic of usable privacy research from a
methodological point of view. First, the chapter “Empirical Research Methods in
Usable Privacy and Security” gives an overview of the different research methods
that are commonly used in usable privacy research. Furthermore, an introduction
to ethical considerations is given. Then, the chapter “Toward Valid and Reliable
Privacy Concern Scales: The Example of IUIPC-8” takes a closer look at the
quantitative approach, using the IUIPC-8 as an example to describe how questionnaires can be examined for their validity and reliability as quantitative measurement
instruments. This is followed by a consideration of the more qualitative approach
in the chapter “Achieving Usable Security and Privacy Through Human-Centered
Design” by describing how research and design in the privacy context can be
conducted using methods of human-centered design. Here, approaches such as
mental models, user requirements, user group profiles, and personas are presented
as examples. Then, in the chapter “What HCI Can Do for (Data Protection) Law—
Beyond Design”, a bridge is built between law and HCI and the authors discuss
how both can be combined in order to do truly user-centered, law-compliant privacy
research. In the chapter “Expert Opinions as a Method of Validating Ideas: Applied
to Making GDPR Usable”, this combination is then presented through a case study
by describing the implementation and results of a study that uses expert interviews
from the legal and HCI sciences, among others, to investigate how requirements
of the General Data Protection Regulation (GDPR) can be implemented in a user-friendly way.
Application Areas Subsequently, we consider different application areas of privacy research. Here, we start in the chapter “Privacy Nudges and Informed Consent?
Challenges for Privacy Nudge Design” with the question to what extent nudges can
be used to help users to make better informed decisions when handling their private
data. The discussion is complemented by reflections on how a general use of nudges
should be designed from an ethical point of view. The chapter “The Hows and
Whys of Dark Patterns: Categorizations and Privacy” then discusses the use case
of dark patterns, in which the psychological principles used in nudging for positive
purposes are used to trick users into disclosing more of their data than initially
intended. The chapter “‘They see me scrollin’—Lessons Learned From Investigating Shoulder Surfing Behavior and Attack Mitigation Strategies” discusses the specific application of shoulder surfing studies, how Virtual Reality (VR) can be used as a study methodology, and which ethical aspects should be considered.
The chapter “Privacy Research on the Pulse of Time: COVID-19 Contact-Tracing
Apps” gives an overview of the current research on contact tracing apps, which
rapidly gained relevance in the research field of usable privacy during the COVID-
19 pandemic starting in 2020. Here, the issue of using different measurement tools based on different privacy concepts is exemplified, thus making the connection to the methodological foundations discussed in the chapter “Toward Valid and Reliable Privacy Concern Scales: The Example of IUIPC-8”. Finally,
the chapter “Privacy Perception and Behavior in Safety-Critical Environments”
presents various case studies investigating privacy perceptions and behaviors within
safety-critical environments and elaborates on the relationship between security and
privacy behavior.
Solutions In the last part of the book, we look at various approaches that
are intended to support users in finding a meaningful, self-determined way of
dealing with their private data. For this, we first turn to the concern of obtaining
user consent for data collection and processing in a legally compliant and user-
friendly way. Here, the chapter “Generic Consents in Digital Ecosystems: Legal,
Psychological, and Technical Perspectives” discusses the extent to which users
could generically give their consent and the legal principles and challenges that
need to be considered in this regard. The chapter “Human-Centered Design for Data-Sparse Tailored Privacy Information Provision” then describes a possible solution in which transparency for users is increased through context-sensitive, tailored privacy information provision. It thereby discusses transparency-enhancing technologies (TETs), a subcategory of privacy-enhancing technologies (PETs). PETs are the subject of the following chapter, “Acceptance Factors of Privacy-Enhancing Technologies on the Basis of Tor and JonDonym”, which presents empirical results on the factors that influence users’ acceptance of such PETs, using Tor and JonDonym as examples. The chapter “Increasing Users’ Privacy Awareness in the Internet of Things:
Design Space and Sample Scenarios” spans the design space for privacy solutions
aimed at increasing user awareness in Internet of Things (IoT) environments about
the presence of sensors, such as cameras, and presents PriView, a concrete solution
for this.
Finally, we turn to the enterprise context, with the chapter “Challenges, Conflicts,
and Solution Strategies for the Introduction of Corporate Data Protection Measures”
discussing how data protection measures can be introduced in companies and what
social aspects need to be considered in this process. The chapter “Data Cart: A
Privacy Pattern for Personal Data Management in Organizations” then presents
data cart, an example of a concrete solution for the corporate context, which
enables data protection-compliant processing of personal data in companies and
was developed according to the principles of human-centered design, as described
in the chapter “Achieving Usable Security and Privacy Through Human-Centered
Design”.
Contents
Part I Theory
Data Collection Is Not Mostly Harmless: An Introduction to
Privacy Theories and Basics .................................................... 3
Karola Marky
From the Privacy Calculus to Crossing the Rubicon: An
Introduction to Theoretical Models of User Privacy Behavior.............. 11
Nina Gerber and Alina Stöver
Part II Methodology
Empirical Research Methods in Usable Privacy and Security .............. 29
Verena Distler, Matthias Fassl, Hana Habib, Katharina Krombholz,
Gabriele Lenzini, Carine Lallemand, Vincent Koenig,
and Lorrie Faith Cranor
Toward Valid and Reliable Privacy Concern Scales: The Example
of IUIPC-8 ........................................................................ 55
Thomas Groß
Achieving Usable Security and Privacy Through
Human-Centered Design ........................................................ 83
Eduard C. Groen, Denis Feth, Svenja Polst, Jan Tolsdorf,
Stephan Wiefling, Luigi Lo Iacono, and Hartmut Schmitt
What HCI Can Do for (Data Protection) Law—Beyond Design............ 115
Timo Jakobi and Maximilian von Grafenstein
Expert Opinions as a Method of Validating Ideas: Applied to
Making GDPR Usable ........................................................... 137
Johanna Johansen and Simone Fischer-Hübner
Part III Application Areas
Privacy Nudges and Informed Consent? Challenges for Privacy
Nudge Design ..................................................................... 155
Verena Zimmermann
The Hows and Whys of Dark Patterns: Categorizations and Privacy ..... 173
Agnieszka Kitkowska
“They see me scrollin”—Lessons Learned from Investigating
Shoulder Surfing Behavior and Attack Mitigation Strategies .............. 199
Alia Saad, Jonathan Liebers, Stefan Schneegass, and Uwe Gruenefeld
Privacy Research on the Pulse of Time: COVID-19
Contact-Tracing Apps ........................................................... 219
Eva Gerlitz and Maximilian Häring
Privacy Perception and Behavior in Safety-Critical Environments ........ 237
Enno Steinbrink, Tom Biselli, Sebastian Linsner, Franziska Herbert,
and Christian Reuter
Part IV Solutions
Generic Consents in Digital Ecosystems: Legal, Psychological,
and Technical Perspectives ...................................................... 255
Bianca Steffes, Simone Salemi, Denis Feth, and Eduard C. Groen
Human-Centered Design for Data-Sparse Tailored Privacy
Information Provision ........................................................... 283
Mandy Goram, Tobias Dehling, Felix Morsbach, and Ali Sunyaev
Acceptance Factors of Privacy-Enhancing Technologies on the
Basis of Tor and JonDonym ..................................................... 299
Sebastian Pape and David Harborth
Increasing Users’ Privacy Awareness in the Internet of Things:
Design Space and Sample Scenarios ........................................... 321
Sarah Prange and Florian Alt
Challenges, Conflicts, and Solution Strategies for the
Introduction of Corporate Data Protection Measures ....................... 337
Christian K. Bosse, Denis Feth, and Hartmut Schmitt
Data Cart: A Privacy Pattern for Personal Data Management in
Organizations ..................................................................... 353
Jan Tolsdorf and Luigi Lo Iacono
Index ............................................................................... 379
Part I
Theory
Data Collection Is Not Mostly Harmless: An Introduction to Privacy Theories and Basics
Karola Marky
1 Introduction
The contributions presented in this book belong to the broader field of human factors in privacy and usable privacy research, or generally deal with the concept of privacy. Usable privacy, in particular, is situated at the intersection of cybersecurity with a focus on privacy and human–computer interaction [9], specifically considering users’ capabilities and knowledge when interacting with a technology.
The remainder of this chapter particularly focuses on the digital life of
individuals and interactions with a digital system, such as a smartphone, personal
computer, or Internet-of-Things (IoT) devices. Before we dive into why we need
privacy, especially in our digital lives, we first take a look at different privacy
definitions and theories that have been described in the literature.
2 Privacy Theories
This section details core privacy theories in the scientific literature. We start
historically with the “The Right to Privacy” [23]. Next, the theories of Westin [24],
Altman [1, 2], and Solove [22] are summarized. From these theories and further
scientific literature, we learn specific properties of privacy and highlight why privacy
is a highly individual concept.
K. Marky
Ruhr-University Bochum, Bochum, Germany
e-mail: karola.marky@rub.de
© The Author(s) 2023
N. Gerber et al. (eds.), Human Factors in Privacy Research,
https://doi.org/10.1007/978-3-031-28643-8_1
The Right to Be Let Alone An early mention of privacy in the literature is the article “The Right to Privacy” by Warren and Brandeis in 1890 [23]. In this early work, the authors informally define privacy as “the right to be let alone” [23, p. 195]. Warren and Brandeis [23] cite the judge Thomas M. Cooley when making this statement and refer to a section on bodily integrity in his book [6, p. 29], where the original quote reads “The right to one’s person may be said to be a right of complete immunity: to be let alone” [6, p. 29]. However, Cooley mainly refers to the integrity of the human body, specifically to instances of battery, while Warren and Brandeis take the right to be let alone to the social domain. Further, Cooley does not attempt to provide a notion of privacy. Warren and Brandeis also do not attempt to provide a definition of the right to privacy [18]; they argue that privacy should be part of the more general right to “the immunity of the person,—the right to one’s personality” [23, p. 207].
Warren and Brandeis specifically mention early technical devices that allow pictures of individuals to be taken, as well as devices that allow eavesdropping on conversations from afar, mostly referring to the press that might invade people’s private lives. Yet, this leaves room for interpretation as to what the right to be let alone entails [21]. Nevertheless, the article had quite an impact by motivating privacy laws in the USA, because it showed that tort law did not adequately protect privacy at that time and because a privacy violation is an injury to feelings and not to the body [21, 23].
Westin’s Privacy Theory The “right to be let alone” [23, p. 195] was later extended to the idea that individuals determine what information about themselves should be known to others [24]. The political scientist and lawyer Alan F. Westin influenced how we understand privacy today.
His privacy theory defines privacy as “the claim of individuals, groups, or institutions to determine for themselves when, how, and to what extent information about them is communicated to others” [24, p. 7]. To show the different reasons
“why” individuals might want privacy, Westin describes four privacy functions,
which are detailed below:
Westin’s Four Privacy Functions
1. Personal autonomy is the desire of individuals to not be manipulated,
dominated, or exposed by others.
2. Emotional release describes a time-out from social demands, such as role
demands.
3. Self-evaluation considers processing experiences.
4. Limited and protected communication both sets interpersonal boundaries and allows exchanging information with trusted peers.
see [24]
Data Collection Is Not Mostly Harmless 5
Westin also details different ways (the “hows”) to achieve privacy, which he denotes as states of privacy [24]. Below, we apply these four states to analog and digital life and give some examples:
Westin’s Four Privacy States This box gives an overview of Westin’s four privacy states, each completed with one example from analog life and one from digital life:
1. Solitude means that information is not shared with others, similar to the “right to be let alone” [23, p. 195].
   Analog: There is a possibility to physically separate from others.
   Digital: A technology provides access control to keep information private.
2. Intimacy refers to information being shared only with specific humans.
   Analog: A close relationship between peers based on information exchange.
   Digital: A technology provides options to share information only with specific humans, e.g., specific posts can only be shared with “friends” in an online social network.
3. Anonymity means that information cannot be connected to an individual.
   Analog: The desire for public privacy.
   Digital: A technology offers the possibility to store or submit anonymized data, e.g., in an online election, the identities of the voters are not disclosed.
4. Reserve describes that information disclosures to others are limited.
   Analog: The creation of a psychological barrier against unwanted intrusion.
   Digital: A technology offers options to limit information disclosures, e.g., IoT devices do not capture specific information.
see [24]
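The digital-life examples above all amount to a technology exposing configurable disclosure controls. As a purely illustrative sketch (the post structure, field names, and rules here are our assumptions, not part of Westin’s theory), the solitude and intimacy states could map to a visibility check for a post in an online social network:

```python
# Illustrative sketch only: mapping Westin's "solitude" and "intimacy"
# states to a hypothetical post-visibility check in a social network.

def may_view(post, viewer):
    """Return True if `viewer` is allowed to see `post`."""
    if post["visibility"] == "private":
        # Solitude: information is not shared with anyone else.
        return viewer == post["owner"]
    if post["visibility"] == "friends":
        # Intimacy: information is shared only with specific, trusted humans.
        return viewer == post["owner"] or viewer in post["friends"]
    # Public post: no restriction.
    return True

post = {"owner": "alice", "friends": {"bob"}, "visibility": "friends"}
assert may_view(post, "bob")        # trusted peer may view
assert not may_view(post, "eve")    # stranger may not
```

The point of the sketch is merely that each state corresponds to a different, user-configurable audience rule, which is why technologies ideally let individuals choose among such rules.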
Altman’s Privacy Regulation Theory Similar to Westin, the social psychologist Irwin Altman also impacted our understanding of the concept of privacy. He concisely defines privacy as “the selective control of access to the self” [1, p. 24], yet also captures more nuanced aspects of privacy in his work.
Altman states that privacy involves a dynamic process of boundary control between individuals [1]. Within this process, the desired level of privacy wanted by an individual might not match the achieved level in reality. To better describe this, he models privacy as a non-monotonic function with three different privacy levels: (1) an optimal level, where the desired level matches reality, (2) too much privacy, i.e., the desired level is lower than reality, and (3) too little privacy, i.e., the desired level is higher than reality. This function also shows several important aspects that Altman detailed in his later work: privacy, in principle, is a social process, which is why an in-depth understanding of psychological aspects is needed [2]. Too much
privacy might result in social isolation, while too little might alter the behavior of
individuals. We will talk about that in more detail in the next section. An interesting
extension of Altman’s theory that specifically considers online communication is the
Communication Privacy Management (CPM) by Petronio [17].
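Altman’s comparison of desired and achieved privacy can be stated compactly. The following sketch is purely illustrative; the numeric scale and the function name are assumptions made here for illustration, not part of Altman’s theory:

```python
# Illustrative sketch of Altman's privacy regulation: classifying the
# relation between the desired privacy level and the level achieved in
# reality. The numeric scale is an assumption made for illustration.

def privacy_fit(desired, achieved):
    """Classify the match between desired and achieved privacy levels."""
    if achieved > desired:
        return "too much privacy"    # risk of social isolation
    if achieved < desired:
        return "too little privacy"  # behavior may be altered
    return "optimal level"           # desired level matches reality

assert privacy_fit(3, 3) == "optimal level"
assert privacy_fit(2, 5) == "too much privacy"
assert privacy_fit(5, 2) == "too little privacy"
```

The sketch makes the dynamic nature of the process visible: both inputs change over time, so an individual continuously regulates boundaries to steer the outcome back toward the optimal level.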
Solove’s Privacy Taxonomy While Westin and Altman discuss privacy as a
rather positive concept that enables individuals to exert control, Solove specifically
considers the negative side of privacy invasions [22]. He first reviews different existing privacy theories, mainly demonstrating that they are “too narrow, too broad, or too vague” [22, p. 8]. Then, he identifies four types of privacy problems that he
uses to build a four-layered taxonomy. Each layer contains a different number of
specific activities that can be done to harm the privacy of individuals:
Solove’s Taxonomy
1. Information collection: surveillance and interrogation
2. Information processing: aggregation (combining different data pieces), identification (linking information to individuals), insecurity (not protecting stored information adequately), secondary use (using collected information for a different purpose), and exclusion (not informing individuals properly about data handling)
3. Information dissemination: breach of confidentiality, disclosure, exposure
(revealing nudity, grief, or bodily functions of individuals), increased
accessibility, blackmail, appropriation (identity misuse), and distortion
(propagating false information)
4. Invasions: intrusion (i.e., disturbing one’s tranquility or solitude), and
decisional interference (i.e., impact on private decisions by governments)
see [22]
It should be noted that each action by itself might not impose any harm on
individuals as long as consent is given [22].
2.1 How (Not) to Define Privacy
Even though several attempts have been made to define privacy later on, no overall
definition has been agreed on so far. Solove discussed different existing privacy
theories concluding that they mainly are “too narrow, too broad, or too vague” [22,
p. 8], and later in his book, he compares the term privacy to the ambiguity of the
term animal to highlight how problematic ambiguity can be [22]. The reason for
that lies in the complexity of privacy as an umbrella term for different concepts
within different disciplines and scopes [22]. Further, privacy has a quite challenging property: it is a highly individual and elastic concept, meaning that each individual decides what kind of information they wish to keep private [15]. Something that is private information for one individual might be happily shared by another.
Further, there are differences in privacy perceptions based on specific contexts,
such as culture [15]. Hence, there are different spheres that can impact privacy
norms on different levels, such as political, socio-cultural, and personal levels [25].
The definitions above consider the possibility for individuals to exert control over when and how personal information about them is collected and processed by others [7, 8, 23, 24]. Consequently, privacy is a personal good that also protects individuals. One must also mention that sometimes privacy is considered a value that can be traded against specific benefits [5], such as financial benefits or services that are
free of charge. The chapter “From the Privacy Calculus to Crossing the Rubicon: An
Introduction to Theoretical Models of User Privacy Behavior” specifically describes
theories and behavioral models that aim to explain privacy behavior.
Finally, it is also challenging to separate privacy from related concepts, such
as secrecy or anonymity. Especially in the legal context, privacy can be defined
as secrecy, and there are several disagreements on the specific boundaries between
privacy and its related concepts [12]. A core aspect of privacy, however, is that
it is a highly individual concept. Individual differences also make it particularly
challenging to implement one specific overall solution that fits the needs of each
and every individual. Consequently, specific technologies ideally offer a possibility for individuals to configure them according to their privacy needs. Privacy, furthermore,
can fulfill different functions.
3 Why Do We Need Privacy?
Now that we have introduced the concept of privacy, different theories, and its functions,
we discuss why privacy is needed in the first place. Solove’s taxonomy detailed
above already provides a list of negative consequences of privacy invasions [22].
In the remainder of this chapter, we provide three specific reasons why privacy is
important:
1. Missing Privacy Can Bias Decisions: Early research in the field of psychology
showed that sacrificing privacy is not a viable solution. It has repeatedly been
demonstrated that people alter their behavior when observed by others [3,10,19].
For instance, Asch studied the extent to which the opinions and behavior of
a majority could affect individual decisions and judgments of individuals [3].
To this end, he performed a series of experiments that became known under the terms elevator test and line test. Both experiments share that a single participant is confronted with a group of actors. In the elevator test, the group performs unexpected actions, such as facing the elevator’s wall instead of the door. In the line test, the participant receives a card with a line and has to pick the line of matching length from a set; the actors choose a line from the set that obviously does not match the one on the card. Asch’s results indicate that individuals conformed to the majority’s opinion even when
the correct answer was obvious. Thus, social influence can make people question
their own decision when confronted with a contradicting majority. This is
also one reason for a central principle of modern democracies: vote privacy. In
summary, the need for privacy comes from the presence of society and other
individuals around us [14]. Without that, we would not need privacy [14].
2. Missing Privacy Allows Others to Control Us: The amount of information
that another entity holds about individuals can also be used to influence that
specific individual without the presence of other humans. This also relates to
Westin’s privacy function personal autonomy described above [24]. Zuboff coins
the term surveillance capitalism [29] to describe the influence on humans by
massively using data captured about them. More specifically, she describes it as
a “new economic order that claims human experience as free raw material for
hidden commercial practices of extraction, prediction, and sales” [29, p. 1]. The
idea behind this is that any kind of data created by human experiences, such as
sharing pictures or purchasing products, is fed into algorithms that aim to subtly
influence actions of humans, e.g., going to a specific restaurant. Such influence
can occur via targeted advertisements, but also via coupons or even games.
While individuals might benefit from such data analysis, many mechanisms are designed in ways that do not keep individuals in control, and there is a fine line between benefit and exploitation. A possible solution would be not to process data about individuals at all.
3. Missing Privacy Can Impact Mental Health: Privacy is an integral human
need. Each individual has different kinds of personal boundaries. In this context,
privacy serves as a boundary control that enables individuals to regulate contact
and information exchange with other individuals on several levels. Too much
information (or contact) is perceived as an invasion of the self [16]. Complete
withdrawal from others, however, can result in feelings of loneliness [16]. Therefore, privacy regulation is essential for mental health [11].
The reasons outlined above are just a fraction of those motivating a need for privacy. Privacy in the digital world is particularly challenging. In the analog world, humans can use physical restrictions to protect personal information from others. Until the early 2000s, the majority of information had been
in analog format. To interact with analog information, humans either needed
to be in the vicinity of the information or had to make a physical copy. To
enforce restrictions based on privacy preferences, humans could physically limit
access to analog information about them. In doing so, humans can decide which
information they share with others. Translating such physical limitations into the
digital world, however, is not trivial.
The ongoing digital transformation is fundamentally changing how humans
interact with information and the kind of information they share with others. At
the beginning of the digital transformation, computers were obvious standalone
devices, and users always interacted with them intentionally. Thus, privacy
did not require much extra effort. Just two decades later, in 2023, the
majority of information is digital data. Networks, such as the Internet, serve as
an infrastructure to interact with data that are stored remotely. Computational
capabilities and sensors for collecting data are integrated into everyday objects
connected to the Internet—the so-called Internet of Things (IoT) [4]. This has
numerous benefits for users, such as availability or the convenience of everyday
life [13, 26]. However, the ubiquitous capabilities of digital services and the IoT
devices they are connected with raise several privacy challenges because digital
services generate, collect, store, and analyze data about people’s private lives
(cf. [20, 27, 28]). As Warren and Brandeis already feared in 1890, technology
can now penetrate our most private places and eavesdrop on our private conversations [23].
In summary, privacy is a highly individual concept. Missing privacy can
impact mental health, social decisions, and our lives in general. Privacy in the
digital world is challenging for several reasons, demanding more in-depth
research in this field and novel solutions that better protect this
essential need of our society.
References
1. Altman, I. (1975). The environment and social behavior: Privacy, personal space, territory,
and crowding. ERIC.
2. Altman, I. (1990). Toward a transactional perspective. In Environment and behavior studies
(pp. 225–255). Springer.
3. Asch, S. E. (1956). Studies of independence and conformity: I. A minority of one against a
unanimous majority. Psychological monographs: General and Applied, 70(9), 1.
4. Atzori, L., Iera, A., & Morabito, G. (2017). Understanding the Internet of Things: Definition,
potentials, and societal role of a fast evolving paradigm. Ad Hoc Networks, 56, 122–140.
5. Bennett, C. J. (1995). The political economy of privacy: A review of the literature. Center for
Social and Legal Research.
6. Cooley, T. M. (1879). Callaghan and Company, Chicago.
7. Culnan, M. J., & Armstrong, P. K. (1999). Information privacy concerns, procedural fairness,
and impersonal trust: An empirical investigation. Organization Science, 10(1), 104–115.
8. Fried, C. (1968). Privacy. Yale Law Journal, 77, 21.
9. Garfinkel, S., & Lipford, H. R. (2014). Usable security: History, themes, and challenges.
Synthesis Lectures on Information Security, Privacy, and Trust, 5(2), 1–124.
10. Jenness, A. (1932). The role of discussion in changing opinion regarding a matter of fact. The
Journal of Abnormal and Social Psychology, 27(3), 279.
11. Johnson, C. A. (1974). Privacy as personal control. Man-Environment Interactions: Evalua-
tions and Applications: Part, 2, 83–100.
12. Margulis, S. T. (2003). Privacy as a social issue and behavioral concept. Journal of Social
Issues, 59(2), 243–261.
13. Marikyan, D., Papagiannidis, S., & Alamanos, E. (2019). A systematic review of the smart
home literature: A user perspective. Technological Forecasting and Social Change, 138, 139–
154.
14. Moore, B. (1984). Privacy: Studies in social and cultural history.
15. Nissenbaum, H. (2020). Protecting privacy in an information age: The problem of privacy in
public. In The ethics of information technologies (pp. 141–178). Routledge.
16. Pedersen, D. M. (1997). Psychological functions of privacy. Journal of Environmental Psy-
chology, 17(2), 147–156.
17. Petronio, S. (2002). Boundaries of privacy: Dialectics of disclosure. Suny Press.
18. Schoeman, F. (1984). Privacy: Philosophical dimensions of the literature. Philosophical
Dimensions of Privacy: An Anthology, 1, 33.
19. Sherif, M. (1935). A study of some social factors in perception. Archives of Psychology
(Columbia University).
20. Sivaraman, V., Gharakheili, H. H., Vishwanath, A., Boreli, R., & Mehani, O. (2015). Network-
level security and privacy control for smart-home IoT devices. In 2015 IEEE 11th International
Conference on Wireless and Mobile Computing, Networking and Communications (WiMob)
(pp. 163–167).
21. Solove, D. J. (2002). Conceptualizing privacy. California Law Review, 1087–1155.
22. Solove, D. J. (2008). Understanding privacy. Harvard University Press.
23. Warren, S., & Brandeis, L. (1890). The right to privacy. In Harvard law review (pp. 193–220).
24. Westin, A. (1967). Privacy and freedom. Atheneum.
25. Westin, A. F. (2003). Social and political dimensions of privacy. Journal of Social Issues, 59(2),
431–453.
26. Wilson, C., Hargreaves, T., & Hauxwell-Baldwin, R. (2017). Benefits and risks of smart home
technologies. Energy Policy, 103, 72–83.
27. Ziegeldorf, J. H., Morchon, O. G., & Wehrle, K. (2014). Privacy in the Internet of Things:
Threats and challenges. Security and Communication Networks, 7(12), 2728–2742.
28. Zimmermann, V., Dickhaut, E., Gerber, P., & Vogt, J. (2019). Vision: Shining light on smart
homes—supporting informed decision-making of end users. In Proceedings of IEEE European
Symposium on Security and Privacy Workshops (pp. 149–153).
29. Zuboff, S. (2019). The age of surveillance capitalism: The fight for a human future at the new
frontier of power. Profile Books.