DISCLOSURE AND PRIVACY IN A NETWORK ECOLOGY
Towards an Integration of Individualistic, Networked, and Institutional Approaches to Online
Disclosure and Privacy in a Networked Ecology
Natalya N. Bazarova1
Philipp K. Masur2
1Department of Communication, Cornell University
2Department of Communication, Johannes Gutenberg University Mainz
Author note
Author Contributions: The authors have contributed equally to this work.
Correspondence concerning this article should be addressed to Natalya N. Bazarova, Department
of Communication, Cornell University. E-Mail: nnb8@cornell.edu
Keywords: disclosure, privacy, networked ecology, information control, horizontal and vertical
dimensions of privacy
Highlights
● Individualistic, networked, and institutional perspectives on disclosure and privacy
● Each perspective circumscribes privacy at a particular level of information control
● Networked ecologies intersect different levels of information control and access
● Control and access are organized along horizontal and vertical dimensions of privacy
Abstract
In this paper, we review three different approaches to disclosure and privacy: a) an
individualistic approach, which emphasizes an individual's control over information access and
flow, b) a networked approach, which focuses on information flow in horizontal relations between
people, and c) an institutional approach, which is concerned with public and societal privacy risks
from platforms, providers, and governments. These approaches co-exist largely independently of
each other in the privacy and disclosure literature. However, as public and private spheres of
communication increasingly overlap and the presumption of individual agency over personal
information is no longer tenable, we argue for the importance of bridging these perspectives
towards a more multifaceted view of online disclosure and privacy in a networked ecology.
Towards an Integration of Individualistic, Networked, and Institutional Approaches to Online
Disclosure and Privacy in a Networked Ecology
1. Introduction
Recent surveys have shown that most people feel they have limited control over their personal
information and are concerned about how their information is collected, tracked, and used by
companies and the government [1,2]. This perceived loss of control over personal information on
social media underscores the challenges for individualistic models of disclosure and privacy
grounded in the presumption of individuals' control over their information (for a review, see [61]).
As an act of revealing personal information, self-disclosure has predominantly been
conceptualized at the level of individuals, with the main agency over data access and flow
ascribed to the discloser [3–9]. The individual-centered approach has recently been called into
question by the networked perspective on privacy, which shifts information control from
individual agency to collective disclosure decisions that affect each person's privacy in a
network [10*,11]. While this is an important development towards understanding disclosure and
privacy from a collective point of view, people's concerns about corporations' and governments'
control over their data suggest the need for an even broader perspective on privacy that
accounts for both network and institutional forces at play [12*–14]. In an effort to reconcile the
individual-centered perspective with shifting personal information controls, we review
individualistic, networked, and institutional approaches and work towards a more integrative
view of online disclosure and privacy in a networked ecology.
2. Individualistic theories of disclosure
A majority of individualistic approaches to disclosure and privacy emphasize an
individual’s control of personal information in social interactions. Following Altman’s [15]
dialectical approach, privacy can be regarded as “the selective control of access to the self” (p.
18), where an information holder engages in interpersonal boundary control through the use of
verbal, non-verbal, environmental, and culturally defined behaviors and practices. Accordingly,
individualistic approaches examine privacy as an individual’s choice and control of personal
information in social interactions [16–20]. Most of them emphasize privacy and disclosure
management as a dynamic response to a situation to achieve a desired level of privacy by sharing
or withholding personal information [6**,21,22]. A transfer of information to another context
violates its contextual integrity because of different normative values in different social contexts
[23].
Disclosure, as "the process of making the self known to other persons" [24, p. 91],
presents itself as a dialectical opposite to privacy in that it regulates selective access to the self
in social interactions [6**,25*]. Studies of disclosure examine it as a volitional act for which
people choose an appropriate level of intimacy and breadth based on individual and situational
factors that influence their perceptions of disclosure rewards and risks [26–30]. Disclosure is, on
the one hand, a form of privacy regulation and information control; on the other hand, privacy is
often not an end goal of disclosure but rather a precondition for it [6**,31,32], serving other
communication and relational goals, such as receiving social support or venting.
In sum, individualistic approaches to privacy and disclosure are primarily guided by the
“presumption of individual agency”, that is, people’s ability to control the access and flow of
information [25*]. This ability, however, is often undermined in networked environments where
other actors – either other users, platforms, or governments – have power to limit and override
individuals’ control of access to and flow of personal information.
3. Networked approaches to disclosure and privacy
Many scholars have emphasized that individual control over personal information in
networked environments is no longer feasible and privacy must be understood from an
exclusively networked point of view [10*,11,33,34]. The networked nature of online disclosure
becomes palpable through typical communication dynamics on social media. First, even if
personal information is disclosed to only a few people, one of the recipients may share it with
originally unintended audiences. Because of the communication practices on social media (i.e.,
sharing, liking, and commenting, but also editing and recontextualizing) and the lack of
privacy-by-default settings, such unintended dissemination happens fast and frequently,
allowing information to reach large, unintended audiences.
Second, on social media, personal information is not only disclosed by users themselves. Often, a
user discloses information about another person without the person’s consent (e.g., by uploading
photos with a group of people, tagging people in posts, or simply revealing information about
someone else). These dynamics strongly increase the potential visibility of personal content [35],
lead to the collapse of traditionally distinct contexts [36–38], and cause the blurring of public and
private spheres.
In response to these dynamics, networked approaches to privacy and disclosure highlight
the necessity to engage in collaborative privacy management strategies. Communication privacy
management theory [22] provides a theoretical framework for understanding such practices.
According to this theory, when a discloser shares information with others, those others become
co-owners of this information [22]. Individuals, therefore, establish rules and norms that
determine the boundaries in which the information is allowed to flow. Such rules are based on
shared social norms and interpersonal trust. They determine the connection between information
co-owners (linkage rules), the openness or closeness of the boundary (permeability rules), and
the rights and responsibilities of each boundary member (ownership rules).
Establishing and protecting such boundaries requires understanding the context in which
one is communicating, the people and parties involved, and the norms and rules that determine
the information flow in this context [11,39]. To navigate the complexities of networked
environments, disclosers, especially teens, engage in creative practices that allow them to retain
some control over personal information and restrict co-ownership to selected parties by
obscuring the meaning of disclosures through shared symbols, cues, references, or language, as
well as creating complex norms about what is acceptable to openly communicate and what is not
[11,39,40].
Therefore, networked approaches to privacy and self-disclosure recognize the
codependency of individuals in safeguarding their privacy in online environments. The
attainable level of privacy no longer depends solely on an individual's independent control and
regulation choices, but also on the behavior and choices of other users. Online privacy
management is thus based on a shared responsibility to protect negotiated boundaries [22,33].
4. Institutional approaches to disclosure and privacy
Because of the characteristics of digital information and the technological infrastructure of online
environments, online service providers and institutions gain access to the personal information
that users themselves or members of their network share about them, as well as to other
personal data (e.g., information created during a registration process, network information about
links between individual users) and metadata (e.g., activity records, including browsing
behaviors and deleted content). Whereas the former is knowingly and intentionally produced (cf.
the definition of self-disclosure), the latter is generated and stored automatically through the use
of online services. The combination of personal data and metadata provides fine-grained insights
into the course of an individual's (online) life, as well as mechanisms for data analysis at scale,
both for commercial interests and for mass surveillance conducted by intelligence agencies [41–43].
These vertical dynamics pose several challenges to individual privacy. First, identifying the
risks associated with vertical privacy invasions is difficult because many users are unaware of
covert data collection practices and pay limited attention to often convoluted privacy policies and
platform terms of service [44]. Second, individual protection against vertical privacy risks is
hardly feasible. Although some control can potentially be exerted through comparatively
sophisticated data protection strategies (e.g., using Tor, encryption, obfuscation, or
pseudonymization), most people do not have the knowledge and skills to implement them [45–
48]. Third, data shared by our online friends can be better predictors of our future behavior than
our own data [49*,50]. Through connections between our friends, online service providers (e.g.,
SNSs) can create so-called "shadow profiles" of non-users [13].
Not only can individual and networked disclosures be harvested by third parties for
commercial and surveillance purposes, but big data analytics and algorithmic profiling pose
another privacy control challenge by limiting individuals' abilities "to self-define, and thus claim
control and agency, over their social trajectory" [12*, p. 591]. Both commercial and institutional
surveillance practices are based on algorithmic big data processes that put individuals in fleeting,
ever-changing, and abstract categories, which may not only be very different from how
individuals would define themselves and would want to be viewed, but which individuals are
completely unable to challenge. The continuous process of classifying individuals is not (only)
based on information that the individuals (or their friends) have shared, or on metadata that they
have produced, but rather relies on the constant accumulation of person-related data and
metadata from all users of various platforms. This implies that even if an individual does
everything to protect herself against horizontal and vertical privacy invasions (e.g., implementing
encryption, anonymization, etc.), follows the established rules within her privacy boundaries (the
networked perspective), and even if all those measures actually work, her privacy can still be
invaded, because data collected in completely different contexts and situations about unrelated
individuals nonetheless allows inferring and predicting information about her [51,52].
5. Towards a more integrative view on disclosure and privacy in a networked ecology
As outlined above, there are multiple agents that have access to or exert control over
users’ personal information in online environments. Studying a disclosure-related phenomenon
from only one of the discussed perspectives circumscribes it around a particular level of
information control. However, whenever corporate and government interests intersect with
individuals’ private pursuits, privacy concerns span the boundary between public and
private and should be studied with regard to “relationships between individuals and
corporate or government organizations as well as to relations among individuals” [53, p. 213].
This “kaleidoscope of privacy and surveillance” [54, p. 32] calls for a conceptual bridging of
individualistic, networked, and institutional approaches to disclosure and privacy, instead of
treating these levels of privacy concerns as independent of each other.
As the initial step towards this bridging, we organize information access (direct and
indirect) and control (primary, proximal, and distal) along horizontal and vertical dimensions
(see Figure 1). For access, information disclosed in networked environments is directly
accessible horizontally to the intended recipients (i.e., other users; direct horizontal access) and
vertically to platform providers (direct vertical access). However, intended users may share this
information with unintended audiences (indirect horizontal access), and platform providers may
also share this information with collateral third-parties, such as other commercial actors or
governments (indirect vertical access).
Figure 1. Disclosure and privacy in a networked ecology.
For information control, we distinguish between primary, proximal, and distal
information control holders. Initial disclosers (either individuals themselves or members of their
network who share information about them) hold a primary control by exercising individual
privacy boundary management on the horizontal level (i.e., deciding what to share and to whom)
and individual data protection on the vertical level (i.e., using sophisticated data protection
strategies such as encryption, anonymization, obfuscation). Understanding how individuals
account for both vertical and horizontal risks [e.g., 6**,55,56], and how shared rules and norms
factor into their privacy boundary management, especially when they disclose information about
someone else, is a step towards bridging the individualistic view with networked and institutional
perspectives on disclosure and privacy.
On the horizontal level, the intended information receivers become information co-
owners and thus have proximal control over the shared data, which is regulated by privacy rules
and norms that determine collective privacy boundary management (see Figure 1). Networked
approaches [11,22] serve well to explain these collective privacy management practices, but
more work needs to be done to explain dissemination of collective privacy norms and rules in a
network, as well as their sensemaking and internalization by proximal information holders.
These approaches also need to account for how perceptions of vertical privacy risks may shape
collective rules, norms, and behaviors with regard to information flow.
On the vertical level, platforms or third-parties with whom platforms share this
information (collateral third-parties) gain a distal control of user data. Although users authorize
their access through companies’ terms of use in exchange for using company services, platforms
and even more so third-parties, are not the intended recipients of disclosures. Being at least two
steps removed from distal information-control parties, primary information holders have limited
awareness or control over their privacy at the vertical level and, therefore, can feel digitally
resigned and helpless [57,58]. As discussed earlier and shown in Figure 1, preventing such
privacy invasions is challenging and in many cases impossible for the individual. Vertical control
challenges underscore the importance of understanding privacy as a social and collective value
[23,59]. This requires a recognition of the codependency of privacy in networked environments
and that the actions of one person may produce privacy losses for others [50]. The ability of
individuals or collectives to exert agency over their information is severely limited when
information related to any individual (not only those in their own social network) is used to make
inferences about an individual's personality and behavior, even in the absence of disclosure from
primary information holders. Against such invasions, the only remaining privacy protection
mechanism is collective data protection, such as challenging, through democratic deliberation at
a societal level, the conditions that have led to the need for privacy protection and information
control in the first place [12*,14,60*].
Bringing all three perspectives together thus allows us to distinguish between different
types of information control and access, and individual and collective privacy risks associated
with them. It further acknowledges the intertwined nature of vertical and horizontal dynamics in
networked environments in which platform providers not only exert information control and pose
risks for individuals’ privacy, but also provide spaces in which communication and thus
horizontal privacy regulation takes place.
References
[1] B. Auxier, L. Rainie, M. Anderson, A. Perrin, M. Kumar, E. Turner, Americans and
privacy: Concerned, confused and feeling lack of control over their personal information,
Pew Research Center, 2019. https://www.pewresearch.org/internet/2019/11/15/americans-
and-privacy-concerned-confused-and-feeling-lack-of-control-over-their-personal-
information/.
[2] European Commission, Special Eurobarometer 487a: The General Data Protection
Regulation, 2019.
https://ec.europa.eu/commfrontoffice/publicopinion/index.cfm/ResultDoc/download/Docu
mentKy/86886.
[3] N. Kashian, J. Jang, S.Y. Shin, Y. Dai, J.B. Walther, Self-disclosure and liking in
computer-mediated communication, Comput. Hum. Behav. 71 (2017) 275–283.
https://doi.org/10.1016/j.chb.2017.01.041.
[4] N.C. Krämer, J. Schäwel, Mastering the challenge of balancing self-disclosure and privacy
in social media, Curr. Opin. Psychol. 31 (2020) 67–71.
https://doi.org/10.1016/j.copsyc.2019.08.003.
[5] R. Lin, S. Utz, Self-disclosure on SNS: Do disclosure intimacy and narrativity influence
interpersonal closeness and social attraction?, Comput. Hum. Behav. 70 (2017) 426–436.
https://doi.org/10.1016/j.chb.2017.01.012.
[6] **P.K. Masur, Situational privacy and self-disclosure: Communication processes in online
environments, Springer, Cham, Switzerland, 2018. https://doi.org/10.1007/978-3-319-
78884-5.
The book synthesizes independently developed theories of privacy and self-disclosure
into one comprehensive framework that allows the antecedents of self-disclosure to be
identified and systematized into personal and environmental as well as non-situational and
situational factors. It further investigates this theory of situational privacy and self-
disclosure in the context of smartphone-based communication.
[7] S. Sannon, B. Stoll, D. DiFranzo, M.F. Jung, N.N. Bazarova, “I just shared your responses”:
Extending Communication Privacy Management Theory to Interactions with
Conversational Agents, Proc. ACM Hum.-Comput. Interact. 4 (2020) 1–18.
[8] S. Sannon, E.L. Murnane, N.N. Bazarova, G. Gay, “I was really, really nervous posting it”:
Communicating about Invisible Chronic Illnesses across Social Media Platforms, in: Proc.
2019 CHI Conf. Hum. Factors Comput. Syst., 2019: pp. 1–13.
[9] M. Tsay-Vogel, J. Shanahan, N. Signorielli, Social media cultivating perceptions of
privacy: A 5-year analysis of privacy attitudes and self-disclosure behaviors among
Facebook users, New Media Soc. 20 (2018) 141–161.
https://doi.org/10.1177/1461444816660731.
[10] *S. Barocas, K. Levy, Privacy Dependencies, Wash. Law Rev. Forthcom. (2019).
This article outlines three types of privacy dependencies that determine how our privacy
depends on other people’s decisions and behaviors: individuals’ social ties, similarities to
others, and differences from other people. Each type of dependency is characterized by
distinct mechanisms, values, and normative concerns that must be accounted for when
developing technical interventions and regulatory solutions for privacy protection.
[11] A.E. Marwick, danah boyd, Networked privacy: How teenagers negotiate context in social
media, New Media Soc. (2014). https://doi.org/10.1177/1461444814543995.
[12] *L. Baruh, M. Popescu, Big data analytics and the limits of privacy self-management, New
Media Soc. 19 (2017) 579–596.
This article examines how big data analytics and predictive algorithms used ubiquitously
for commercial purposes normalize privacy invasions and can accentuate social
inequalities. The algorithmic social sorting made possible through pervasive surveillance
and big data analytics “thin-slices” individuals into social categories and profiles, which
drastically limit individuals’ agency, control, and choices. The article argues for the
importance of acknowledging the collective and social dimensions of privacy as an
alternative to the individual-centric “notice and choice” privacy management framework in
today’s digital environment.
[13] D. Garcia, Privacy beyond the individual, Nat. Hum. Behav. 3 (2019) 112–113.
https://doi.org/10.1038/s41562-018-0513-2.
[14] M. Popescu, L. Baruh, Privacy as Cultural Choice and Resistance in the Age of
Recommender Systems, in: Routledge Handb. Digit. Writ. Rhetor., Routledge, 2018: pp.
280–290.
[15] I. Altman, The environment and social behavior: Privacy, personal space, territory, and
crowding, Wadsworth, Belmont, CA, 1975.
[16] A. Acquisti, L. Brandimarte, G. Loewenstein, Privacy and human behavior in the age of
information, Science. 347 (2015) 509–514. https://doi.org/10.1126/science.aaa1465.
[17] N. Bol, T. Dienlin, S. Kruikemeier, M. Sax, S.C. Boerman, J. Strycharz, N. Helberger, C.H.
De Vreese, Understanding the effects of personalization as a privacy calculus: analyzing
self-disclosure across health, news, and commerce contexts, J. Comput.-Mediat. Commun.
23 (2018) 370–388.
[18] S. Trepte, L. Reinecke, N.B. Ellison, O. Quiring, M.Z. Yao, M. Ziegele, A cross-cultural
perspective on the privacy calculus, Soc. Media Soc. 3 (2017) 2056305116688035.
[19] P.J. Wisniewski, B.P. Knijnenburg, H.R. Lipford, Making privacy personal: Profiling social
network users to inform privacy education and nudging, Int. J. Hum.-Comput. Stud. 98
(2017) 95–108.
[20] D. Yang, Z. Yao, R. Kraut, Self-disclosure and channel difference in online health support
groups, in: Elev. Int. AAAI Conf. Web Soc. Media, 2017.
[21] L. Palen, P. Dourish, Unpacking “privacy” for a networked world, in: Proc. SIGCHI Conf.
Hum. Factors Comput. Syst., Association for Computing Machinery, Ft. Lauderdale,
Florida, USA, 2003: pp. 129–136. https://doi.org/10.1145/642611.642635.
[22] S. Petronio, Boundaries of privacy: Dialectics of disclosure, State University of New York
Press, Albany, 2002.
[23] H.F. Nissenbaum, Privacy in context: Technology, policy, and the integrity of social life,
Stanford Law Books, Stanford, 2010.
[24] S.M. Jourard, P. Lasakow, Some factors in self-disclosure, J. Abnorm. Soc. Psychol. 56
(1958) 91–98. https://doi.org/10.1037/h0043357.
[25] *J.L. Crowley, A framework of relational information control: a review and extension of
information control research in interpersonal contexts, Commun. Theory. 27 (2017) 202–
222. https://doi.org/10.1111/comt.12115.
This paper synthesizes literature on information control and outlines a new framework of
information control in interpersonal relationships. The proposed framework offers a
typology of implicit and explicit forms of information control, addresses antecedents,
outcomes, and contexts of information control, and accounts for the perspectives of both the
sender and the target.
[26] N. Andalibi, A. Forte, Announcing pregnancy loss on Facebook: A decision-making
framework for stigmatized disclosures on identified social network sites, in: Proc. 2018
CHI Conf. Hum. Factors Comput. Syst., 2018: pp. 1–14.
[27] N.N. Bazarova, Y.H. Choi, Self-disclosure in social media: Extending the functional
approach to disclosure motivations and characteristics on social network sites, J. Commun.
64 (2014) 635–657.
[28] H. Krasnova, S. Spiekermann, K. Koroleva, T. Hildebrand, Online social networks: Why
we disclose, J. Inf. Technol. 25 (2010) 109–125. https://doi.org/10.1057/jit.2010.6.
[29] J. Omarzu, A disclosure decision model: Determining how and when individuals will self-
disclose, Personal. Soc. Psychol. Rev. 4 (2000) 174–185.
https://doi.org/10.1207/S15327957PSPR0402_05.
[30] E.L. Spottswood, J.T. Hancock, Should I share that? Prompting social norms that influence
privacy behaviors on a social networking site, J. Comput.-Mediat. Commun. 22 (2017) 55–
70.
[31] C.A. Johnson, Privacy as personal control, Man-Environ. Interact. Eval. Appl. Part. 2
(1974) 83–100.
[32] S. Trepte, P.K. Masur, Need for privacy, in: V. Zeigler-Hill, T.K. Shackelford (Eds.),
Encycl. Personal. Individ. Differ., Springer, London, 2017. https://doi.org/10.1007/978-3-
319-28099-8.
[33] R. De Wolf, Contextualizing how teens manage personal and interpersonal privacy on
social media, New Media Soc. (2019) 1461444819876570.
[34] R. De Wolf, K. Willaert, J. Pierson, Managing privacy boundaries together: exploring
individual and group privacy management strategies in Facebook, Comput. Hum. Behav.
35 (2014) 444–454. http://dx.doi.org/10.1016/j.chb.2014.03.010.
[35] d. boyd, Social network sites as networked publics: Affordances, dynamics, and
implications, in: Z. Papacharissi (Ed.), Networked Self Identity Community Cult. Soc.
Netw. Sites, Routledge, New York, NY, 2011: pp. 39–58.
[36] d. boyd, Taken out of context: American teen sociality in networked publics, Dissertation,
University of California, Berkeley, CA, 2008.
[37] J. Vitak, The impact of context collapse and privacy on social network site disclosures, J.
Broadcast. Electron. Media. 56 (2012) 451–470.
https://doi.org/10.1080/08838151.2012.732140.
[38] A.F. Zillich, K.F. Müller, Norms as Regulating Factors for Self-Disclosure in a Collapsed
Context: Norm Orientation Among Referent Others on Facebook, Int. J. Commun. 13
(2019) 20.
[39] A.E. Marwick, danah boyd, Understanding Privacy at the Margins, Int. J. Commun. 12
(2018) 9.
[40] E. Oolo, A. Siibak, Performing for one’s imagined audience: Social steganography and
other privacy strategies of Estonian teens on networked publics, Cyberpsychology J.
Psychosoc. Res. Cyberspace. 7 (2013). https://doi.org/10.5817/CP2013-1-7.
[41] G. Greenwald, No Place to Hide: Edward Snowden, the NSA and the surveillance state,
Hamish Hamilton, 2014.
[42] M. Popescu, L. Baruh, Consumer surveillance and distributive privacy harms in the age of
big data, in: P. Messaris, L. Humphreys (Eds.), Digit. Media Transform. Hum. Commun.,
2017: pp. 313–327.
[43] E. Snowden, Permanent Record, 1st Edition, Metropolitan Books, New York, 2019.
[44] N. Steinfeld, “I agree to the terms and conditions”: (How) do users read privacy policies
online? An eye-tracking experiment, Comput. Hum. Behav. 55 (2016) 992–1000.
https://doi.org/10.1016/j.chb.2015.09.038.
[45] L. Baruh, E. Secinti, Z. Cemalcilar, Online privacy concerns and privacy management: A
Meta-Analytical Review, J. Commun. (2017). https://doi.org/10.1111/jcom.12276.
[46] M. Büchi, N. Just, M. Latzer, Caring is not enough: The importance of Internet skills for
online privacy protection, Inf. Commun. Soc. 20 (2016) 1261–1278.
https://doi.org/10.1080/1369118X.2016.1229001.
[47] P.K. Masur, D. Teutsch, S. Trepte, Entwicklung und Validierung der Online-
Privatheitskompetenzskala (OPLIS), Diagnostica. 63 (2017) 256–268.
https://doi.org/10.1026/0012-1924/a000179.
[48] S. Trepte, D. Teutsch, P.K. Masur, C. Eicher, M. Fischer, A. Hennhöfer, F. Lind, Do People
Know About Privacy and Data Protection Strategies? Towards the “Online Privacy Literacy
Scale” (OPLIS), in: S. Gutwirth, R. Leenes, P. de Hert (Eds.), Reforming Eur. Data Prot.
Law, Springer Netherlands, Dordrecht, 2015: pp. 333–365. https://doi.org/10.1007/978-94-
017-9385-8_14.
[49] *J.P. Bagrow, X. Liu, L. Mitchell, Information flow reveals prediction limits in online
social activity, Nat. Hum. Behav. 3 (2019) 122–128. https://doi.org/10.1038/s41562-018-
0510-5.
This paper provides empirical evidence that personal information about the activities and
interests of individuals can be accurately predicted from their social ties in online social
networks. Online social networks have embedded information about individuals, even if
they are not on the platform themselves.
[50] E. Sarigol, D. Garcia, F. Schweitzer, Online privacy as a collective phenomenon, in: Proc.
Second ACM Conf. Online Soc. Netw., Association for Computing Machinery, Dublin,
Ireland, 2014: pp. 95–106. https://doi.org/10.1145/2660460.2660470.
[51] M. Mann, T. Matzner, Challenging algorithmic profiling: The limits of data protection and
anti-discrimination in responding to emergent discrimination, Big Data Soc. 6 (2019)
2053951719895805.
[52] T. Matzner, Why privacy is not enough: Privacy in the context of “ubiquitous computing”
and “big data,” J. Inf. Commun. Ethics Soc. (2014).
[53] P.M. Regan, Legislating privacy: Technology, social values, and public policy., University
of North Carolina Press, Chapel Hill, NC, 1995.
[54] G.T. Marx, Coming to Terms: The Kaleidoscope of Privacy and Surveillance, in: Soc.
Dimens. Priv. Interdiscip. Perspect., Cambridge University Press, Cambridge, 2015:
pp. 32–49.
[55] A.E. Marwick, E. Hargittai, Nothing to hide, nothing to lose? Incentives and disincentives
to sharing information with institutions online, Inf. Commun. Soc. 22 (2019) 1697–1713.
[56] H. Xu, The effects of self-construal and perceived control on privacy concerns, ICIS 2007
Proc. (2007) 125.
[57] H. Choi, J. Park, Y. Jung, The role of privacy fatigue in online privacy behavior, Comput.
Hum. Behav. 81 (2018) 42–51.
[58] N.A. Draper, J. Turow, The corporate cultivation of digital resignation, New Media Soc. 21
(2019) 1824–1839. https://doi.org/10.1177/1461444819833331.
[59] P.M. Regan, Privacy and the common good: revisited, in: B. Roessler, D. Mokrosinska
(Eds.), Soc. Dimens. Priv. Interdiscip. Perspect., Cambridge University Press, 2015: pp.
50–70.
[60] *P.K. Masur, How online privacy literacy facilitates self-data protection and supports
informational self-determination in the age of information, Media Commun. (2020).
The paper proposes a comprehensive model of online privacy literacy that encompasses
factual privacy knowledge, privacy-related reflection abilities, privacy and data protection
skills, as well as critical privacy literacy. It argues that such a literacy not only empowers
individuals to protect themselves against privacy intrusions, but also allows them to
challenge privacy-invasive status quo.
[61] N.N. Bazarova, Online disclosure, in: C.R. Berger, M.E. Roloff, S.R. Wilson, J.P. Dillard,
J. Caughlin, D. Solomon (Eds.), The International Encyclopedia of Interpersonal
Communication, 2015. https://doi.org/10.1002/9781118540190.wbeic251.