Article

The mismeasurement of privacy: Using contextual integrity to reconsider privacy in HCI

Authors:
Louise Barkhuus

Abstract

Privacy is a widely studied concept in relation to social computing and sensor-based technologies; scores of research papers have investigated people's "privacy preferences" and apparent reluctance to share personal data. In this paper we explore how Ubicomp and HCI studies have approached the notion of privacy, often as a quantifiable concept. Leaning on several theoretical frameworks, but in particular Nissenbaum's notion of contextual integrity, we question the viability of obtaining universal answers in terms of people's "general" privacy practices and apply elements of Nissenbaum's theory to our own data in order to illustrate its relevance. We then suggest restructuring inquiries into information sharing in studies of state-of-the-art technologies and analyze contextually grounded issues using a different, more specific vocabulary. Finally, we provide the first building blocks of such a vocabulary.

... A common model for privacy used in PI research is disclosure, or "the telling of the previously unknown so that it becomes shared knowledge" [31]. This is often operationalized through disclosure practices, preferences [15,58,60], and concerns [7,42,58], such as whether and how information should be shared in the context of family-centered health-tracking [58]. ...
... Other authors in HCI have also advocated for understanding privacy beyond disclosure [7,22,35,50,51,54,57]. Specifically, we draw from the framework of contextual integrity to understand our participants' transparency experiences and contributing factors in contrast to disclosure practices, preferences, and concerns. ...
... Doing so enables us to better understand privacy in terms of specific contextual factors related to (in)visibility of identity. This helps move towards a more holistic model of collective and situated information sharing practices [7,22,54], and allows us to foreground and address the distinct experiences of vulnerable populations [40]. This paper addresses the following research questions: ...
Conference Paper
Full-text available
Research in personal informatics (PI) calls for systems to support social forms of tracking, raising questions about how privacy can and should support intentionally sharing sensitive health information. We focus on the case of personal data related to the self-tracking of bipolar disorder (BD) in order to explore the ways in which disclosure activities intersect with other privacy experiences. While research in HCI often discusses privacy as a disclosure activity, this does not reflect the ways in which privacy can be passively experienced. In this paper we broaden conceptions of privacy by defining transparency experiences and contributing factors in contrast to disclosure activities and preferences. Next, we ground this theoretical move in empirical analysis of personal narratives shared by people managing BD. We discuss the resulting emergent model of transparency in terms of implications for the design of socially-enabled PI systems. CAUTION: This paper contains references to experiences of mental illness, including self-harm, depression, suicidal ideation, etc.
... Data-driven technologies have entered almost every aspect of our everyday, professional, and private lives [2,3,6,11,14,21,25,29,32]. Even occupations that were once considered largely manual labor (e.g., electricians, plumbers, construction work, and facility managers) increasingly require computing skills, as routine tasks become both data-driven and analytic [28]. ...
... Such data become part of a boundary-regulating process in terms of what details about their work employees are comfortable sharing and disclosing [24]. When it comes to tracking, boundaries for disclosure and transparency are not fixed, but change over time [2,14,24]. Boundary regulation is not a static act [2], and with the tracking of work, too, employees negotiate when and how it is deemed reasonable for data to be used for a certain purpose. A key aspect of studying accountability and tracking in the blue-collar workplace therefore concerns how employees negotiate reasonable levels of detail to make visible, and their conditions for such disclosures. ...
Conference Paper
Full-text available
This paper examines how mobile technology impacts employee accountability in the blue-collar data-driven workplace. We conducted an observation-based qualitative study of how electricians in an electrical company interact with data related to their work accountability, which comprises the information employees feel is reasonable to share and document about their work. The electricians we studied capture data both manually, recording the hours spent on a particular task, and automatically, as their mobile devices regularly track data such as location. First, our results demonstrate how work accountability manifests for employees' manual labor work that has become data-driven. We show how employees work through moments of transparency, privacy, and accountability using data focused on location, identification and time. Second, we demonstrate how this data production is interdependent with employees' beliefs about what is a reasonable level of detail and transparency to provide about their work. Lastly, we articulate specific design implications related to work accountability.
... For example, they tend to create groups such as "work", "school", "family", "close friends", "church friends", etc., and selectively disclose information to these groups [16,25,35]. While these techniques were designed to better target users' information disclosure to the desired audience, users complained about the great amount of labor involved, such as the upfront time investment [3,37], recurring efforts [16,27], and insufficient self-efficacy in privacy management [31]. Some users would configure these settings at some point, but tended to use them only once or stop maintaining them [25]. ...
... Another two studies were based on users' interactions with prototypes of these techniques [5,17]. Second, in other studies that asked participants about their past usage of these techniques for their posts, a few participants reported limited past experience with these techniques [3,16,19,25,35,37]. For example, one study showed that participants configured the settings, but never had a situation where they actually needed to control the audience [16]. ...
... Thus, they did not reflect users' long-term experience with SNS granular audience control. Third, several studies found that the low usage of the granular settings was due to the great amount of effort involved in first-time configuration [3,37]. These studies, however, did not investigate how users' perception of these burdens evolved over time, nor how users would trade off these burdens against the benefits of highly granular audience control after long-term experience with these techniques. ...
Preprint
Full-text available
Privacy researchers have suggested various granular audience control techniques for users to manage the access to their disclosed information on social network sites. However, it is unclear how users adopt and utilize such techniques in daily use and how these techniques may impact social interactions with others over time. In this study, we examine users' experience during everyday use of granular audience control techniques in WeChat, one of the most popular social network applications in China. Through an interview study with 24 WeChat users, we find that users adjust their configurations and develop rationales for these configurations over time. They also perceive mixed impacts of WeChat's privacy settings on their information disclosure and social interactions, which brings new challenges to privacy design. We discuss the implications of these findings and make suggestions for the future design of privacy settings in SNSs.
... The networked privacy research community [4,11,13,24,26,28] is growing quickly as the discourse about and around privacy is becoming increasingly prominent in academia and in the public. One consensus among this research community is that the term "privacy" is complex, misunderstood, and often misused in empirical HCI research [6]. One way to solve the problem of the often fragmented and erratic use of the term privacy in HCI is for our community to converge on a subset of core privacy theories and frameworks that can meaningfully inform our scholarly work and provide a common foundation on which to move our field forward. ...
... Some classify information type by sensitivity [1,18], others focus on privacy as awareness and control of information [14], and still others approach it from a state-based perspective where there are different privacy states (e.g., anonymity, intimacy) [22]. More recently, norm-based approaches have been used to frame privacy as appropriate information sharing [6] such as Nissenbaum's (keynote speaker; see side bar) framework of Contextual Integrity (CI) [15]. Such work shows the value of integrating privacy theories and frameworks into empirically driven privacy research. ...
... This research suggests that users may not always weigh costs and benefits in what researchers might consider a rational way. Therefore, a number of researchers have started to move towards more nuanced approaches for conceptualizing and measuring privacy that focus on context and norms [6]. ...
Conference Paper
Privacy has been a key research theme in the CSCW and HCI communities, but the term is often used in an ad hoc and fragmented way. This is likely due to the fact that privacy is a complex and multi-faceted concept. This one-day workshop will facilitate discourse around key privacy theories and frameworks that can inform privacy research with the goal of producing guidelines for privacy researchers on how and when to incorporate which theories into various aspects of their empirical privacy research. This will lay the groundwork to move the privacy field forward.
... Ackerman advises that privacy is "individually subjective and socially situated" [11]. Privacy concerns should go beyond universal notions of participants' privacy needs, as these depend on interpretive information that is perceived differently amongst various communities and individuals [12]. Finally, the onus is placed on researchers to ensure that participants "do not give their permission for something without understanding the consequences" [8]. ...
... Participants' narrations were those they felt most comfortable sharing; thus private information was withheld in the documentary processes, as participants understood the social considerations they upheld in their closely related groups. This aspect should not be underestimated, as the role of modesty (not sharing more than necessary) often directs scenarios of personal story sharing [12]. ...
Conference Paper
This work in progress presents a technology tool as a platform for exploring data from an art-based research project in geographically marginalised communities. The perspectives of the research participants on their identity processes and art making inspired the pursuit of an HCI-based (Human-Computer Interaction) technological platform for the purpose of giving life to the collected data and art outcomes. Vital ethical considerations for the creation of such a platform, and the roles communities and researchers will play in the process, are considered in this paper.
... Studies on Privacy and Technology Literacy. Another extension for future research is conducting user studies with the visualization tool on individual technology usage, personal data and information sharing issues, and privacy concerns [1] (e.g. exploring user privacy preferences across these widely used technology products and services with a customized tooltip that displays privacy policies). ...
Conference Paper
Full-text available
A number of large technology companies, or so-called "tech giants", such as Alphabet/Google, Amazon, Apple, Facebook, and Microsoft, are increasingly dominant in people's daily lives, and critically studied in fields such as Science, Technology and Society (STS) studies, with an emphasis on technology, data, and privacy. This project aims to contribute to research at the intersection of technology and society with a prototype visualization tool that shows the vast spread and scope of these large technology companies. In this paper, a prototype graph visualization of notable American technology companies, their acquisitions, and services is presented. The potential applications and limitations of the visualization tool for research are explored. This is followed by a discussion of applying the visualization tool to research on personal data and privacy concerns and possible extensions. In particular, difficulties of data collection and representation are emphasized.
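The kind of company-acquisition graph described above can be assembled with a standard graph library. The sketch below uses Python's networkx with a handful of well-known acquisitions; the selection of edges and the analysis shown are illustrative assumptions of this sketch, not the paper's actual dataset or tooling.

```python
import networkx as nx

# Illustrative subset of tech-giant acquisitions (the paper's dataset
# is far larger; this selection is an assumption for demonstration).
G = nx.DiGraph()
acquisitions = [
    ("Alphabet/Google", "YouTube"),
    ("Facebook", "Instagram"),
    ("Facebook", "WhatsApp"),
    ("Microsoft", "LinkedIn"),
    ("Amazon", "Whole Foods"),
]
for parent, acquired in acquisitions:
    G.add_edge(parent, acquired, relation="acquired")

# A simple proxy for "spread and scope": out-degree per parent company.
for company in ("Alphabet/Google", "Facebook", "Microsoft", "Amazon"):
    print(company, G.out_degree(company))
```

A real version of such a tool would layer interactive rendering and service/subsidiary metadata on top of a graph like this one.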
... Of course, the negotiation of these rules can be complex-as norms develop, they also fluctuate as both membership in the community and tools for communication change [65]. Therefore, understanding how norms function within a community is critical in the context of online spaces, to avoid designs that poorly align with social factors [4]. ...
Article
Social norms as a regulatory mechanism often carry more weight than formal law, particularly in contexts where legal rules are gray. In online creative communities that focus on remix, community members must navigate copyright complexities regarding how they are permitted to re-use existing content. This paper focuses on one such community, transformative fandom, where strong social norms regulate behavior beyond copyright law. We conducted interviews with fan creators about their "unwritten rules" surrounding copying and remix and identified highly consistent social norms that have been remarkably effective in policing this community. In examining how these norms have formed over time, and how they are enforced, we conclude that the effectiveness of norms in encouraging cooperative behavior is due in part to a strong sense of social identity within the community. Furthermore, our findings suggest the benefits of creating formal rules within a community that support existing norms, rather than imposing rules from external sources.
... This gap in the privacy literature comes from the way context is conceptualised and from how groups and contexts are distinguished. There is general agreement that context is key to privacy (e.g., Barkhuus, 2012; Marwick & boyd, 2011; Nissenbaum, 2004) but not about how context should be conceptualised. For instance, Nissenbaum (2015) defines context as a physical place, a technology system/platform, a business model, a sector or industry, or a social domain; Masur (2018) emphasises psychological situations rather than contexts; Davis and Jurgenson (2014) define context as encompassing the shared meaning associated with a space, the people in it, and the associated identity meanings; under the social identity approach, context refers to a situation composed of a set of intra- or intergroup cues that make salient one of an individual's (psychologically relevant) identities, meaning that a change in the cues constitutes a change in context, while other aspects of the place can remain the same (see Turner, Hogg, Oakes, Reicher, & Wetherell, 1987). ...
Article
Full-text available
Privacy is a psychological topic suffering from historical neglect—a neglect that is increasingly consequential in an era of social media connectedness, mass surveillance, and the permanence of our electronic footprint. Despite fundamental changes in the privacy landscape, social and personality psychology journals remain largely unrepresented in debates on the future of privacy. By contrast, in disciplines like computer science and media and communication studies, engaging directly with sociotechnical developments, interest in privacy has grown considerably. In our review of this interdisciplinary literature, we suggest four domains of interest to psychologists. These are as follows: sensitivity to individual differences in privacy disposition, a claim that privacy is fundamentally based in social interactions, a claim that privacy is inherently contextual, and a suggestion that privacy is as much about psychological groups as it is about individuals. Moreover, we propose a framework to enable progression to more integrative models of the psychology of privacy in the digital age and in particular suggest that a group and social relations–based approach to privacy is needed.
... [41] When information flows respect contextual norms, individual (and arguably, societal) information privacy is maintained. In 2012, Louise Barkhuus called broadly on Human-Computer Interaction and UBICOMP researchers to draw upon CI in their examinations of personal information sharing practices "for a more nuanced treatment of the notion of privacy within HCI." [11] Potentially because CI has less of a focus on interpersonal interactions and more on societal level norms and systemic applications, there has been less reliance on CI in the CSCW literature. One reason may be that information flows are typically designed at the system level and imposed on users, rather than created from the bottom up. ...
Article
Direct to consumer genetic testing (DTCGT) services where users can identify their inherited diseases and traits, find genetic relatives, and learn more about their ethnic heritage continue to grow in popularity. At the same time, one's DNA is one of the most identifiable, immutable forms of personal information, and sharing it carries risks to one's privacy. What motivates individuals to engage in DTCGT, and what are the implications for information privacy at both the individual and societal levels? This study uses qualitative interviews with ten customers of the DTCGT service 23andMe to explore why they engaged in DTCGT, the benefits they received, their expectations of privacy, and perceptions of risk. It also introduces the use of social exchange theory as a theoretical framework for examining the social dimensions of information privacy and personal disclosure. The findings demonstrate that the participants' assumptions of anonymity, as well as their belief that their contributions to online genetic databases aid the public good, were key motivating factors. The participants were generally unaware of the potential risks to their individual genetic privacy as well as the impact of large-scale genetic testing databases on networked, collective privacy. The findings demonstrate that the framework of social exchange theory aids in understanding how the form of the relationship affected the participants' decisions to disclose their personal information to 23andMe as well as their perceptions of risk in the DTCGT context.
... al. [16] ran a survey to predict human expectations of robot privacy. Both borrow methods from Human-Computer Interaction and Ubiquitous Computing, e.g., [17] [18], and form the inspiration for some of the variables explored in this paper. ...
Conference Paper
Full-text available
This paper explores people's attitudes about a service robot using customer data in conversation. In particular, how can robots understand privacy expectations in social grey areas like cafes, which are both open to the public and used for private meetings? To answer this question, we introduce the Theater Method, which allows a participant to experience a "violation" of their privacy rather than have their actual privacy be violated. Using Python to generate 288 scripts that fully explored our research variables, we ran a large-scale online study (N=4608). To validate our results and ask more in-depth questions, we also ran an in-person follow-up (N=20). The experiments explored social and data-inspired variables such as data source, the positive or negative use of that data, and whom the robot verbally addressed, all of which significantly predicted participants' social attitudes towards the robot's politeness, consideration, appropriateness, and respect of privacy. Body language analysis and cafe-related conversation were the lowest risk, but even more extreme data channels are potentially okay when used for positive purposes.
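The abstract states that Python was used to generate the 288 scenario scripts from a full crossing of the research variables. A minimal sketch of that generation step follows; the variable names, levels, and the 4 x 4 x 3 x 3 x 2 factorization are assumptions chosen only so the combinatorics come out to 288, not the study's actual design.

```python
from itertools import product

# Hypothetical research variables (names and levels are illustrative).
DATA_SOURCES = ["order history", "overheard conversation",
                "body language", "loyalty profile"]
DATA_USES = ["recommend a drink", "offer a discount",
             "mention a meeting topic", "suggest a quieter table"]
USE_VALENCE = ["positive", "negative", "neutral"]
ADDRESSEES = ["the customer", "a companion", "a staff member"]
SETTINGS = ["open cafe area", "private meeting corner"]

TEMPLATE = ("In the {setting}, the robot draws on the customer's "
            "{source} to {use} (a {valence} use), addressing {addressee}.")

def generate_scripts():
    """Yield one scenario script per combination of variable levels."""
    for source, use, valence, addressee, setting in product(
            DATA_SOURCES, DATA_USES, USE_VALENCE, ADDRESSEES, SETTINGS):
        yield TEMPLATE.format(setting=setting, source=source, use=use,
                              valence=valence, addressee=addressee)

scripts = list(generate_scripts())
print(len(scripts))  # 4 * 4 * 3 * 3 * 2 = 288 scripts
```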
... The concerns about privacy are related to who has access to one's personal data, who can use it, to do what, and how it is shared. However, the notion of 'privacy' in ICT is broadly used for any kind of issue related to the recording, use and sharing of personal information (Barkhuus, 2012; Joinson et al., 2010), while the concerns at stake often relate to different kinds of impact of the use of personal information, such as changes in users' behaviors due to the exposure of their own information to the public, and changes in the relationship with the self and with other people due to the representation of identity through data. In this setting, the need to consider privacy (and other issues related to the use of personal information) 'by design' is growing among the research and design community, so as to both create knowledge about the different impacts on people, society, politics, and commercial fairness (Schneier, 2016) and avoid problems and misperception (Yu & Cysneiros, 2002; Joinson et al., 2010). ...
Thesis
Full-text available
In a context where the development of data-based technologies and systems opens new domains for the design of interactive and responsive solutions, the management of personal information has become a significant issue. It is a critical concern both for users and for the systems designed to handle it. The research investigates the use of personal information as signifiers that contribute to the creation of meaning. The initial purpose is to identify the issues that the use of personal information raises, in the form of impacts on individuals and society. The investigation of these impacts then leads to the creation of tools that introduce critical thinking into the design process, helping designers create responsible and robust solutions. The research approach, defined following the DRM framework, starts with the identification of background knowledge through review-based activities, which represent the starting point for addressing the first specific question (How is it possible to create updated knowledge about the impacts of the use of personal information in digital and interactive solutions, considering both the societal point of view and the rapid evolution of technologies?). This question was answered in the second stage of the research, Descriptive Study I, through the definition of a protocol for gathering data from online sources. The second specific question (How can this knowledge be used in the design process to elicit critical thinking in designers?) was addressed in the third and fourth phases of the research (the Prescriptive Study and Descriptive Study II) through the initial formulation and subsequent refinement of the Impact Anticipation Method. The third and fourth phases also addressed the third specific question (What are the results of using this knowledge in the different phases of the design process?), providing results for each application of the tools in design processes. The fourth and last specific question (What is the role of design and designers in the discussion about the consequences of the use of personal information?) was addressed from the second phase of the research onwards, through the review-based knowledge acquired during the comprehensive studies and the final results of applying the method in design processes. The dissertation reports the path, activities, and results of the PhD research: PART I provides the initial context and background in which the research is set and clarifies the general and specific objectives. PART II systematizes the theoretical background and the findings from case-study analyses, framing the state of the art of the use of personal information in digital and interactive solutions and opening the discussion on the possible impacts that the use of such information could have on individuals and society. From the formulation of the hypothesis to the validation of the tools through application to use cases, PART III reports the development of a method for anticipating the impacts of personal information use during the design process.
PART IV then illustrates the final version of the tools of the Impact Anticipation Method, as well as the discussion of: i) the results obtained by its application in design processes, in terms of raising designers' awareness and changing design choices and output; ii) perspectives on the active role of design in the discussion of the consequences of the use of personal information; iii) future developments of the method and its tools.
... In a broader spectrum, there is a plethora of studies that provide insight into the perceptions and acceptance of location tracking, location sharing, and privacy [3]. Interestingly, Liu et al. [18] point out in their recent literature review that people's views on location privacy have changed over time: early studies (before 2010) show little concern for location privacy, but more recent studies show otherwise. This change in itself serves as further motivation for this study. ...
Conference Paper
Full-text available
Tracking the location of people and their mobile devices creates opportunities for new and exciting ways of interacting with public technology. For instance, users can transfer content from public displays to their mobile device without touching it, because location tracking allows automatic recognition of the target device. However, many uncertainties remain regarding how users feel about interactive displays that track them and their mobile devices, and whether their experiences vary based on the setting. To close this research gap, we conducted a 24-participant user study. Our results suggest that users are largely willing - even excited - to adopt novel location-tracking systems. However, users expect control over when and where they are tracked, and want the system to be transparent about its ownership and data collection. Moreover, the deployment setting plays a much bigger role on people's willingness to use interactive displays when location tracking is involved.
... Users' sharing behaviors contradict their previously established attitudes and concerns. Because privacy attitudes and behaviors related to HCI should not be regarded as a universal construct, Barkhuus (2012) advocated a more precisely defined, contextually grounded notion of privacy, based on contextual integrity, to help reconcile the privacy paradox. ...
Article
In the past few years, there has been exponential growth in the volume of "always listening" intelligent virtual assistant devices used in the home. The adoption of intelligent virtual assistants is also moving rapidly into the applications and devices utilized by businesses. Recently, organizations such as MyFirmsApp and SAP have partnered with Amazon to embed Amazon's Lex Artificial Intelligence technology in their applications for use in accounting workplaces. Although there are highly publicized drawbacks related to the home use of this technology, the benefits of this more convenient, spoken interface are many. Therefore, it is important and timely to review the relevant literature, such as Altman's (1975) regulation of interpersonal barriers, Nissenbaum's (2010) notions of contextual integrity, the privacy paradox (Norberg et al. 2007), and Eyal's (2014) work on habit-forming products, to help understand the concerns related to the adoption of these efficiency- and productivity-boosting assistants. This paper will discuss the advantages and disadvantages of these interfaces, imagine how accountants might use these devices in the workplace currently and in the near future, examine the challenges of adopting digital assistants, and provide recommendations for future research in this area.
... In contrast, Eva felt that the home stream exposed too much information, preventing her from surprising her partner by arriving home early or meeting him at a restaurant. These individual differences expose deeply personal preferences about how revealing or discreet each stream should be, which can be context dependent [6]. ...
Conference Paper
Full-text available
Couples exhibit special communication practices, but apps rarely offer couple-specific functionality. Research shows that sharing streams of contextual information (e.g. location, motion) helps couples coordinate and feel more connected. Most studies explored a single, ephemeral stream; we study how couples' communication changes when sharing multiple, persistent streams. We designed Lifelines, a mobile-app technology probe that visualizes up to six streams on a shared timeline: closeness to home, battery level, steps, media playing, texts and calls. A month-long study with nine couples showed that partners interpreted information mostly from individual streams, but also combined them for more nuanced interpretations. Persistent streams allowed missing data to become meaningful and provided new ways of understanding each other. Unexpected patterns from any stream can trigger calls and texts, whereas seeing expected data can replace direct communication, which may improve or disrupt established communication practices. We conclude with design implications for mediating awareness within couples.
... No wonder Lahlou, Langheinrich, and Röcker (2005) found that engineers were very reluctant to embrace privacy: Privacy 'was either an abstract problem [to them], not a problem yet (they are 'only prototypes'), not a problem at all (firewalls and cryptography would take care of it), not their problem (but one for politicians, lawmakers or, more vaguely, society) or simply not part of the project deliverables' (Lahlou et al., 2005, p. 60). When the term "privacy" is so often misunderstood and misused in human-computer interaction (HCI) (Barkhuus, 2012), there is a need to converge on a subset of core privacy theories and frameworks to guide privacy research and design (Badillo-Urquiola et al., 2018), taking into account the new requirements of data-driven society (Belanger & Crossler, 2011). Figure 1 gives an overview of how privacy theories have developed from mainly focusing on the individual handling 'small data' to dealing with data sharing in group and societal settings, where new technologies using big data set the scene. ...
Article
Full-text available
There is a gap between people's online sharing of personal data and their concerns about privacy. Until now, this gap has been addressed by attempting to match individual privacy preferences with service providers' options for data handling. This approach has ignored the role different contexts play in data sharing. This paper aims at giving privacy engineering a new direction, putting context centre stage and exploiting the affordances of machine learning in handling contexts and negotiating data sharing policies. This research is explorative and conceptual, representing the first development cycle of a design science research project in privacy engineering. The paper offers a concise understanding of data privacy as a foundation for design, extending the seminal contextual integrity theory of Helen Nissenbaum. This theory started out as a normative theory describing the moral appropriateness of data transfers. In our work, the contextual integrity model is extended to a socio-technical theory that could have practical impact in the era of artificial intelligence. New conceptual constructs such as 'context trigger', 'data sharing policy' and 'data sharing smart contract' are defined, and their application is discussed at an organisational and technical level. The constructs and design are validated through expert interviews; contributions to design science research are discussed, and the paper concludes by presenting a framework for further privacy engineering development cycles.
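To make the paper's constructs more concrete, the sketch below shows one possible in-memory representation of a 'data sharing policy' whose 'context trigger' is a predicate over the current situation. This is purely an illustration under assumed semantics, not the paper's formal design or its smart-contract realization.

```python
from dataclasses import dataclass
from typing import Callable, Dict, Set

@dataclass
class DataSharingPolicy:
    """A policy stating which data may flow to whom in a given context."""
    context: str                        # e.g., "healthcare consultation"
    data_types: Set[str]                # attributes covered by the policy
    recipients: Set[str]                # parties allowed to receive them
    trigger: Callable[[Dict], bool]     # 'context trigger' predicate

def flow_allowed(policy, situation, data_type, recipient):
    """Check a proposed flow once the policy's context trigger fires."""
    return (policy.trigger(situation)
            and data_type in policy.data_types
            and recipient in policy.recipients)

# Toy policy: heart-rate data may flow to a physician only while a
# consultation is recognized as the active context.
policy = DataSharingPolicy(
    context="healthcare consultation",
    data_types={"heart_rate"},
    recipients={"physician"},
    trigger=lambda s: s.get("activity") == "consultation",
)
print(flow_allowed(policy, {"activity": "consultation"},
                   "heart_rate", "physician"))   # True
print(flow_allowed(policy, {"activity": "shopping"},
                   "heart_rate", "physician"))   # False
```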
... Researchers have applied contextual integrity to various privacy-sensitive contexts, such as search engines (Zimmer, 2008), social network sites (Shi, Xu, & Chen, 2013), location-based technologies (Barkhuus, 2012), electronic medical records (Chen & Xu, 2014), student learning analytics (Rubel & Jones, 2016), smart home devices (Apthorpe, Shvartzshnaider, Mathur, Reisman, & Feamster, 2018), and big data research ethics (Zimmer, 2018), among others. These studies have identified more nuanced explanations for perceived "inconsistencies" or "paradoxes" in privacy behaviors, suggesting that breaches in contextual integrity can help explain why users would be concerned with uses of information that go beyond the original purpose or context in which they were initially disclosed. ...
Article
Full-text available
In this position paper, we synthesize various knowledge gaps in information privacy scholarship and propose a research agenda that promotes greater cross-disciplinary collaboration within the iSchool community and beyond. We start by critically examining Westin's conceptualization of information privacy and argue for a contextual approach that holds promise for overcoming some of Westin's weaknesses. We then highlight three contextual considerations for studying privacy (digital networks, marginalized populations, and the global context) and close by discussing how these considerations advance privacy theorization and technology design.
... Nissenbaum's theory of Contextual Integrity (CI) frames privacy as "the right to appropriate flow of personal information" [59] where what is "appropriate" is based on particular contexts and relationships. CI has been found to be a useful practical framework to interpret people's perceptions of privacy [10,43]. ...
Preprint
Children under 11 are often regarded as too young to comprehend the implications of online privacy. Perhaps as a result, little research has focused on younger kids' risk recognition and coping. Such knowledge is, however, critical for designing efficient safeguarding mechanisms for this age group. Through 12 focus group studies with 29 children aged 6-10 from UK schools, we examined how children described privacy risks related to their use of tablet computers and what information was used by them to identify threats. We found that children could identify and articulate certain privacy risks well, such as information oversharing or revealing real identities online; however, they had less awareness with respect to other risks, such as online tracking or game promotions. Our findings offer promising directions for supporting children's awareness of cyber risks and the ability to protect themselves online.
... However, a fundamental step towards establishing an actionable privacy framework that would shape system design is investigating how users define privacy. Privacy attitudes and perceptions can be influenced by many factors including culture [35], social norms [9,51], and contextual factors [32,33,50]. Equally important, individuals often make decisions based on the expectation of loss of privacy and the potential gain of disclosure; a user's final privacy behavior is usually based on the expected outcome of the tradeoff [22]. ...
Conference Paper
Full-text available
The African continent is making considerable strides to develop and implement technology-driven health innovations. Policymakers are increasingly acknowledging the rising concerns for online personal privacy and data protection, as advances in eHealth result in increased levels of data collection and surveillance. In this paper, we propose a research agenda to investigate the effect of cultural, constitutional, and societal factors on privacy concerns and preferences among the different African countries in the context of healthcare technologies. In addition to helping us understand policy and design implications for members of this region, this research will broaden our understanding of cultural factors influencing privacy worldwide.
... In 2012, Barkhuus highlighted the shortcomings of the existing privacy frameworks on self-gathered empirical data [5]. She argued for the use of CI for privacy-related user studies as a way to understand the "contextually grounded reasons for people's privacy concern or lack thereof." ...
Preprint
The proliferation of Internet of Things (IoT) devices for consumer "smart" homes raises concerns about user privacy. We present a survey method based on the Contextual Integrity (CI) privacy framework that can quickly and efficiently discover privacy norms at scale. We apply the method to discover privacy norms in the smart home context, surveying 1,731 American adults on Amazon Mechanical Turk. For $2,800 and in less than six hours, we measured the acceptability of 3,840 information flows representing a combinatorial space of smart home devices sending consumer information to first and third-party recipients under various conditions. Our results provide actionable recommendations for IoT device manufacturers, including design best practices and instructions for adopting our method for further research.
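The CI survey method builds questions combinatorially from the framework's flow parameters (device/sender, information attribute, recipient, transmission principle) and asks respondents to rate each flow's acceptability. A toy sketch of that enumeration is below; the parameter lists and question wording are assumptions for illustration, far smaller than the 3,840-flow space the study actually measured.

```python
from itertools import product

# Illustrative CI flow parameters (assumed; the published instrument
# uses longer, differently worded lists).
DEVICES = ["a sleep monitor", "a smart thermostat", "a security camera"]
ATTRIBUTES = ["its owner's sleep habits", "the home's occupancy",
              "audio of its owner"]
RECIPIENTS = ["its manufacturer", "an Internet service provider",
              "government intelligence agencies"]
PRINCIPLES = ["if the owner has given consent",
              "if the information is anonymized",
              "if the owner is notified"]

def survey_items():
    """Enumerate one acceptability item per candidate information flow."""
    for device, attr, recipient, principle in product(
            DEVICES, ATTRIBUTES, RECIPIENTS, PRINCIPLES):
        yield (f"How acceptable is it for {device} to send {attr} "
               f"to {recipient} {principle}?")

items = list(survey_items())
print(len(items))  # 3 * 3 * 3 * 3 = 81 flows in this toy instrument
```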
... Because privacy is a highly normative construct [29], individual differences have been shown to play a key role in shaping attitudes related to these various privacy concerns (e.g., interactional preferences on social media [21]) and to influence subsequent on- or offline behaviors [17]. An individual's digital privacy behavior and preferences are influenced by personal factors [3] such as time available [6], recipient [4], age [14,16], gender [10,14,16], personality [33], network composition [30,31], social norms [2,19], culture [13], and previous experiences [14,16,32]. Research suggests that privacy preferences vary drastically from individual to individual, change over time, and are based on context [18]. ...
Conference Paper
Full-text available
As our lives become increasingly digitized, how people maintain and manage their networked privacy has become a formidable challenge for academics, practitioners, and policy-makers. A shift toward people-centered privacy initiatives has shown promise; yet many applications still adopt a "one-size fits all" approach, which fails to consider how individual differences in concerns, preferences, and behaviors shape how different people interact with and use technology. The main goal of this workshop is to highlight individual differences (e.g., age, culture, personal preference) that influence users' experiences and privacy-related outcomes. We will work towards best practices for research, design, and online privacy regulation policies that consider these differences.
... The notion of privacy as autonomy over what information is communicated, by whom, and when (canonically articulated by Alan Westin [64]) has exerted a strong and lasting influence over the design of privacy tools and interfaces which seek to set and enforce end-user preferences and rules. Alternative conceptions of privacy as a dynamic, dialectical process of boundary negotiation [4,45,44] have also inspired contextually-aware design patterns for HCI [33] and cautioned against treating privacy preferences as persistent and universal [9,41]. ...
Conference Paper
Most smartphone apps collect and share information with various first and third parties; yet, such data collection practices remain largely unbeknownst to, and outside the control of, end-users. In this paper, we seek to understand the potential for tools to help people refine their exposure to third parties, resulting from their app usage. We designed an interactive, focus-plus-context display called X-Ray Refine (Refine) that uses models of over 1 million Android apps to visualise a person's exposure profile based on their durations of app use. To support exploration of mitigation strategies, Refine can simulate actions such as app usage reduction, removal, and substitution. A lab study of Refine found participants achieved a high-level understanding of their exposure, and identified data collection behaviours that violated both their expectations and privacy preferences. Participants also devised bespoke strategies to achieve privacy goals, identifying the key barriers to achieving them.
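Refine's core what-if arithmetic (an exposure profile computed from durations of app use, plus simulated removal or substitution) can be pictured with a small sketch. The linear duration-weighted model, the app names, and the tracker domains below are all assumptions of this illustration, not the paper's actual model of over a million apps.

```python
# Hypothetical app models: the third parties each app contacts.
APP_TRACKERS = {
    "weatherapp": {"ads-net.example", "analytics.example"},
    "chatapp": {"analytics.example"},
    "mapsapp": {"location-broker.example"},
}

def exposure_profile(usage_minutes):
    """Sum per-third-party exposure, weighting each app by time used."""
    profile = {}
    for app, minutes in usage_minutes.items():
        for party in APP_TRACKERS.get(app, set()):
            profile[party] = profile.get(party, 0) + minutes
    return profile

def simulate_substitution(usage_minutes, old_app, new_app):
    """What-if: replace one app with another and recompute exposure."""
    usage = dict(usage_minutes)
    usage[new_app] = usage.get(new_app, 0) + usage.pop(old_app, 0)
    return exposure_profile(usage)

usage = {"weatherapp": 30, "chatapp": 120}
print(exposure_profile(usage))
print(simulate_substitution(usage, "weatherapp", "mapsapp"))
```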
... User privacy, security, and safety concerns are critically important in experimental design. In web-based studies, leaking participant personal information is a real possibility, since researchers often do not have control of how a participant may interact with a site [1], what data the site collects [2], or the potential network of other sites and services which may have access to user data on a live website [4]. For studies in the cybersecurity research domain, lack of control of webpage content is a large safety concern [7,12], since content of interest may include malicious links, unexpected popups, malware, and other generally maligned web content. ...
Article
Full-text available
Human-computer interaction and computer-mediated behavioral psychology research studies often rely on capturing user interaction data to characterize online behaviors. IRB considerations, site policies, and/or security and privacy concerns may force researchers to use screenshots or offline copies of pages of interest, instead of live websites, in their study designs. These interaction modalities reduce the fidelity and contextual realism of web content and often affect interface aesthetic quality – due to broken links, missing images, and/or malfunctioning scripts. StudySandboxx is a tool that allows websites to be saved exactly as they appear online. The tool sandboxes websites in a way that removes dangerous scripts that threaten privacy and security. Saved websites are encapsulated into a single portable file that contains all related website resources. Finally, the tool also supports certain types of permutations commonly used in research – such as changing links in a page. The project is housed within a GitHub repository at https://github.com/gewethor/study-sandbox.
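The sanitization pass described (removing dangerous scripts, with support for permutations such as changing links) might look roughly like the sketch below, which uses BeautifulSoup. StudySandboxx's actual implementation lives in the linked GitHub repository and likely differs; the function name and rewrite scheme here are illustrative.

```python
from bs4 import BeautifulSoup

def sandbox_page(html, link_rewrites=None):
    """Strip active content from a saved page; optionally rewrite links."""
    soup = BeautifulSoup(html, "html.parser")
    # Drop <script> elements and inline event handlers that could run
    # code or leak data when the saved page is opened.
    for tag in soup.find_all("script"):
        tag.decompose()
    for tag in soup.find_all(True):
        for attr in [a for a in tag.attrs if a.lower().startswith("on")]:
            del tag[attr]
    # Rewrite selected links, mimicking the permutation feature above.
    if link_rewrites:
        for a in soup.find_all("a", href=True):
            if a["href"] in link_rewrites:
                a["href"] = link_rewrites[a["href"]]
    return str(soup)

page = '<a href="http://example.test">x</a><script>alert(1)</script>'
print(sandbox_page(page, {"http://example.test": "saved/example.html"}))
```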
... A possible explanation is that while the participants understand the sensitivity of data collected by smart health devices, they are more likely to perceive the entity collecting the data as trustworthy (e.g., when consulting a physician) and therefore be more concerned about repurposing of the data by other entities. As researchers stress the importance of context in determining privacy issues (see, e.g., [3,16]), our findings provide further confirmation of this approach, indicating the need to consider both cultural factors and the context of a specific system or data exchange in order to support end users with their security and privacy protection. ...
Chapter
Smart environments are becoming ubiquitous despite many potential security and privacy issues. But, do people understand what consequences could arise from using smart environments? To answer this research question, we conducted a survey with 575 participants from three different countries (Germany, Spain, Romania) considering smart home and health environments. Less than half of all participants mentioned at least one security and privacy issue, with significantly more German participants mentioning issues than the Spanish ones and the Spanish participants in turn mentioning significantly more security and privacy issues than the Romanian participants. Using open coding, we find that among the 275 participants mentioning security and privacy issues, 111 only expressed abstract concerns such as “security issues” and only 34 mentioned concrete harms such as “Burglaries (physical and privacy)”, caused by security and privacy violations. The remaining 130 participants who mentioned security and privacy issues named only threats (i.e. their responses were more concrete than just abstract concerns but they did not mention concrete harming scenarios).
... • Privacy risk: Location-aware social data are highly sensitive. With these data obtained online, criminals and stalkers could easily infer the scope of victims' daily activities and the places victims are likely to go in the future, and then spot and track victims at these locations [4,7]. ...
Article
Point-of-Interest (POI) recommendation is significant in location-based social networks to help users discover new locations of interest. Previous studies on such recommendation mainly adopted a centralized learning framework where check-in data were uploaded, trained and predicted centrally in the cloud. However, such a framework suffers from privacy risks caused by check-in data exposure and fails to meet real-time recommendation needs when the data volume is huge and communication is blocked in crowded places. In this paper, we propose PREFER, an edge-accelerated federated learning framework for POI recommendation. It decouples the recommendation into two parts. Firstly, to protect privacy, users train local recommendation models and share multi-dimensional user-independent parameters instead of check-in data. Secondly, to improve recommendation efficiency, we aggregate these distributed parameters on edge servers in proximity to users (such as base stations) instead of remote cloud servers. We implement the PREFER prototype and evaluate its performance using two real-world datasets and two POI recommendation models. Extensive experiments demonstrate that PREFER strengthens privacy protection and improves efficiency with little sacrifice to recommendation quality compared to centralized learning. It achieves the best quality and efficiency and is more compatible with increasingly sophisticated POI recommendation models compared to other state-of-the-art privacy-preserving baselines.
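PREFER's division of labor (clients train locally and upload only user-independent parameters, which nearby edge servers aggregate) follows the general federated-averaging pattern. Below is a minimal sketch of weighted aggregation at an edge server using NumPy; the plain FedAvg-style weighting and the function names are assumptions for illustration, not the paper's exact aggregation rule.

```python
import numpy as np

def edge_aggregate(client_params, client_weights):
    """FedAvg-style aggregation of shared parameters at an edge server.

    client_params: list of dicts, parameter name -> np.ndarray, holding
                   the user-independent parameters uploaded by clients.
    client_weights: per-client weights (e.g., local sample counts).
    """
    total = float(sum(client_weights))
    return {
        name: sum(w * params[name]
                  for params, w in zip(client_params, client_weights)) / total
        for name in client_params[0]
    }

# Toy example: two clients share a 2x3 POI-embedding block.
clients = [{"poi_embedding": np.ones((2, 3))},
           {"poi_embedding": 3 * np.ones((2, 3))}]
print(edge_aggregate(clients, [1, 1])["poi_embedding"])  # all entries 2.0
```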
... Some researchers have argued that people feel helpless and resigned in the face of data demands and accept the discomfort caused by inevitable privacy violations [25,91]. Others have pointed out that privacy in Human Computer Interaction (HCI) is mismeasured or misdefined, thus leading to solutions that fail to address the core problem [12,21]. Despite decades of research and debate across many different fields, a clear and coherent definition of privacy continues to be elusive [21,101]. ...
... Users may also grant consent because they feel there is no alternative. A more responsible approach would rely on the notion of contextual consent (Barkhuus, 2012). ...
Article
Full-text available
Data and data science offer tremendous potential to address some of our most intractable public problems (including the Covid-19 pandemic). At the same time, recent years have shown some of the risks of existing and emerging technologies. An updated framework is required to balance potential and risk, and to ensure that data is used responsibly. Data responsibility is not itself a new concept. However, amid a rapidly changing technology landscape, it has become increasingly clear that the concept may need updating in order to keep up with new trends such as big data, open data, the Internet of Things, artificial intelligence, and machine learning. This paper seeks to outline 10 approaches and innovations for data responsibility in the 21st century. The 10 emerging concepts we have identified are: (1) end-to-end data responsibility; (2) decision provenance; (3) professionalizing data stewardship; (4) from data science to question science; (5) contextual consent; (6) responsibility by design; (7) data asymmetries and data collaboratives; (8) personally identifiable inference; (9) group privacy; and (10) data assemblies. Each of these is described at greater length in the paper, and illustrated with examples from around the world. Put together, they add up to a framework or outline for policy makers, scholars, and activists who seek to harness the potential of data to solve complex social problems and advance the public good. Needless to say, the 10 approaches outlined here represent just a start. We envision this paper more as an exercise in agenda-setting than a comprehensive survey.
... This strand of research indicates that privacy has stronger moral and ethical dimensions embedded in cultures where religion plays an influential role in people's everyday life. As privacy perception and management are highly dynamic and contextual [18,80], religion offers an important orientation in understanding contextual norms and people's information management practices [3]. ...
... However, this type of location sharing provides information to the public, and the public can effortlessly track the geo-positioned location of that user, leading to privacy implications (Barkhuus 2012). To avert this type of privacy implication, several approaches have been explored; nevertheless, even for a sparse set of geo-positioned information, the linked identifiers can unlock the sensitive personal data of users (Tatiana et al. 2012). ...
Article
Full-text available
Crowdsourcing is a procedure for outsourcing data work to a wide range of individual workers rather than to a single entity or company. Crowdsourcing has created opportunities for addressing many challenging issues by utilizing human knowledge. In order to attain an optimal global assignment technique, it is necessary to gather information regarding the location of all the workers. Security issues arise during this information gathering that pose severe threats to the workers. To address these privacy concerns, this paper proposes a privacy-preserving model based on fuzzy logic with Black Widow and Spider Monkey Optimization (BW–SMO). Fuzzy clustering is used to cluster the query solutions. To optimize the query selection, we exploit the Black Widow optimization algorithm incorporated with the Spider Monkey optimization algorithm. The parameters of both algorithms are controlled by the fuzzy logic controller. Thus, the proposed fuzzy BW–SMO framework effectively optimizes selection and join queries with low cost and latency while securing data. The proposed model is compared with existing models to show the system's effectiveness.
... Privacy, broadly, is a well-researched topic within HCI literature [14,24,25,56,71,85]. While privacy has become increasingly understood as a function of context, limited work exists that meaningfully recognizes it as a nuanced product of varying cultural considerations [3,16]. Related is the work of scholars Abokhodair and Vieweg, who examined and theorized about social media and privacy practices from an Islamic perspective within the context of the GCC Region [1-3,119]. ...
Article
As job-seeking and recruiting processes transition into digital spaces, concerns about hiring discrimination in online spaces have developed. Historically, women of color, particularly those with marginalized religious identities, have more challenges in securing employment. We conducted 20 semi-structured interviews with Muslim-American women of color who had used online job platforms in the past two years to understand how they perceive digital hiring tools to be used in practice, how they navigate the US job market, and how hiring discrimination as a phenomenon is thought to relate to their intersecting social identities. Our findings allowed us to identify three major categories of asymmetries (i.e., the relationship between the computing algorithms' structures and their users' experiences): (1) process asymmetries, which is the lack of transparency in data collection processes of job applications; (2) information asymmetries, which refers to the asymmetry in data availability during online job-seeking; and (3) legacy asymmetries, which explains the cultural and historical factors impacting marginalized job applicants. We discuss design implications to support job seekers in identifying and securing positive employment outcomes.
Chapter
This chapter introduces relevant privacy frameworks from academic literature that can be useful to practitioners and researchers who want to better understand privacy and how to apply it in their own contexts. We retrace the history of how networked privacy research first began by focusing on privacy as information disclosure. Privacy frameworks have since evolved into conceptualizing privacy as a process of interpersonal boundary regulation, appropriate information flows, design-based frameworks, and, finally, user-centered privacy that accounts for individual differences. These frameworks can be used to identify privacy needs and violations, as well as inform design. This chapter provides actionable guidelines for how these different frameworks can be applied in research, design, and product development.
Chapter
With the popularity of social media, researchers and designers must consider a wide variety of privacy concerns while optimizing for meaningful social interactions and connection. While much of the privacy literature has focused on information disclosures, the interpersonal dynamics associated with being on social media make it important for us to look beyond informational privacy concerns to view privacy as a form of interpersonal boundary regulation. In other words, attaining the right level of privacy on social media is a process of negotiating how much, how little, or when we desire to interact with others, as well as the types of information we choose to share with them or allow them to share about us. We propose a framework for how researchers and practitioners can think about privacy as a form of interpersonal boundary regulation on social media by introducing five boundary types (i.e., relational, network, territorial, disclosure, and interactional) social media users manage. We conclude by providing tools for assessing privacy concerns in social media, as well as noting several challenges that must be overcome to help people to engage more fully and stay on social media.
Chapter
This chapter introduces the book Modern Socio-Technical Perspectives on Privacy. The book informs academic researchers and industry professionals about the socio-technical privacy challenges related to modern networked technologies. This chapter provides a working definition of privacy, describes the envisioned audiences of this book, and summarizes the key aspects covered in each chapter. The chapter concludes with an invitation to join our community of privacy researchers and practitioners at modern-privacy.org.
Article
This study explores privacy concerns perceived by people with respect to having their DNA tested by direct-to-consumer (DTC) genetic testing companies such as 23andMe and Ancestry.com. Data collected from 510 respondents indicate that those who have already obtained a DTC genetic test have significantly lower levels of privacy and security concerns than those who have not obtained a DTC genetic test. Qualitative data from respondents of both these groups show that the concerns are mostly similar. However, the factors perceived to alleviate privacy concerns are more varied and nuanced amongst those who have obtained a DTC genetic test. Our data suggest that privacy concerns or lack of concerns are based on complex and multiple considerations including data ownership, access control of data and regulatory authorities of social, political and legal systems. Respondents do not engage in a full cost/benefit analysis of having their DNA tested.
Conference Paper
This paper addresses privacy related to individual's personal information disclosure in the context of Social Network Sites (SNSs). We present the Privacy Design Model (PDM) - built upon Altman's theory of privacy and Semiotic Engineering theory - as a descriptive model that takes into account dimensions corresponding to aspects upon which designers should reflect at design time, regarding personal information disclosure within these systems. PDM considers as personal information disclosure not only pieces of information about the individual, but also the individual's speech and activities that express his/her identity within the system. Thus, we hope to provide designers with a better understanding of how privacy can be treated in SNS by structuring the design space of personal information disclosure through such dimensions. We also analyze the applicability of the model, through the deconstruction of a real SNS in terms of the privacy dimensions proposed in PDM and discuss the implications of our findings.
Article
Keeping digital health data secure is important, and patients' perceptions of its privacy are essential to the development of digital health records. In this paper we present people's perceptions of how data protection is communicated, in relation to their personal health data and access to it, focusing particularly on people with chronic or long-term illness. Based on their use of personally accessible health records, we inquired into their explicit perception of security and their sense of data privacy in relation to their health data. Our goal was to provide designers and developers with insights and guidelines on communicating data protection in health records in a way that is accessible to users. We analyzed participants' approaches to and experiences with their own health care records and describe their challenges in detail. A conceptual framework called "Privacy Awareness" was developed from the findings and reflects the perspectives of the users. This framework forms the basis of proposed design guidelines for digital health record systems, which aim to address, facilitate, and improve users' awareness of the protection of their online health data.
Article
Aged care monitoring devices (ACMDs) enable older adults to live independently at home. But to do so, ACMDs collect and share older adults' personal information with others, potentially raising privacy concerns. This paper presents a detailed account of the different privacy problems in ACMDs that concern older adults. We report findings from interviews and a focus group conducted with older adults who are ageing in place. Using Daniel Solove's privacy taxonomy to categorize privacy concerns, our analysis suggests that older adults are concerned about the potential for ACMDs to give rise to six problems: surveillance, secondary use of data, breach of confidentiality, disclosure, decisional interference, and disturbing others. Other findings indicate that participants are worried about their ability to control the collection and management of their personal details and are willing to accept privacy trade-offs only during emergencies. We provide recommendations for ACMD developers and outline future directions arising from these findings.
Article
Today, industry practitioners (e.g., data scientists, developers, product managers) rely on formal privacy reviews (a combination of user interviews, privacy risk assessments, etc.) in identifying potential customer acceptance issues with their organization's data practices. However, this process is slow and expensive, and practitioners often have to make ad-hoc privacy-related decisions with little actual feedback from users. We introduce Lean Privacy Review (LPR), a fast, cheap, and easy-to-access method to help practitioners collect direct feedback from users through the proxy of crowd workers in the early stages of design. LPR takes a proposed data practice, quickly breaks it down into smaller parts, generates a set of questionnaire surveys, solicits users' opinions, and summarizes those opinions in a compact form for practitioners to use. By doing so, LPR can help uncover the range and magnitude of different privacy concerns actual people have at a small fraction of the cost and wait-time for a formal review. We evaluated LPR using 12 real-world data practices with 240 crowd users and 24 data practitioners. Our results show that (1) the discovery of privacy concerns saturates as the number of evaluators exceeds 14 participants, which takes around 5.5 hours to complete (i.e., latency) and costs 3.7 hours of total crowd work ($80 in our experiments); and (2) LPR finds 89% of privacy concerns identified by data practitioners as well as 139% additional privacy concerns that practitioners are not aware of, at a 6% estimated false alarm rate.
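LPR's front end (decompose a data practice into parts, then generate questionnaire items for crowd evaluation) lends itself to a small illustration. The Python sketch below is a hypothetical rendering, not the authors' implementation; the facets and question templates are invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class DataPractice:
    """A proposed data practice, broken into facets an LPR-style review might probe."""
    actor: str        # who collects/uses the data
    data_type: str    # what is collected
    purpose: str      # why it is collected
    recipients: str   # with whom it is shared
    retention: str    # how long it is kept

def generate_survey(practice: DataPractice) -> list[str]:
    """Turn each facet of the practice into one crowd-facing question."""
    return [
        f"How comfortable are you with {practice.actor} collecting {practice.data_type}?",
        f"Is '{practice.purpose}' an acceptable reason to collect this data?",
        f"How concerned are you about this data being shared with {practice.recipients}?",
        f"Is keeping the data for {practice.retention} acceptable to you?",
    ]

# Example: a hypothetical practice decomposed into four survey items.
practice = DataPractice("a fitness app", "GPS traces", "route recommendations",
                        "advertising partners", "two years")
for question in generate_survey(practice):
    print(question)
```

Each generated item would then be answered by multiple crowd workers and the responses summarized back to the practitioner.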
Article
Automatic emotion recognition (ER)-enabled wellbeing interventions use ER algorithms to infer the emotions of a data subject (i.e., a person about whom data is collected or processed to enable ER) based on data generated from their online interactions, such as social media activity, and intervene accordingly. The potential commercial applications of this technology are widely acknowledged, particularly in the context of social media. Yet, little is known about data subjects' conceptualizations of and attitudes toward automatic ER-enabled wellbeing interventions. To address this gap, we interviewed 13 US adult social media data subjects regarding social media-based automatic ER-enabled wellbeing interventions. We found that participants' attitudes toward automatic ER-enabled wellbeing interventions were predominantly negative. Negative attitudes were largely shaped by how participants compared their conceptualizations of Artificial Intelligence (AI) to the humans that traditionally deliver wellbeing support. Comparisons between AI and human wellbeing interventions were based upon human attributes participants doubted AI could hold: 1) helpfulness and authentic care; 2) personal and professional expertise; 3) morality; and 4) benevolence through shared humanity. In some cases, participants' attitudes toward automatic ER-enabled wellbeing interventions shifted when participants conceptualized automatic ER-enabled wellbeing interventions' impact on others, rather than themselves. Though with reluctance, a minority of participants held more positive attitudes toward their conceptualizations of automatic ER-enabled wellbeing interventions, citing their potential to benefit others: 1) by supporting academic research; 2) by increasing access to wellbeing support; and 3) through egregious harm prevention. However, most participants anticipated harms associated with their conceptualizations of automatic ER-enabled wellbeing interventions for others, such as re-traumatization, the spread of inaccurate health information, inappropriate surveillance, and interventions informed by inaccurate predictions. Lastly, while participants had qualms about automatic ER-enabled wellbeing interventions, we identified three development and delivery qualities of automatic ER-enabled wellbeing interventions upon which their attitudes toward them depended: 1) accuracy; 2) contextual sensitivity; and 3) positive outcome. Our study is not motivated to make normative statements about whether or how automatic ER-enabled wellbeing interventions should exist, but to center voices of the data subjects affected by this technology. We argue for the inclusion of data subjects in the development of requirements for ethical and trustworthy ER applications. To that end, we discuss ethical, social, and policy implications of our findings, suggesting that automatic ER-enabled wellbeing interventions imagined by participants are incompatible with aims to promote trustworthy, socially aware, and responsible AI technologies in the current practical and regulatory landscape in the US.
Article
Full-text available
Smart Meters are a key component of increasing the power efficiency of the Smart Grid. To help manage the grid effectively, these meters are designed to collect information on power consumption and send it to third parties. With Smart Metering, for the first time, these cloud-connected sensing devices are legally mandated to be installed in the homes of millions of people worldwide. Via a multi-staged empirical study that utilized an open-ended questionnaire, focus groups, and a design probe, we examined how people characterize the tension between the utility of Smart Metering and its impact on privacy. Our findings show that people seek to make abstract Smart Metering data accountable by connecting it to their everyday practices. Our insight can inform the design of usable privacy configuration tools that help Smart Metering consumers relate abstract data with the real-world implications of its disclosure.
Article
Introduction The governance structures associated with health data are evolving in response to advances in digital technologies that enable new ways of capturing, using, and sharing different types of data. Increasingly, health data moves between different contexts, such as from healthcare to research, or to commerce and marketing. Crossing these contextual boundaries has the potential to violate societal expectations about the appropriate use of health data and diminish public trust. Understanding citizens' views on the acceptability of and preferences for data use in different contexts is essential for developing information governance policies in these new contexts. Methods Focus group design presenting data sharing scenarios in England, Iceland, and Sweden. Results Seventy-one participants were recruited. Participants supported the need for data to help understand the observable world, improve medical research, the quality of public services, and to benefit society. However, participants consistently identified the lack of information, transparency and control as barriers to trusting organisations to use data in a way that they considered appropriate. There was considerable support for fair and transparent data sharing practices where all parties benefitted. Conclusion Data governance policy should involve all stakeholders' perspectives on an ongoing basis, to inform and implement changes to health data sharing practices that accord with stakeholder views. The findings showed that 1) data should be used for ethical purposes even when there is commercial interest; 2) data subjects and/or the public institutions that provide and share data should also benefit from its sharing; 3) third parties' use of data requires greater transparency and accountability than currently exists; and 4) more information should be provided to empower data subjects.
Article
This research anticipates a future where "smart cities" rely extensively on data analytics to determine budget allocations, to manage traffic, to design infrastructure, and to advance sustainability efforts. In this study, Helen Nissenbaum's contextual integrity framework helps us understand how smart city residents consider privacy norms, and provides a structure for comparing these norms to current data privacy practices. The study findings and policy recommendations are based on focus group discussions with more than 80 residents of Long Beach, California, as well as 60 responses to an open-ended question asked in a smart city survey.
Article
In this paper we examine Anshimi, a mobile safety application that hosts women's safety services. We present a case study based on qualitative data gathered from semi-structured interviews and a participatory design workshop with women in Seoul, South Korea. We examine women's perceptions of various kinds of safety data that Anshimi provides and collects from its users, such as GPS data, security camera footage, messages, personal information, etc. Exploring how women negotiate the tension between feeling protected and surveilled, we account for the nuanced ways in which women understand, feel, and use safety data in their daily lives. We also present scenarios for engaging with women's safety data, in the hopes of developing a guiding framework for designing women's safety applications and safety data practices.
Conference Paper
Full-text available
With the widespread adoption of IoT devices, smart systems gain ever more control over their users' personal data and daily lives. This control, however, can easily be misused, either by system providers themselves acting in bad faith or by external attackers. Implementing proper measures for the security and privacy protection of smart systems therefore becomes critically important. In this paper we conduct a study to investigate whether end users believe that smart system providers are both capable of and motivated to implement such measures. For this purpose, we conduct an online survey of 98 participants from the UK, which we analyse using quantitative and qualitative methods. Our results show that users' trust in the proper security and privacy protection of smart systems is influenced by a multitude of factors, such as information about the concrete technologies and privacy policies of the systems, but also information about the company, such as its reputation or geographical location. We conclude that transparency by companies, regarding both the technologies behind the concrete system and the general practices of the company itself, is a crucial factor in ensuring customer confidence.
Chapter
Calls for ethics by and in design of new technologies are now commonplace in academic literature, private businesses such as Microsoft and Google, and the European Commission's Horizon 2020 and Horizon Europe research projects. This emphasis on ethics is necessary owing to the ways in which new technologies are embedded in our everyday practices, can radically affect these practices, and have the potential for transgressing or promoting important values. Despite this importance, there is a lack of clarity concerning how designers can translate ethical theories and ethical values into ethical action. In this paper, I canvass some of the most prominent ethical theories and explain their connection to action. Finding these wanting, I propose an ethic of responsibility as a first step in a more ethically sensitive approach to value-oriented design. This approach internalizes responsibility for ethical action into the actor, rather than seeking ethical characteristics in the external act or value. The reader should keep in mind that this is only a first step, given this paper's constraints of time and space. The next step, identifying concrete design suggestions, will follow in a subsequent article.
Article
Over the past two and a half years, COVID-19 has swept through the world, and new technologies for mitigating its spread, such as exposure notification and contact tracing applications, have been implemented in many countries. However, uptake has differed from country to country, and it has not been clear whether culture, death rates, or information dissemination have been factors in adoption rates. These apps also introduce issues of trust and privacy protection, which can create challenges for adoption and daily use. In this paper we present the results of a cross-country survey study of potential barriers to the adoption of COVID-19 contact tracing apps in particular. We found that people's existing privacy concerns are inversely correlated with adoption behavior, but that geographical location, as well as other demographics such as age and gender, has no significant effect on either adoption of the app or privacy concerns. Instead, a better understanding of what data the apps collect leads to higher adoption. We provide suggestions for how to approach the development and deployment of contact tracing apps and, more broadly, health tracking apps.
Conference Paper
Full-text available
Although privacy is broadly recognized as a dominant concern for the development of novel interactive technologies, our ability to reason analytically about privacy in real settings is limited. A lack of conceptual interpretive frameworks makes it difficult to unpack interrelated privacy issues in settings where information technology is also present. Building on theory developed by social psychologist Irwin Altman, we outline a model of privacy as a dynamic, dialectic process. We discuss three tensions that govern interpersonal privacy management in everyday life, and use these to explore select technology case studies drawn from the research literature. These suggest new ways for thinking about privacy in socio-technical environments as a practical matter.
Article
Full-text available
As everyday life is increasingly conducted online, and as the electronic world continues to move out into the physical, the privacy of information and action and the security of information systems are increasingly a focus of concern both for the research community and the public at large. Accordingly, privacy and security are active topics of investigation from a wide range of perspectives: institutional, legislative, technical, interactional, and more. In this article, we wish to contribute toward a broad understanding of privacy and security not simply as technical phenomena but as embedded in social and cultural contexts. Privacy and security are difficult concepts to manage from a technical perspective precisely because they are caught up in larger collective rhetorics and practices of risk, danger, secrecy, trust, morality, identity, and more. Reductive attempts to deal with these issues separately produce incoherent or brittle results. We argue for a move away from narrow views of privacy and security and toward a holistic view of situated and collective information practice.
Conference Paper
Full-text available
The rapid adoption of location tracking and mobile social networking technologies raises significant privacy challenges. Today our understanding of people's location sharing privacy preferences remains very limited, including how these preferences are impacted by the type of location tracking device or the nature of the locations visited. To address this gap, we deployed Locaccino, a mobile location sharing system, in a four-week-long field study, where we examined the behavior of study participants (n=28) who shared their location with their acquaintances (n=373). Our results show that users appear more comfortable sharing their presence at locations visited by a large and diverse set of people. Our study also indicates that people who visit a wider number of places tend to also be the subject of a greater number of requests for their locations. Over time these same people tend to also evolve more sophisticated privacy preferences, reflected by an increase in time- and location-based restrictions. We conclude by discussing the implications our findings.
Conference Paper
Full-text available
We report on a two-week deployment of a peer-to-peer, mobile, location-enhanced messaging service. This study is specifically aimed at investigating the need for and effectiveness of automatic location disclosure mechanisms, the emerging strategies to achieve plausible deniability, and at understanding how place and activity are used to communicate plans, intentions and provide awareness. We outline the research that motivated this study, briefly describe the application we designed, and provide details of the evaluation process. The results show a lack of value of automatic messaging functions, confirm the need for supporting plausible deniability in communications, and highlight the prominent use of activity instead of place to indicate one’s location. Finally, we offer suggestions for the development of social mobile applications.
Conference Paper
Full-text available
Location-based ubiquitous computing systems are entering mainstream society and becoming familiar parts of everyday life. However, the settings in which they are deployed are already suffused with complex social dynamics. We report on a study of parole officers and parolees whose relationships are being transformed by location-based technologies. While parolees are clearly subjects of state discipline, the parole officers also find themselves subject to new responsibilities. This study highlights the complexities of power in sociotechnical systems and what happens when location becomes a tradable, technological object.
Conference Paper
Full-text available
Mobile privacy concerns are central to Ubicomp and yet remain poorly understood. We advocate a diversified approach, enabling the cross-interpretation of data from complementary methods. However, mobility imposes a number of limitations on the methods that can be effectively employed. We discuss how we addressed this problem in an empirical study of mobile social networking. We report on how, by combining a variation of experience sampling and contextual interviews, we have started focusing on a notion of context in relation to privacy, which is subjectively defined by emerging socio-cultural knowledge, functions, relations and rules. With reference to Gieryn's sociological work, we call this place, as opposed to a notion of context that is objectively defined by physical and factual elements, which we call space. We propose that the former better describes the context for mobile privacy.
Conference Paper
Full-text available
This paper demonstrates how useful content can be generated as a by-product of an enjoyable mobile multiplayer game. In EyeSpy, players tag geographic locations with photos or text. By locating the places in which other players' tags were created and 'confirming' them, players earn points for themselves and verify the tags' locations. As a side effect of game-play, EyeSpy produces a collection of recognisable and findable geographic details, in the form of photographs and text tags, that can be repurposed to support navigation tasks. Two user trials of the game successfully produced an archive of geo-located photographs and tags, and in a follow-up experiment we compared performance in a navigation task using photographs from the game with geo-referenced photos collected from the Flickr website. Our experiences with EyeSpy support reflection upon the design challenges presented by 'human computation' and the production of usable by-products through mobile game-play.
Conference Paper
Full-text available
We introduce a location-based game called Feeding Yoshi that provides an example of seamful design, in which key characteristics of its underlying technologies (the coverage and security characteristics of WiFi) are exposed as a core element of gameplay. Feeding Yoshi is also a long-term, wide-area game, being played over a week between three different cities during an initial user study. The study, drawing on participant diaries and interviews, supported by observation and analysis of system logs, reveals players' reactions to the game. We see the different ways in which they embedded play into the patterns of their daily lives, augmenting existing practices and creating new ones, and observe the impact of varying location on both the ease and feel of play. We identify potential design extensions to Feeding Yoshi and conclude that seamful design provides a route to creating engaging experiences that are well adapted to their underlying technologies.
Conference Paper
Full-text available
How do mobility and presence feature as aspects of social life? Using a case study of paroled offenders tracked via Global Positioning System (GPS), we explore the ways that location-based technologies frame people's everyday experiences of space. In particular, we focus on how access and presence are negotiated outside of traditional conceptions of "privacy." We introduce the notion of accountabilities of presence and suggest that it is a more useful concept than "privacy" for understanding the relationship between presence and sociality.
Conference Paper
Full-text available
Increasingly, users access online services such as email, e-commerce, and social networking sites via 802.11-based wireless networks. As they do so, they expose a range of personal information such as their names, email addresses, and ZIP codes to anyone within broadcast range of the network. This paper presents results from an exploratory study that examined how users from the general public understand Wi-Fi, what their concerns are related to Wi-Fi use, and which practices they follow to counter perceived threats. Our results reveal that while users understand the practical details of Wi-Fi use reasonably well, they lack understanding of important privacy risks. In addition, users employ incomplete protective practices, which results in a false sense of security and lack of concern while on Wi-Fi. Based on our results, we outline opportunities for technology to help address these problems.
Conference Paper
Full-text available
Feedback is viewed as an essential element of ubiquitous computing systems in the HCI literature for helping people manage their privacy. However, the success of online social networks and existing commercial systems for mobile location sharing which do not incorporate feedback would seem to call the importance of feedback into question. We investigated this issue in the context of a mobile location sharing system. Specifically, we report on the findings of a field deployment of Locyoution, a mobile location sharing system. In our study of 56 users, one group was given feedback in the form of a history of location requests, and a second group was given no feedback at all. Our major contribution has been to show that feedback is an important contributing factor towards improving user comfort levels and allaying privacy concerns. Participants' privacy concerns were reduced after using the mobile location sharing system. Additionally, our study suggests that peer opinion and technical savviness contribute most to whether or not participants thought they would continue to use a mobile location technology.
Conference Paper
Full-text available
Little research exists on one of the most common, oldest, and most utilized forms of online social geographic information: the 'location' field found in most virtual community user profiles. We performed the first in-depth study of user behavior with regard to the location field in Twitter user profiles. We found that 34% of users did not provide real location information, frequently incorporating fake locations or sarcastic comments that can fool traditional geographic information tools. When users did input their location, they almost never specified it at a scale any more detailed than their city. In order to determine whether or not natural user behaviors have a real effect on the 'locatability' of users, we performed a simple machine learning experiment to determine whether we can identify a user's location by only looking at what that user tweets. We found that a user's country and state can in fact be determined easily with decent accuracy, indicating that users implicitly reveal location information, with or without realizing it. Implications for location-based services and privacy are discussed.
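The "simple machine learning experiment" described above corresponds to a standard text-classification setup: learn a mapping from tweet words to a coarse location label. The sketch below illustrates that setup under invented toy data; it is not the authors' data or model:

```python
# A toy text classifier in the spirit of the study: predict a user's
# region from the words they tweet. Tweets and labels are invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

tweets = [
    "stuck on the 405 again, classic LA traffic",
    "beach day in santa monica",
    "bagels and the subway, typical morning commute",
    "snow delays on the L train again",
]
regions = ["CA", "CA", "NY", "NY"]  # coarse labels, e.g., US state

model = make_pipeline(TfidfVectorizer(), MultinomialNB())
model.fit(tweets, regions)

# Users implicitly reveal location through word choice alone:
print(model.predict(["waiting for the subway in the snow"]))  # likely ['NY']
```

With enough tweets per user, aggregating such word-level evidence is what makes country- and state-level inference feasible even when the profile's location field is blank or fake.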
Conference Paper
Full-text available
This paper investigates emergent practices around 'microblogging', changing and sharing status within a social group. We present results from a trial of 'Connecto', a phone-based status and location sharing application that allows a group to 'tag' areas and have individuals' locations shared automatically on a mobile phone. In use, the system moved beyond being an awareness tool to a way of continuing the ongoing 'story' of conversations within the group. Through sharing status and location, the system supported each group's ongoing repartee: a site for social exchange, enjoyment and friendship.
Conference Paper
Full-text available
We conducted a questionnaire-based study of the relative importance of two factors, inquirer and situation, in determining the preferred accuracy of personal information disclosed through a ubiquitous computing system. We found that privacy preferences varied by inquirer more than by situation. That is, individuals were more likely to apply the same privacy preferences to the same inquirer in different situations than to apply the same privacy preferences to different inquirers in the same situation. We are applying these results to the design of a user interface for managing everyday privacy in ubiquitous computing.
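The finding that preferences are more consistent per inquirer than per situation can be made concrete with a small consistency count over a toy preference table. All preferences below are invented for illustration:

```python
# Hypothetical preferences: disclosure accuracy chosen per (inquirer, situation).
prefs = {
    ("spouse",    "at work"): "exact", ("spouse",    "commuting"): "exact",
    ("colleague", "at work"): "city",  ("colleague", "commuting"): "city",
    ("stranger",  "at work"): "none",  ("stranger",  "commuting"): "city",
}
inquirers  = {"spouse", "colleague", "stranger"}
situations = {"at work", "commuting"}

def consistent_groups(keys, index):
    """Count keys whose preference is identical across all pairs containing them."""
    hits = 0
    for k in keys:
        group = {pref for pair, pref in prefs.items() if pair[index] == k}
        hits += (len(group) == 1)
    return hits

# Same preference per inquirer across situations, vs. per situation across inquirers:
print(consistent_groups(inquirers, 0))   # 2 of 3 inquirers are internally consistent
print(consistent_groups(situations, 1))  # 0 of 2 situations are
```

In this toy table, as in the study, knowing who is asking predicts the preference far better than knowing the situation does.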
Conference Paper
Full-text available
We take as our premise that it is possible and desirable to design systems that support social processes. We describe Loops, a project which takes this approach to supporting computer-mediated communication (CMC) through structural and interactive properties such as persistence and a minimalist graphical representation of users and their activities that we call a social proxy. We discuss a prototype called Babble that has been used by our group for over a year, and has been deployed to six other groups at the Watson labs for about two months. We describe usage experiences, lessons learned, and next steps.
Conference Paper
Full-text available
Location-sharing services have a long history in research, but have only recently become available for consumers. Most popular commercial location-sharing services differ from previous research efforts in important ways: they use manual 'check-ins' to pair user location with semantically named venues rather than tracking; venues are visible to all users; location is shared with a potentially very large audience; and they employ incentives. Through analysis of 20 in-depth interviews with foursquare users and 47 survey responses, we gained insight into emerging social practices surrounding location-sharing. We see a shift from privacy issues and data deluge to more performative considerations in sharing one's location. We discuss performance aspects enabled by check-ins to public venues, and show emergent, but sometimes conflicting, norms about when (not) to check in.
Conference Paper
Full-text available
Context-aware computing often involves tracking people's location. Many studies and applications highlight the importance of keeping people's location information private. We discuss two types of location-based services: location-tracking services, which are based on other parties tracking the user's location, and position-aware services, which rely on the device's knowledge of its own location. We present an experimental case study that examines people's concern for location privacy and compare this to the use of location-based services. We find that even though the perceived usefulness of the two different types of services is the same, location-tracking services generate more concern for privacy than position-aware services. We conclude that development emphasis should be given to position-aware services but that location-tracking services have a potential for success if users are given a simple option for turning the location-tracking off.
Conference Paper
Full-text available
Long-distance couples face considerable communication challenges in their relationships. Unlike collocated couples, long-distance couples lack awareness cues associated with physical proximity and must use technologies such as SMS or telephony to stay in sync. We posit that long-distance couples have needs that are not met by prevailing communication technologies, which require explicit action from the sender as well as the receiver. We built CoupleVIBE to explore the properties of an implicit messaging channel and observe how couples would use such a technology. CoupleVIBE is a mobile application that automatically pushes a user's location-information to her partner's mobile phone via vibrotactile cues. We present qualitative results of a four-week user study, studying how seven couples used CoupleVIBE. A key result is that CoupleVIBE's implicit communication modality operated as a foundation that helps keep couples in sync, with other modalities being brought into play when further interaction was needed.
Article
Full-text available
This study examines the relationship between use of Facebook, a popular online social network site, and the formation and maintenance of social capital. In addition to assessing bonding and bridging social capital, we explore a dimension of social capital that assesses one's ability to stay connected with members of a previously inhabited community, which we call maintained social capital. Regression analyses conducted on results from a survey of undergraduate students (N=286) suggest a strong association between use of Facebook and the three types of social capital, with the strongest relationship being to bridging social capital. In addition, Facebook usage was found to interact with measures of psychological well-being, suggesting that it might provide greater benefits for users experiencing low self-esteem and low life satisfaction.
Article
Full-text available
Video media spaces are an excellent crucible for the study of privacy. Their design affords opportunities for misuse, prompts ethical questions, and engenders grave concerns from both users and nonusers. Despite considerable discussion of the privacy problems uncovered in prior work, questions remain as to how to design a privacy-preserving video media space and how to evaluate its effect on privacy. The problem is more deeply rooted than this, however. Privacy is an enormous concept from which a large vocabulary of terms emerges. Disambiguating the meanings of and relationships between these terms facilitates understanding of the link between privacy and design. In this article, we draw from resources in environmental psychology and computer-supported cooperative work (CSCW) to build a broadly and deeply rooted vocabulary for privacy. We relate the vocabulary back to the real and hard problem of designing privacy-preserving video media spaces. In doing so, we facilitate analysis of the privacy-design relationship.
Article
Full-text available
Ubiquitous computing is unusual amongst technological research arenas. Most areas of computer science research, such as programming language implementation, distributed operating system design, or denotational semantics, are defined largely by technical problems, and driven by building upon and elaborating a body of past results. Ubiquitous computing, by contrast, encompasses a wide range of disparate technological areas brought together by a focus upon a common vision. It is driven, then, not so much by the problems of the past but by the possibilities of the future. Ubiquitous computing’s vision, however, is over a decade old at this point, and we now inhabit the future imagined by its pioneers. The future, though, may not have worked out as the field collectively imagined. In this article, we explore the vision that has driven the ubiquitous computing research agenda and the contemporary practice that has emerged. Drawing on cross-cultural investigations of technology adoption, we argue for developing a “ubicomp of the present” which takes the messiness of everyday life as a central theme.
Article
Full-text available
A number of mobile applications have emerged that allow users to locate one another. However, people have expressed concerns about the privacy implications associated with this class of software, suggesting that broad adoption may only happen to the extent that these concerns are adequately addressed. In this article, we report on our work on PeopleFinder, an application that enables cell phone and laptop users to selectively share their locations with others (e.g. friends, family, and colleagues). The objective of our work has been to better understand people’s attitudes and behaviors towards privacy as they interact with such an application, and to explore technologies that empower users to more effectively and efficiently specify their privacy preferences (or “policies”). These technologies include user interfaces for specifying rules and auditing disclosures, as well as machine learning techniques to refine user policies based on their feedback. We present evaluations of these technologies in the context of one laboratory study and three field studies.
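PeopleFinder's combination of user-specified rules and feedback-driven refinement can be approximated in outline. The sketch below is hypothetical and greatly simplified, not PeopleFinder's code: rules are hourly windows per requester group, and a negative audit judgment trims the offending window:

```python
# Illustrative sketch (not PeopleFinder's implementation): location-sharing
# rules plus a crude refinement step driven by the user's audit feedback.
from dataclasses import dataclass

@dataclass
class Rule:
    group: str        # e.g., "friends", "colleagues"
    start_hour: int   # disclosure allowed from this hour...
    end_hour: int     # ...up to (but excluding) this hour

rules = [Rule("colleagues", 9, 17), Rule("friends", 8, 23)]

def allowed(group: str, hour: int) -> bool:
    return any(r.group == group and r.start_hour <= hour < r.end_hour for r in rules)

def refine(group: str, hour: int, user_approved: bool) -> None:
    """Shrink a rule's window when the user flags a disclosure as unwanted."""
    if user_approved:
        return
    for r in rules:
        if r.group == group and r.start_hour <= hour < r.end_hour:
            if hour - r.start_hour < r.end_hour - hour:
                r.start_hour = hour + 1   # trim from the start of the window
            else:
                r.end_hour = hour         # trim from the end of the window

print(allowed("colleagues", 16))   # True
refine("colleagues", 16, user_approved=False)
print(allowed("colleagues", 16))   # False: the window now ends at 16
```

The paper's actual refinement uses machine learning over audit feedback; this sketch only shows the shape of the loop, where disclosures generate feedback that tightens future policy.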
Article
Full-text available
Effective privacy management requires that mobile systems' users be able to make informed privacy decisions as their experience and knowledge of a system progresses. Prior work has shown that making such privacy decisions is a difficult task for users because systems do not provide support for awareness, visibility and accountability when sharing privacy-sensitive information. This paper reports results of our investigation into the efficacy of real-time feedback as a mechanism for incorporating these features of social translucence into location-sharing applications, in order to help users make better privacy decisions. We explored the role of real-time feedback in the context of Buddy Tracker, a mobile location-sharing application. Our work focuses on ways in which real-time feedback affects people's behaviour in order to identify the main criteria for acceptance of this technology. Based on data from a three-week field trial of Buddy Tracker, a focus group session, and interviews, we found that when using a system that provided real-time feedback, people were more accountable for their actions and reduced the number of unreasonable location requests. We have used the results of our study to propose high-level design criteria for incorporating real-time feedback into information sharing applications in a manner that ensures social acceptance of the technology.
Article
Full-text available
Location-aware technologies, such as sensor networks, enable everyday devices to become increasingly interconnected with one another and with the Internet. Some analysts predict that by 2010 half of all cell phone users in the US will be using location-based services. Location-sensing technology raises problems for place, such as the expectations users have for privacy in particular places or while engaged in specific activities. Examining the patterns of sharing location information across the three different categories indicates that the participants fit into three types of revealed location-privacy policies. The majority of participants in the study had a consistent policy for sharing across all conditions and situations. The findings suggest that when participants are alone at home or in the library, they are more interested in enabling social contact.
Article
The practices of public surveillance, which include the monitoring of individuals in public through a variety of media (e.g., video, data, online), are among the least understood and controversial challenges to privacy in an age of information technologies. The fragmentary nature of privacy policy in the United States reflects not only the oppositional pulls of diverse vested interests, but also the ambivalence of unsettled intuitions on mundane phenomena such as shopper cards, closed-circuit television, and biometrics. This Article, which extends earlier work on the problem of privacy in public, explains why some of the prominent theoretical approaches to privacy, which were developed over time to meet traditional privacy challenges, yield unsatisfactory conclusions in the case of public surveillance. It posits a new construct, "contextual integrity," as an alternative benchmark for privacy, to capture the nature of challenges posed by information technologies. Contextual integrity ties adequate protection for privacy to norms of specific contexts, demanding that information gathering and dissemination be appropriate to that context and obey the governing norms of distribution within it. Building on the idea of "spheres of justice," developed by political philosopher Michael Walzer, this Article argues that public surveillance violates a right to privacy because it violates contextual integrity; as such, it constitutes injustice and even tyranny.
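Contextual integrity lends itself to a compact operationalization: an information flow is described by its parameters, and each context carries norms specifying which flows are appropriate. The minimal sketch below illustrates the idea; the contexts and norms are invented, and real informational norms are of course far richer:

```python
# A minimal sketch of a contextual-integrity check: a flow is a tuple of
# parameters, and each context carries norms of appropriate transmission.
from typing import NamedTuple

class Flow(NamedTuple):
    sender: str
    recipient: str
    subject: str
    info_type: str
    transmission_principle: str

# Hypothetical norms: (info_type, recipient, principle) triples a context
# deems appropriate.
NORMS = {
    "healthcare": {("diagnosis", "physician", "confidentiality"),
                   ("diagnosis", "insurer", "with consent")},
    "commerce":   {("purchase history", "merchant", "for the transaction")},
}

def respects_contextual_integrity(flow: Flow, context: str) -> bool:
    return (flow.info_type, flow.recipient, flow.transmission_principle) in NORMS[context]

# A diagnosis sold to an advertiser violates healthcare's informational norms:
flow = Flow("clinic", "advertiser", "patient", "diagnosis", "sold")
print(respects_contextual_integrity(flow, "healthcare"))  # False
```

The point of the formalism is exactly the article's argument: whether a flow is a privacy violation depends not on the information alone but on the context-relative norms governing its distribution.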
Article
Social networking sites like Facebook are rapidly gaining in popularity. At the same time, they seem to present significant privacy issues for their users. We analyze two of Facebook's more recent features, Applications and News Feed, from the perspective enabled by Helen Nissenbaum's treatment of privacy as "contextual integrity." Offline, privacy is mediated by highly granular social contexts. Online contexts, including social networking sites, lack much of this granularity. These contextual gaps are at the root of many of the sites' privacy issues. Applications, which nearly invisibly shares not just a user's but also a user's friends' information with third parties, clearly violates standard norms of information flow. News Feed is a more complex case, because it involves not just questions of privacy, but also of program interface and of the meaning of "friendship" online. In both cases, many of the privacy issues on Facebook are primarily design issues, which could be ameliorated by an interface that made the flows of information more transparent to users.
Conference Paper
The popularity of micro-blogging has made general-purpose information sharing a pervasive phenomenon. This trend is now impacting location sharing applications (LSAs) such that users are sharing their location data with a much wider and more diverse audience. In this paper, we describe this as social-driven sharing, distinguishing it from past examples of what we refer to as purpose-driven location sharing. We explore the differences between these two types of sharing by conducting a comparative two-week study with nine participants. We found significant differences in terms of users' decisions about what location information to share, their privacy concerns, and how privacy-preserving their disclosures were. Based on these results, we provide design implications for future LSAs.
Conference Paper
Long-term personal GPS data is useful for many UbiComp services such as traffic monitoring and environmental impact assessment. However, inference attacks on such traces can reveal private information including home addresses and schedules. We asked 32 participants from 12 households to collect 2 months of GPS data, and showed it to them in visualizations. We explored if they understood how their individual privacy concerns mapped onto 5 location obfuscation schemes (which they largely did), which obfuscation schemes they were most comfortable with (Mixing, Deleting data near home, and Randomizing), how they monetarily valued their location data, and if they consented to share their data publicly. 21/32 gave consent to publish their data, though most households' members shared at different levels, which indicates a lack of awareness of privacy interrelationships. Grounded in real decisions about real data, our findings highlight the potential for end-user involvement in obfuscation of their own location data.
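Two of the obfuscation schemes named above, deleting data near home and randomizing, are straightforward to sketch. The code below is an illustrative approximation, not the study's implementation; the radius and jitter parameters are invented:

```python
# Illustrative versions of two obfuscation schemes named in the study:
# deleting points near home, and randomizing each point. Parameters invented.
import math
import random

def meters_between(p, q):
    """Approximate distance via an equirectangular projection (fine at city scale)."""
    lat1, lon1 = map(math.radians, p)
    lat2, lon2 = map(math.radians, q)
    x = (lon2 - lon1) * math.cos((lat1 + lat2) / 2)
    y = lat2 - lat1
    return math.hypot(x, y) * 6_371_000  # Earth radius in meters

def delete_near_home(trace, home, radius_m=500):
    """Drop every GPS point within radius_m of the home location."""
    return [p for p in trace if meters_between(p, home) > radius_m]

def randomize(trace, jitter_m=200):
    """Perturb each point by up to jitter_m in each direction."""
    deg = jitter_m / 111_000  # rough meters-per-degree at mid-latitudes
    return [(lat + random.uniform(-deg, deg), lon + random.uniform(-deg, deg))
            for lat, lon in trace]

home = (37.7749, -122.4194)
trace = [(37.7749, -122.4194), (37.7810, -122.4100), (37.8044, -122.2712)]
print(delete_near_home(trace, home))  # the point at home is removed
print(randomize(trace))
```

Schemes like these trade utility for privacy differently: deleting near home targets the specific inference attack on home address, while randomizing degrades every point uniformly.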
Conference Paper
Most research regarding online social networks such as Facebook, MySpace, LinkedIn and Friendster has looked at these networks in terms of activity within the online network, such as profile management and friending behavior. In this paper we instead focus on offline socializing structures around an online social network (exemplified by Facebook) and how these can facilitate in-person social life for students. Because students lead nomadic lives, they find Facebook a particularly useful tool for initiating and managing social gatherings, and as they adopt mobile technologies that can access online social networks, their ad-hoc social life is further enabled. We conclude that online social networks are a powerful tool for encouraging peripheral friendships, important in particular to students. We emphasize that the use of online social networks must be viewed from a perspective of use that involves both mobile and stationary platforms and that it is important to relate online and offline social practices.
Conference Paper
Advances in location-enhanced technology are making it easier for us to be located by others. These new technologies present a difficult privacy tradeoff, as disclosing one's location to another person or service could be risky, yet valuable. To explore whether and what users are willing to disclose about their location to social relations, we conducted a three-phased formative study. Our results show that the most important factors were who was requesting, why the requester wanted the participant's location, and what level of detail would be most useful to the requester. After determining these, participants were typically willing to disclose either the most useful detail or nothing about their location. From our findings, we reflect on the decision process for location disclosure. With these results, we hope to influence the design of future location-enhanced applications and services.
Conference Paper
Privacy practices in social network sites often appear paradoxical, as content-sharing behavior stands in conflict with the need to reduce disclosure-related harms. In this study we explore privacy in social network sites as a contextual information practice, managed by a process of boundary regulation. Drawing on a sample survey of undergraduate Facebook users, we examine a particular privacy-enhancing practice: having a friends-only Facebook profile. Particularly, we look at the association between network composition, expectancy violations, interpersonal privacy practices and having a friends-only profile. We find that expectancy violations by weak ties and increased levels of interpersonal privacy management are positively associated with having a friends-only profile. We conclude with a discussion of how these findings may be integrated into the design of systems to facilitate interaction while enhancing individual privacy.
Article
The purpose of this article is twofold. First, we summarize research on the topic of privacy in Human-Computer Interaction (HCI), outlining current approaches, results, and trends. Practitioners and researchers can draw upon this review when working on topics related to privacy in the context of HCI and CSCW. The second purpose is that of charting future research trends and of pointing out areas of research that are timely but lagging. This work is based on a comprehensive analysis of published academic and industrial literature spanning three decades, and on the experience of both ourselves and of many of our colleagues.
Article
We present a 3-week user study in which we tracked the locations of 27 subjects and asked them to rate when, where, and with whom they would have been comfortable sharing their locations. The results of analysis conducted on over 7,500 h of data suggest that the user population represented by our subjects has rich location-privacy preferences, with a number of critical dimensions, including time of day, day of week, and location. We describe a methodology for quantifying the effects, in terms of accuracy and amount of information shared, of privacy-setting types with differing levels of complexity (e.g., setting types that allow users to specify location- and/or time-based rules). Using the detailed preferences we collected, we identify the best possible policy (or collection of rules granting access to one’s location) for each subject and privacy-setting type. We measure the accuracy with which the resulting policies are able to capture our subjects’ preferences under a variety of assumptions about the sensitivity of the information and user-burden tolerance. One practical implication of our results is that today’s location-sharing applications may have failed to gain much traction due to their limited privacy settings, as they appear to be ineffective at capturing the preferences revealed by our study.
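The paper's methodology, scoring candidate policies by how well they capture audited preferences, can be illustrated with a toy evaluation. The audit data and the three policy types below are invented; they only mirror the idea that richer setting types can capture preferences more faithfully:

```python
# Toy version of the evaluation idea: score a candidate policy by the
# fraction of audited moments it classifies the same way the user would.
# Ground truth: (hour, place) -> user was comfortable sharing (True/False).
audits = [
    (9, "campus", True),  (13, "campus", True),  (20, "campus", False),
    (9, "home",   False), (13, "home",   False), (20, "home",   False),
]

def always_share(hour, place):            # simplest setting type
    return True

def daytime_only(hour, place):            # time-based rule
    return 8 <= hour < 18

def daytime_away_from_home(hour, place):  # time- and location-based rule
    return 8 <= hour < 18 and place != "home"

def accuracy(policy):
    return sum(policy(h, p) == ok for h, p, ok in audits) / len(audits)

for policy in (always_share, daytime_only, daytime_away_from_home):
    print(policy.__name__, accuracy(policy))
# 0.33, 0.67, 1.0: richer setting types capture audited preferences better.
```

This is the practical implication the authors draw: if an application only offers the "always share with group X" setting type, it cannot express the time- and location-dependent preferences users actually hold.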