Conference Paper

Location disclosure to social relations: Why, when, & what people want to share


Abstract

Advances in location-enhanced technology are making it easier for us to be located by others. These new technologies present a difficult privacy tradeoff, as disclosing one's location to another person or service could be risky, yet valuable. To explore whether and what users are willing to disclose about their location to social relations, we conducted a three-phased formative study. Our results show that the most important factors were who was requesting, why the requester wanted the participant's location, and what level of detail would be most useful to the requester. After determining these, participants were typically willing to disclose either the most useful detail or nothing about their location. From our findings, we reflect on the decision process for location disclosure. With these results, we hope to influence the design of future location-enhanced applications and services.
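The abstract's core finding — that disclosure hinges on who is asking, why, and what level of detail would be useful, and that people then share either the most useful detail or nothing — can be caricatured as a small decision function. This is a minimal illustrative sketch, not the paper's method; the requester categories, purposes, and granularity levels below are invented for the example.

```python
# Hedged sketch of the disclosure decision described in the abstract:
# who is asking, why, and what granularity would be useful to them.
# The paper reports an "all or nothing" tendency: people disclose either
# the most useful detail or nothing. All categories below are invented.
GRANULARITY = ["none", "city", "neighborhood", "exact address"]

def most_useful_detail(requester, purpose):
    """Toy policy table: (requester, purpose) -> most useful granularity."""
    policy = {
        ("partner", "meet up"): "exact address",
        ("friend", "meet up"): "neighborhood",
        ("coworker", "coordination"): "city",
    }
    return policy.get((requester, purpose), "none")

def disclose(requester, purpose, willing):
    # All-or-nothing: either the most useful detail, or nothing at all.
    return most_useful_detail(requester, purpose) if willing else "none"

print(disclose("partner", "meet up", willing=True))
print(disclose("coworker", "coordination", willing=False))
```

The point of the sketch is the shape of the decision, not the table's contents: granularity is not negotiated down, it is either the useful level or a refusal.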


... With the rapid adoption of GPS sensors on smartphones, people started to share their locations [Wagner et al., 2010] not only with family members (e.g., through specialized apps) but also publicly (e.g., on SNS), which naturally prompted issues of privacy and security [Consolvo et al., 2005; Beldad and Citra Kusumadewi, 2015]. Furthermore, Brown et al. [2007] pointed out that people can infer one's activity from a shared location and, subsequently, make judgmental conclusions about one's behavior. ...
... We synthesized the motivations for sharing details of training routines and physical exercises from prior research in Table 2.3. The most frequent reasons to share are (1) to get feedback and guidance [Ojala, 2013]; (2) to create and maintain social ties (e.g., through finding a running partner) [Consolvo et al., 2005; Mueller et al., 2010]; ...
... Prior research identified people's socio-technical requirements across respective sharing spheres and outlined attendant interpersonal and technological challenges (see Sections 2.3 through 2.5). In short, despite the fact that online sharing is a widespread practice nowadays, there are a number of issues end-users face, such as (a) managing access to shared content [Voida et al., 2006]; (b) self-presentation to multiple audiences [Voida et al., 2005; Vitak, 2012]; (c) larger concerns of privacy [Ahern et al., 2007; Raij et al., 2011]; (d) trust in a sharing service [Beldad and Citra Kusumadewi, 2015]; (e) security [Consolvo et al., 2005; Lange, 2007]; and (f) avoiding information oversharing [Dalal et al., 2008]. A number of research efforts have suggested different ways to address those challenges. ...
Book
Online social networks have made sharing personal experiences with others, mostly in the form of photos and comments, a common activity. The convergence of social, mobile, cloud and wearable computing expanded the scope of user-generated and shared content on the net from personal media to individual preferences to physiological details (e.g., in the form of daily workouts) to information about real-world possessions (e.g., apartments, cars). Once everyday things become increasingly networked (i.e., the Internet of Things), future online services and connected devices will only expand the set of things to share. Given that a new generation of sharing services is about to emerge, it is of crucial importance to provide service designers with the right insights to adequately support novel sharing practices. This work explores these practices within two emergent sharing domains: (1) personal activity tracking and (2) sharing economy services. The goal of this dissertation is to understand current practices of sharing personal digital and physical possessions, and to uncover corresponding end-user needs and concerns across novel sharing practices, in order to map the design space to support emergent and future sharing needs. We address this goal by adopting two research strategies, one using a bottom-up approach, the other following a top-down approach. In the bottom-up approach, we examine in-depth novel sharing practices within two emergent sharing domains through a set of empirical qualitative studies. We offer a rich and descriptive account of people's sharing routines and characterize the specific role of interactive technologies that support or inhibit sharing in those domains.
We then design, develop, and deploy several technology prototypes that afford digital and physical sharing with the view to informing the design of future sharing services and tools within two domains, personal activity tracking and sharing economy services. In the top-down approach, drawing on scholarship in human-computer interaction (HCI) and interaction design, we systematically examine prior work on current technology-mediated sharing practices and identify a set of commonalities and differences among sharing digital and physical artifacts. Based upon these findings, we further argue that many challenges and issues that are present in digital online sharing are also highly relevant for physical sharing in the context of the sharing economy, especially when the shared physical objects have digital representations and are mediated by an online platform. To account for these particularities, we develop and field-test an action-driven toolkit for design practitioners to both support the creation of future sharing economy platforms and services and to improve the user experience of existing services. This dissertation should be of particular interest to HCI and interaction design researchers who are critically exploring technology-mediated sharing practices through fieldwork studies, as well as to design practitioners who are building and evaluating sharing economy services.
... Relatively little is known about the actual added value of different context aware features in mobile services, and about the factors that contribute to user adoption of these services. For example, the degree of control with respect to location disclosure is important (Consolvo et al., 2005). However, research on customer needs and behavior concerning context aware mobile services is hindered by the fact that respondents have difficulty understanding the specific characteristics of services that are not yet available. ...
... Sharing of social context information has the potential of supporting (dynamic) groups of people within social networks to more efficiently and effectively perform their activities (Ter Hofte et al., 2006). However, users' willingness to disclose their location is dependent on who can see their location, for what purpose and with what accuracy (Consolvo et al., 2005). Hegering et al. (2004) advance several management challenges that are specifically hard when provisioning context aware services, such as configuration of the context value chain, fault management, accounting, performance (quality of context) and security. ...
... The willingness to share location information with one's partner is above 80%, but this drops sharply to around 50% and 25% for sharing with family and friends, respectively. This result confirms the finding in (Consolvo et al., 2005) that people's willingness to share location information depends on whom it is to be shared with. For most respondents the group of likely buddies for location sharing is limited to family members. ...
Chapter
Full-text available
Context aware services have the ability to utilize information about the user's context and adapt services to a user's current situation and needs. In this article the authors consider users' perceptions of the added value of location awareness and presence information in mobile services. The authors use an experimental design, where stimuli comprising specific bundles of mobile services were presented to groups of respondents. The stimuli showed increasing, manipulated levels of context awareness, including the location of the user and the location and availability of buddies as distinct levels. Their results indicate that simply adding context aware features to mobile services does not necessarily provide added value to users; rather the contrary. The potential added value of insight into buddies' location and availability is offset by people's reluctance to share location information with others. Although the average perceived value overall is rather low, there exists a substantial minority that does appreciate the added context aware features. High scores on constructs like product involvement, social influence and self-expressiveness characterize this group. The results also show that context aware service bundles with utilitarian elements have a higher perceived value than bundles with hedonic elements. On the basis of these results, some guidelines for designing context aware mobile services are formulated.
... Apart from this, Consolvo et al. [10] show that the majority of people do not read data policies, although these are essential for a personalized privacy setting. According to their study, a lack of knowledge of privacy-related technologies also raises the hurdle of assessing one's own privacy concerns [10]. This mismatch between intended and actual privacy settings is also shown by Madejski et al. [29], who investigate privacy settings in an online social network service. ...
... In the first step, the privacy setting database is classified with an unsupervised learning approach. To group the privacy concerns, we aim to map the clusters onto the Westin/Harris Privacy Segmentation model [10]. This model defines people with the highest privacy concern as fundamentalists, people with a medium privacy concern and a balanced privacy attitude as pragmatists, and people with no privacy concern as unconcerned. ...
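The clustering-then-labeling step described above can be illustrated with a deliberately minimal example. This sketch assumes each user is reduced to a single privacy-concern score in [0, 1]; the cited work clusters full privacy-setting vectors with a real unsupervised learner, so the 1-D k-means, the scores, and the function names here are all illustrative assumptions.

```python
# Hedged sketch: cluster scalar privacy-concern scores (0..1) into three
# groups and label them with the Westin/Harris segments. The real study
# clusters full privacy-setting vectors; this 1-D k-means is a toy.
def kmeans_1d(scores, k=3, iters=50):
    pts = sorted(scores)
    # Deterministic init: min, median, and max of the score range.
    centroids = [pts[0], pts[len(pts) // 2], pts[-1]][:k]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for s in scores:
            idx = min(range(k), key=lambda i: abs(s - centroids[i]))
            clusters[idx].append(s)
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids

def westin_label(score, centroids):
    # Lowest-concern cluster -> unconcerned, middle -> pragmatist,
    # highest -> fundamentalist (per the segmentation described above;
    # assumes exactly three centroids).
    order = sorted(range(len(centroids)), key=lambda i: centroids[i])
    names = ["unconcerned", "pragmatist", "fundamentalist"]
    idx = min(range(len(centroids)), key=lambda i: abs(score - centroids[i]))
    return names[order.index(idx)]

scores = [0.05, 0.1, 0.15, 0.45, 0.5, 0.55, 0.85, 0.9, 0.95]
cs = kmeans_1d(scores)
print(westin_label(0.9, cs))  # highest-concern cluster
```

The mapping step matters more than the clustering: the segments only become comparable across studies once clusters are ordered by concern and named after Westin's categories.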
Article
Full-text available
When requesting a web-based service, users often fail to set the website's privacy settings according to their own privacy preferences. Being overwhelmed by the choice of preferences, a lack of knowledge of the related technologies, or unawareness of one's own privacy preferences are just some of the reasons why users tend to struggle. To address these problems, privacy setting prediction tools are particularly well suited. Such tools aim to lower the burden of setting privacy preferences in line with the owner's actual privacy preferences. To be in line with the increased demand for explainability and interpretability arising from regulatory obligations, such as the General Data Protection Regulation (GDPR) in Europe, this paper introduces an explainable model for default privacy setting prediction. Compared to previous work, we present an improved feature selection, increased interpretability of each step in the model design, and enhanced evaluation metrics to better identify weaknesses in the model's design before it goes into production. As a result, we aim to provide an explainable and transparent tool for default privacy setting prediction which users easily understand and are therefore more likely to use.
... This figure is expected to increase to 258 million by 2022, a 45 percent increase in five years (Statista, 2022). Also, as the number of Android apps grows, much attention is being given to other considerations as well, such as security and privacy concerns, malfunctioning of the applications, etc. Past research on the privacy issues of mobile users has largely concentrated on location monitoring and sharing (Barkhuus & Dey, 2003; Consolvo et al., 2005). Although location sharing is an essential aspect of mobile privacy, only two out of 134 Android apps correctly use location permissions (Statista, n.d.). ...
... To help users avoid privacy violations, several tools have been developed by Google for Android privacy. Most of this work focused on defining unsafe activities without addressing how people would make better safety choices (Barkhuus & Dey, 2003; Consolvo et al., 2005). Howell et al. (Felt et al., n.d.), however, recommended the development of a sensor control widget that would alert the user visually when a sensor such as the camera is working. ...
Book
Full-text available
A key focus in recent years has been on sustainable development and promoting environmentally conscious practices. In today’s rapidly evolving technological world, it is important to consider how technology can be applied to solve problems across disciplines and fields in these areas. Further study is needed in order to understand how technology can be applied to sustainability and the best practices, considerations, and challenges that follow.
... With the popularity of mobile devices, mobile messaging is becoming one of the most important communication channels [1, 2, 3], which includes short message service, mobile instant messaging, etc. Nowadays, in order to increase expressiveness and awareness, different types of context information are integrated into mobile messaging [4,5,6,7], including mobile device settings, location, activity, heart rate, etc. Researchers have designed mobile messaging applications that integrate context awareness and have used field studies to explore user behaviors and design dimensions [8,9,10,11]. They have found that context awareness can help users understand each other's status, enhance communication, and shorten the distance between each other, but users have privacy concerns about disclosing their locations and mobile device settings. ...
... Antila et al. proposed providing more privacy settings for users in mobile messaging [19]. Some researchers used surveys to study users' privacy settings when integrating location awareness in mobile messaging [9,20,21]. They found that social relationships, the granularity of location, the semantic type of location, etc., affect users' privacy settings for disclosing location in mobile messaging. ...
Preprint
Full-text available
Nowadays, different types of context information are integrated into mobile messaging to increase expressiveness and awareness, including mobile device settings, location, activity, heart rate, etc. Due to low recognition accuracy, activity awareness has been underutilized in mobile messaging. Recently, activity recognition technology has advanced. However, user behaviors around activity awareness with the improved technology have not been studied. In this study, we design ActAware, a mobile instant messaging application that integrates activity awareness based on re-summarized design dimensions, and conduct a field study to fully explore user behaviors. We found that the improved activity recognition accuracy and the addition of activity transition notifications allow users to better utilize activity awareness to achieve utilitarian and emotional purposes. We also found that users have fewer privacy concerns when disclosing activity information in ActAware. Based on these findings, we provide design recommendations for mobile messaging to better support activity awareness.
... An increasing number of location-based services continuously track the locations of users, often without their knowledge [6,8,10,12,14,17,20,33,37,53,60,61,68]. This data is very valuable as it reveals personal information about the users [57]. ...
... There have been several studies on privacy concerns and location-based services [6,10,12,14,17,37,53,57,60]. In the following, we describe the most representative works in this area. ...
Article
Data gathered from smartphones enables service providers to infer a wide range of personal information about their users, such as their traits, their personality, and their demographics. This personal information can be made available to third parties, such as advertisers, sometimes unbeknownst to the users. Leveraging location information, advertisers can serve ads micro-targeted to users based on the places they visited. Understanding the types of information that can be extracted from location data and implications in terms of user privacy is of critical importance. In this context, we conducted an extensive in-the-wild research study to shed light on the range of personal information that can be inferred from the places visited by users, as well as privacy sensitivity of the personal information. To this end, we developed TrackingAdvisor, a mobile application that continuously collects user location and extracts personal information from it. The app also provides an interface to give feedback about the relevance of the personal information inferred from location data and its corresponding privacy sensitivity. Our findings show that, while some personal information such as social activities is not considered private, other information such as health, religious belief, ethnicity, political opinions, and socio-economic status is considered private by the participants of the study. This study paves the way to the design of privacy-preserving systems that provide contextual recommendations and explanations to help users further protect their privacy by making them aware of the consequences of sharing their personal data.
... Previous literature [11,39,43,52] on falsification as a privacy-protective behavior focuses on the reasons why people lie, the impact of lies, the frequency of lying, and encouraging people to tell the truth. There is also research examining whether certain factors, such as the data being asked for [7,22,25] and the context in which it is being asked [15,34,39], influence the provision of truthful information. However, the question is: can we really predict, in practice, for a given context and data attribute, whether a particular individual is going to provide truthful information or not? ...
... Several studies [7,22,25] have found that if users perceive data as sensitive, they are more likely to alter it than to provide specific information. Moreover, users are likely to provide deliberately vague responses to requests for information they think can personally identify them [8,22,50]. ...
Conference Paper
Full-text available
Individuals are known to lie and/or provide untruthful data when providing information online as a way to protect their privacy. Prior studies have attempted to explain when and why individuals lie online. However, no work has examined how people lie or provide untruthful data online, i.e., the specific strategies they follow to provide untruthful data, or attempted to predict whether people would be truthful or not depending on the specific question/data. To close this gap, we present a large-scale study with over 800 participants. Based on it, we show that it is possible to predict whether users are truthful or not using machine learning with very high accuracy (89.7%). We also identify four main strategies people employ to provide untruthful data and show the factors that influence the choices of their strategies. We discuss the implications of these findings and argue that understanding privacy lies at this level can help both users and data collectors.
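The prediction task described in this abstract — given a question or data attribute, will this user answer truthfully? — can be sketched with a toy frequency model. The abstract does not specify the paper's features or classifier, so everything below (the class name, the attributes, the smoothing prior) is an invented illustration of the task, not the paper's 89.7%-accuracy model.

```python
# Hedged sketch: predict whether a user will answer a request for a
# given data attribute truthfully, from past (attribute, truthful)
# observations. A toy smoothed-frequency model with invented data,
# not the machine-learning classifier from the paper.
from collections import defaultdict

class TruthfulnessModel:
    def __init__(self):
        self.counts = defaultdict(lambda: [0, 0])  # attribute -> [lies, truths]

    def observe(self, attribute, truthful):
        self.counts[attribute][1 if truthful else 0] += 1

    def p_truthful(self, attribute, prior=0.5):
        lies, truths = self.counts[attribute]
        # Smoothing: unseen attributes fall back toward the prior.
        return (truths + prior) / (lies + truths + 1)

m = TruthfulnessModel()
for _ in range(9):
    m.observe("email", True)
m.observe("email", False)          # email: usually answered truthfully
for _ in range(8):
    m.observe("income", False)
m.observe("income", True)          # income: sensitive, often falsified

print(m.p_truthful("email") > 0.5)
print(m.p_truthful("income") > 0.5)
```

Even this toy captures the paper's framing: truthfulness is attribute-dependent, so a per-attribute model beats a single global lying rate.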
... However, the app never asked after installation. As a result of this approach, users' privacy was violated, and users were unable to differentiate between the required permissions and those unnecessarily acquired [7,8,9,10]. ...
... The most sought-after information retrieved from mobile devices primarily concerns location tracking and sharing [7,8,9,10,11]. Location information is a piece of critical information that can endanger user privacy when shared. ...
Article
Full-text available
The Android operating system is used by most smartphones and has a huge number of users, with app developers providing services in the form of apps. Due to the large user base and number of app developers, Android must maintain users' security and privacy. Android apps need permission from the user before accessing device resources. But users often do not read the permission details and may grant excessive or objectionable permissions, which Android gray-ware app developers exploit to collect large amounts of personal information through such apps. Users are also often unaware of what type of permission they are granting to an app. This paper presents an assessment method for examining the security and privacy controls in Android users' adaptation to the Android permission model. The study evaluates Android users' attention while installing apps and their understanding of the purpose of the permissions during or after installation. The assessment is carried out through an Android app whose sole purpose is to monitor all the permissions granted to every installed app on the device. During the study, it was discovered that the 102 users in the data set paid little attention and granted unnecessary permissions. Moreover, the deployed Android permission model was assessed, and we observed that users' literacy and awareness of the Android permission model are essential factors that can control the misuse of malicious apps.
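The kind of audit the assessment app performs — comparing the permissions each installed app actually holds against what an app of its kind plausibly needs — can be sketched in a few lines. The category baselines, app names, and permission strings below are invented; a real implementation would read granted permissions from the Android `PackageManager`, which is not shown here.

```python
# Hedged sketch of a permission audit in the spirit of the assessment
# app above: flag granted permissions that exceed what an app's
# category would normally need. All baselines here are invented.
EXPECTED = {
    "flashlight": {"CAMERA"},  # torch apps may toggle the camera LED
    "messaging": {"INTERNET", "READ_CONTACTS", "RECEIVE_SMS"},
}

def excessive_permissions(category, granted):
    """Return granted permissions outside the category baseline."""
    return sorted(set(granted) - EXPECTED.get(category, set()))

apps = {
    "TorchPro": ("flashlight",
                 ["CAMERA", "READ_CONTACTS", "ACCESS_FINE_LOCATION"]),
    "ChatLite": ("messaging", ["INTERNET", "RECEIVE_SMS"]),
}
for name, (cat, granted) in apps.items():
    extra = excessive_permissions(cat, granted)
    if extra:
        print(f"{name}: possibly excessive -> {extra}")
```

A flashlight app holding contacts and fine-location permissions is exactly the gray-ware pattern the study's 102-user data set surfaced: granted, but never needed.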
... To that end, we conducted a two-week field study with 15 main participants, who invited a total of 82 of their peers to provide data about them when they were not receptive, and then tested how well the peers' assessment data agreed with the main participants' own assessments. The three types of data we requested from both these participant groups, commonly sought in MA studies, were the main participants' 1) current locations (e.g., [8]), 2) activity types (e.g., [40]), and 3) emotional statuses (e.g., [12]). The field study per se was guided by two research questions: RQ1 How many assessment responses could be obtained from the main participants' peers when they themselves were not receptive? ...
... It could have occurred because the main participants were engaged in multiple activities and/or at several locations, but only reported the major ones in their own responses. Also, people can be reluctant to make detailed online disclosures of contextual information that might enable others to infer their daily routines or habits [8]. In light of these two factors, peers' lack of confidence in their activity and location responses is perhaps less surprising. ...
... Concurrently, HCI started to build an understanding of when and how users were willing to share their location. Consolvo et al. [11] identified three reasons why people complied with location sharing: the relationship with the person requesting the location, the reasons for the request, and the level of detail of the data that was requested. Notably, similar to the majority of previous work, the participants in the study of Consolvo et al. [11] had agency (i.e., they had the freedom to decide if they would share information about their location or not). ...
... Consolvo et al. [11] identified three reasons why people complied with location sharing: the relationship with the person requesting the location, the reasons for the request, and the level of detail of the data that was requested. Notably, similar to the majority of previous work, the participants in the study of Consolvo et al. [11] had agency (i.e., they had the freedom to decide if they would share information about their location or not). In contrast, since QApp was a government-enforced app, our participants did not have agency in that sense, meaning that they were not able to decide if they were willing to share location information with the authorities. ...
... As communication activity increasingly moves to mobile and ubiquitous platforms [21,41], measuring individuals' receptivity across constantly changing contexts is becoming more challenging. Prior research used various contextual factors in its attempts to estimate and predict various notions of receptivity, including attentiveness [22,57], responsiveness [38,43], interruptibility [48,50,52,53,55,73,77], and opportune moment [25,31,37,39,56,59,78,79]. For example, Dingler and Pielot [22] predicted mobile users' attentiveness to incoming messages using logs of their phone usage, and achieved an accuracy rate of close to 80%. Lee et al. [43] likewise predicted users' responsiveness to their IM contacts based on IM chat logs, with up to 71% accuracy (AUROC). ...
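The responsiveness-prediction work cited above (e.g., Lee et al.'s IM-log model) uses rich contextual features; a deliberately stripped-down version of the idea is shown below. The 5-minute threshold, the 0.5 cutoff, and the example delays are assumptions for illustration, not values from the cited studies.

```python
# Hedged sketch: estimate a user's responsiveness to messages from past
# response delays in a chat log. The cited prediction work uses far
# richer features and learned models; this baseline is illustrative.
from datetime import timedelta

def responsiveness_score(delays, threshold=timedelta(minutes=5)):
    """Fraction of past messages answered within `threshold`."""
    if not delays:
        return 0.0
    quick = sum(1 for d in delays if d <= threshold)
    return quick / len(delays)

def predict_responsive(delays, cutoff=0.5):
    # Predict prompt response if the user has usually responded promptly.
    return responsiveness_score(delays) >= cutoff

log = [timedelta(minutes=m) for m in (1, 2, 3, 12, 4, 45, 2)]
print(round(responsiveness_score(log), 2))  # 5 of 7 within 5 minutes
print(predict_responsive(log))
```

A per-contact historical rate like this is the natural baseline that context-aware models (phone-usage logs, interruptibility sensing) are measured against.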
... Buchenscheit [11] also suggested that OSI may be used to infer users' daily routines and habits, such as bedtimes and waking-up times; when they deviate from such routines and habits; whether they are using systems when they are expected not to, e.g., when they are meant to be working; and even whom they are communicating with (see also [7,12,23]). Therefore, users sometimes seek to deactivate OSI features or to otherwise manage their own online status: including by controlling what information is being shared, at what granularity, and with whom [8,10,21,32,41,70,80]. Previous studies of status sharing have consistently found that individuals prefer to appear either away or offline, i.e., to remain "invisible" [17,54].
... Rooted in a literature survey of several user studies on privacy in location sharing (e.g., Reference [16]), mobile devices (e.g., Reference [5]), and smart environments (e.g., Reference [62]), and supported by his own studies, Koenings argues that who (recipient and relationship with them), what (content), when (context), how (processed, collected, distributed), and why (purpose, benefits) are the most influential factors that affect users' awareness. Users give high priority to "who," "what," and "why." ...
Article
The emergence of ubiquitous computing (UbiComp) environments has increased the risk of undesired access to individuals' physical space or their information, anytime and anywhere, raising potentially serious privacy concerns. Individuals lack awareness and control of the vulnerabilities in everyday contexts and need support and care in regulating disclosures to their physical and digital selves. Existing GUI-based solutions, however, often feel physically interruptive, socially disruptive, time-consuming and cumbersome. To address such challenges, we investigate the user interaction experience and discuss the need for more tangible and embodied interactions for effective and seamless natural privacy management in everyday UbiComp settings. We propose the Privacy Care interaction framework, which is rooted in the literature of privacy management and tangible computing. Keeping users at the center, Awareness and Control are established as the core parts of our framework. This is supported with three interrelated interaction tenets: Direct, Ready-to-Hand, and Contextual. Direct refers to intuitiveness through metaphor usage. Ready-to-Hand supports granularity, non-intrusiveness, and ad hoc management, through periphery-to-center style attention transitions. Contextual supports customization through modularity and configurability. Together, they aim to provide the experience of embodied privacy care with varied interactions that are calming and yet actively empowering. The framework provides designers of such care with a basis to refer to when generating effective tangible tools for privacy management in everyday settings. Through five semi-structured focus groups, we explore the privacy challenges faced by a sample set of 15 older adults (aged 60+) across their cyber-physical-social spaces. The results show conformity to our framework, demonstrating the relevance of the facets of the framework to the design of privacy management tools in everyday UbiComp contexts.
... Westin's segmentation has been used and validated in different works (31,32). Dupree et al. (33) provide an overview of studies that found different degrees of correlation between Westin's clusters and the tested scenarios, concluding in an "ample criticism of Westin's categorisation of users, both from a methodological perspective [...]" ...
Article
Full-text available
The reliance on data donation from citizens as a driver for research, known as citizen science, has accelerated during the SARS-CoV-2 pandemic. An important enabler of this is Internet of Things (IoT) devices, such as mobile phones and wearable devices, that allow continuous data collection and convenient sharing. However, potentially sensitive health data raises privacy and security concerns for citizens, which research institutions and industries must consider. In e-commerce or social network studies of citizen science, a privacy calculus related to user perceptions is commonly developed, capturing the information disclosure intent of the participants. In this study, we develop a privacy calculus model adapted for IoT-based health research using citizen science for user engagement and data collection. Based on an online survey with 85 participants, we make use of the privacy calculus to analyse the respondents' perceptions. The emerging privacy personas are clustered and compared with previous research, resulting in three distinct personas which can be used by designers and technologists who are responsible for developing suitable forms of data collection. These are the 1) Citizen Science Optimist, the 2) Selective Data Donor, and the 3) Health Data Controller. Together with our privacy calculus for citizen science based digital health research, the three privacy personas are the main contributions of this study.
... Privacy concerns were found to vary based on data type as well as data content. For example, perceptions and valuation of location sharing as a privacy risk vary across contexts, individuals, and nations [15]. Hence, we also consider whether users were concerned about being contacted by colleagues over mobile messaging applications outside of the workplace and whether it changed how they managed information over messaging platforms (like restricting media download). ...
Conference Paper
Full-text available
The purpose of this study is to understand the privacy concerns and behavior of non-WEIRD populations in online messaging platforms. Analysis of surveys (n = 674) of WhatsApp users in Saudi Arabia and India revealed that Saudis had significantly higher concerns about being contacted by strangers. In contrast, Indians showed significantly higher concerns with respect to social contact from professional colleagues. Demographics impinge on privacy preferences in both populations, but in different ways. Results from regression analysis show that there are statistically significant differences between the privacy behaviors of Saudis and Indians. In both cases, privacy concerns were strongly correlated with their reported privacy behaviors. Despite the differences, we identified technical solutions that could address the concerns of both populations of participants. We close by discussing the applicability of our recommendations, specifically those on transparency and consent, to other applications and domains.
... Ahern et al. [13] investigated how users select privacy settings for uploaded photos, finding that users are more likely to mark photos taken at frequently photographed locations as private, while tending to make photos from less frequented locations public. Consolvo et al. [22] found that users were willing to disclose exact locations, but that study focused on a different setting in which users were asked about sharing information with friends, family and colleagues. Tang et al. [68] identified how users adapt their location sharing behavior and explored the different ways and levels of granularity at which users decide to share their location under different hypothetical scenarios. ...
Conference Paper
Full-text available
The exposure of location data constitutes a significant privacy risk to users as it can lead to de-anonymization, the inference of sensitive information, and even physical threats. In this paper we present LPAuditor, a tool that conducts a comprehensive evaluation of the privacy loss caused by public location metadata. First, we demonstrate how our system can pinpoint users’ key locations at an unprecedented granularity by identifying their actual postal addresses. Our evaluation on Twitter data highlights the effectiveness of our techniques which outperform prior approaches by 18.9%-91.6% for homes and 8.7%-21.8% for workplaces. Next we present a novel exploration of automated private information inference that uncovers “sensitive” locations that users have visited (pertaining to health, religion, and sex/nightlife). We find that location metadata can provide additional context to tweets and thus lead to the exposure of private information that might not match the users’ intentions. We further explore the mismatch between user actions and information exposure and find that older versions of the official Twitter apps follow a privacy-invasive policy of including precise GPS coordinates in the metadata of tweets that users have geotagged at a coarse-grained level (e.g., city). The implications of this exposure are further exacerbated by our finding that users are considerably privacy-cautious in regards to exposing precise location data. When users can explicitly select what location data is published, there is a 94.6% reduction in tweets with GPS coordinates. As part of current efforts to give users more control over their data, LPAuditor can be adopted by major services and offered as an auditing tool that informs users about sensitive information they (indirectly) expose through location metadata.
... LBSs usually utilize geographical information (e.g., latitude and longitude) to represent a place. Recently, semantic labels (Consolvo et al., 2005), e.g., "home" and "school", have been employed to complement the representation of a place, which can make LBSs more intelligent, e.g., by providing semantic-based services (Chen, Lyu, Xu, Long, & Chen, 2020; Ma, Mao, Ba, & Li, 2020; Noh, Lee, Oh, Hwang, & Cho, 2012; Paule, Sun, & Moshfeghi, 2019; Valverde-Rebaza, Roche, Poncelet, & Lopes, 2018; Wu, Kao, Wu, & Huang, 2015). ...
Article
Personalized place semantics recognition is the process of giving individual semantic labels to locations, e.g., “home” and “school”. Capturing personalized place semantics exactly is critical for location-based services. To address the problems of existing methods, i.e., the insufficient utilization of context information and the neglect of the semantic correlation across related tasks, we propose a method for personalized place semantics recognition, which employs embedding methods, including deep learning based embedding and word embedding, to obtain effective representations from multi-context information (e.g., system settings, phone usage, and user activities). Meanwhile, we jointly model personalized place semantics and App usage sequences by sharing the App representations, which can improve generalization capability by exploiting the commonalities and differences across related tasks. We evaluate the proposed method on the Mobile Data Challenge dataset, and experimental results show that it outperforms existing methods significantly.
... However, the literature points to the sensitivity of sharing such data. Much location-sharing research has been devoted to understanding with whom and under what conditions people are willing to share their location [50][51][52][53]. A number of studies have attempted to predict end-user privacy behavior in mobile contexts. ...
Article
Full-text available
Smartphone location sharing is a particularly sensitive type of information disclosure that has implications for users’ digital privacy and security as well as their physical safety. To understand and predict location disclosure behavior, we developed an Android app that scraped metadata from users’ phones, asked them to grant the location-sharing permission to the app, and administered a survey. We compared the effectiveness of using self-report measures commonly used in the social sciences, behavioral data collected from users’ mobile phones, and a new type of measure that we developed, representing a hybrid of self-report and behavioral data to contextualize users’ attitudes toward their past location-sharing behaviors. This new type of measure is based on a reflective learning paradigm where individuals reflect on past behavior to inform future behavior. Based on data from 380 Android smartphone users, we found that the best predictors of whether participants granted the location-sharing permission to our app were: behavioral intention to share information with apps, the “FYI” communication style, and one of our new hybrid measures asking users whether they were comfortable sharing location with apps currently installed on their smartphones. Our novel, hybrid construct of self-reflection on past behavior significantly improves predictive power and shows the importance of combining social science and computational science approaches for improving the prediction of users’ privacy behaviors. Further, when assessing the construct validity of the Behavioral Intention construct drawn from previous location-sharing research, our data showed a clear distinction between two different types of Behavioral Intention: self-reported intention to use mobile apps versus the intention to share information with these apps. This finding suggests that users desire the ability to use mobile apps without being required to share sensitive information, such as their location. 
These results have important implications for cybersecurity research and system design to meet users’ location-sharing privacy needs.
Chapter
Mobile Location Based Services (MLBS) have been in operation since the 1970s. Conceived initially for military use, the Global Positioning System technology was later released to the world for other applications. As usage of the technology increased, mobile network points, developed by mobile service operators, supplemented its usage in various applications of MLBS. This chapter charts the trajectory of MLBS applications in the mass market, afforded by the evolution of technology, digital, and mobility cultures. Assimilating various MLBS classifications, it then situates examples into four quadrants according to the measures of user-position or device-position focus, and alert-aware or active-aware applications. The privacy implications of MLBS are captured on the economic, social, and political fronts, and its future is discussed.
Chapter
Social networks are migrating to mobile. Mobile social networks combine community-level interactivity with mobile communications and location-awareness, and hence represent a novel phenomenon with unique properties. Due to the growing business potential of this new trend and its increasing impact on the realm of communications, mobile social networks started to draw scholars’ attention. Researchers in computer-mediated-communication have been investigating the phenomenon from a variety of angles, yet marketing literature is falling behind. This chapter aims to review existing academic knowledge on mobile social networks and provide a conceptual framework to study and understand this complex, emergent phenomenon and discuss related future research avenues.
Thesis
As the number of online services powered by personal data is growing, the technology behind those services raises unprecedented concerns with regard to users’ privacy. Although there are significant privacy engineering efforts made to provide users with an acceptable level of privacy, often users lack mechanisms to understand, decide and control how their personal data is collected, processed and used. On one hand, this affects users’ trust towards the service provider; on the other, under some regulatory frameworks the service provider is legally required to obtain the user’s consent to collection, use and processing of personal data. Therefore, in this thesis, we focus on privacy engineering mechanisms for consent. As opposed to the simple act of clicking ‘I agree’, we view consent as a process, which involves the formation of the user’s privacy preferences, the agreement between the user and the service provider and the implementation of that agreement in the service provider’s system. Firstly, we focus on understanding the user’s consent decision-making. Specifically, we explore the role of privacy knowledge in data sharing. To that end, we conduct an experiment, where we inform participants how they can stop allowing the collection of their online activity data. We compare the behaviour of two groups with an increased knowledge of data collection: one provided only with actionable information on privacy protection, and one additionally informed about the details of how and by whom the collection is conducted. In our experiment, we observe no significant difference between the two groups. Our results suggest that procedural privacy knowledge on how users can control their privacy has an impact on their consent decisions. However, we also found that the provision of factual privacy knowledge in addition to procedural knowledge does not affect users’ prevention intent or behaviour.
These outcomes suggest that the information about privacy protection itself may act as a stimulus for users to refuse consenting to data collection. Secondly, we investigate the idea of agent-based privacy negotiations between a user and a service provider. To that end, we propose a novel framework for the implementation of semi-automated, multi-issue negotiation. Our findings suggest that such a framework is more suitable for negotiation in the privacy domain than the ‘take-it-or-leave-it’ approach or setting privacy preferences manually, because it allows for a collaborative search for mutually beneficial agreements: users consent to data use more often, consent is more consistent with users’ data-sharing sensitivity and it requires less user effort. Moreover, in order for an agent to accurately represent the user, the agent needs to learn the user’s privacy preferences. To address this problem, we compare two approaches to privacy preference elicitation through a user study: one where the preferences are personalised for each user based on their previous consent and one where the user is classified into one of three privacy profiles and later re-classified if their consent decisions reflect a change. We find that the latter approach can represent the user more accurately in the initial negotiation rounds than the former.
Then, we formalise the consent problem as a constraint satisfaction problem on graphs. We provide several algorithms to solve the problem and compare them in terms of their trade-off between execution time and quality of the solution. Our algorithms can provide a nearly optimal solution in the face of tens of constraints and graphs of thousands of nodes in a few seconds. The research presented in this thesis contributes to understanding users’ consent decision-making and addresses an emerging need for technologies that can help service providers manage users’ consent. We propose ideas for potentially fruitful lines of exploration within this area.
Chapter
Android applications require certain permissions to perform their intended functions, but other permissions that may affect user privacy need to be identified and revoked. Each step forward in digitization also increases the number of malware attacks by malicious applications that abuse permissions. To address the problem of user data privacy and security in APK apps, the authors created a system that automatically searches for and downloads Android apps from the Android Market. In addition, they produced a thorough mapping of Android application programming interface (API) calls to the needed permission(s) for each call, if any were required. Based on a static examination of each application's APK bytecode, they then analyzed 141,372 Android applications to see if they requested the right set of permissions. According to the research, most mobile app developers do not use the proper permission set and either over-specify or under-specify their security requirements.
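The core of such an audit reduces to a set comparison between the permissions an app declares and the permissions its API calls actually require. The sketch below illustrates the idea only; the `API_PERMISSIONS` entries are hypothetical examples and not the chapter's actual mapping, and real tools derive the map from bytecode analysis rather than hand-written tables.

```python
# Hypothetical excerpt of an API-call -> required-permissions map.
# A real mapping (like the one described above) is extracted by
# static analysis of the framework, not written by hand.
API_PERMISSIONS = {
    "LocationManager.getLastKnownLocation": {"ACCESS_FINE_LOCATION"},
    "Camera.open": {"CAMERA"},
    "SmsManager.sendTextMessage": {"SEND_SMS"},
}

def audit(declared, api_calls):
    """Compare declared permissions against those required by api_calls.

    Returns over-specified permissions (declared but never needed) and
    under-specified ones (needed but not declared).
    """
    required = set().union(*(API_PERMISSIONS.get(c, set()) for c in api_calls))
    return {"over": declared - required, "under": required - declared}
```

For example, an app declaring `{"CAMERA", "SEND_SMS"}` while only calling `Camera.open` and `LocationManager.getLastKnownLocation` would be flagged as over-specifying `SEND_SMS` and under-specifying `ACCESS_FINE_LOCATION`.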
Article
Many social network sites (SNSs) have become available around the world and users’ online social networks increasingly include contacts from different cultures. However, there is a lack of investigation into the concrete cultural differences in the effects of contextual factors and privacy concerns on users’ privacy decisions on SNSs. The goal of this paper is to understand how contextual factors and privacy concerns have different impacts on privacy decisions, such as friend request decisions, information disclosure and perceived risk, in different countries. We performed a quantitative study through a large-scale online survey across the US, Korea and China to model the relationships between contextual factors, privacy concerns and privacy decisions. We find that the contextual influence and focus of privacy concerns vary between the individualistic and collectivistic countries in our sample. We suggest that multinational SNS service providers should consider different contextual factors and focus of privacy concerns in different countries and customise privacy designs and friend recommendation algorithms in SNSs in different countries.
Chapter
Frequent contact with online businesses requires Internet users to distribute large amounts of personal information. This spreading of users’ information through different Websites can eventually lead to increased probabilities for identity theft, profiling and linkability attacks, as well as other harmful consequences. Methods and tools for securing people’s online activities and protecting their privacy on the Internet, called Privacy Enhancing Technologies (PETs), are being designed and developed. However, these technologies are often perceived as complicated and obtrusive by users who are not privacy aware or are not computer or technology savvy. This chapter explores the way in which users’ involvement has been considered during the development process of PETs and argues that more democratic approaches to user involvement and data handling practices are needed. It advocates an approach in which people are not only seen as consumers of privacy and security technologies, but where they can play a role as the producers of ideas and sources of inspiration for the development of usable PETs that meet their actual privacy needs and concerns.
Article
Location data reveals users’ trajectories, yet it is often shared to enable many location-based services (LBS). In this paper, we propose a privacy-preserving geospatial query system with geo-hashing and somewhat homomorphic encryption. We geo-hash locations using space-filling curves for locality-preserving dimension reduction, which allows the users to specify granularity preference of their location and is agnostic to specific maps or precoded location models. Our system features three homomorphic algorithms to compute geospatial queries on encrypted location data and encrypted privacy preferences. Compared with previous work, one of our algorithms reduces the multiplicative depth of a basic homomorphic computation approach by more than half, which significantly speeds it up. We then present an optimized prototype and experimentally demonstrate its utility in spatial cloaking.
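The locality-preserving geo-hashing step can be sketched with a Z-order (Morton) curve, one common space-filling curve: quantize latitude and longitude to fixed-width integers, interleave their bits into a single code, and truncate low-order bits to coarsen the granularity. This is a minimal illustration under those assumptions; the paper's exact encoding, bit widths, and function names (`geohash_z`, `coarsen`) may differ.

```python
def geohash_z(lat, lon, bits=16):
    """Z-order geo-hash: quantize lat/lon to `bits`-bit integers,
    then interleave their bits into one 2*bits-bit code."""
    y = int((lat + 90.0) / 180.0 * ((1 << bits) - 1))
    x = int((lon + 180.0) / 360.0 * ((1 << bits) - 1))
    code = 0
    for i in range(bits):
        code |= ((x >> i) & 1) << (2 * i)      # even positions: longitude bits
        code |= ((y >> i) & 1) << (2 * i + 1)  # odd positions: latitude bits
    return code

def coarsen(code, bits, keep):
    """Coarsen granularity by dropping the low-order bits of the code,
    keeping only the top `keep` bits of each coordinate."""
    return code >> (2 * (bits - keep))
```

Because the curve preserves locality, nearby points share a long code prefix: two locations in the same city agree once coarsened, while a distant point does not, which is what lets users publish a truncated code as a granularity preference.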
Chapter
Development and deployment of location-based systems is a key consideration in the design of new mobile technologies. Critical to the design process is to understand and manage the expectations of stakeholders (including funders, research partners and end users) for these systems. In particular, the way in which expectations impact upon technology development choices between small-scale, 'high tech' innovations or larger scalable solutions. This paper describes the differences in a revolutionary design process (for 'high tech' prototypes or catwalk technologies) versus an evolutionary design process (for scalable or prêt-à-porter systems), as exemplified in two location-based mobile interaction case studies. One case study exemplifies a revolutionary design process and resultant system, and the other an evolutionary design process and system. The use of these case studies is a clear natural progression from the paper that first described the concept of 'catwalk technologies' (Adams et al., 2013), which itself drew upon research that used mobile devices for outdoor 'in the wild' locations. This paper presents a set of fifteen heuristic guidelines based upon an analysis of these case studies. These heuristics present characteristics and key differences between the two types of design process. This paper provides a key reference point for researchers, developers and the academic community as a whole, when defining a project rationale for designing and developing technical systems. In addition, we refer to the role of the researcher/research team in terms of guiding and managing stakeholder and research team expectations and how this relates to the planning and deployment of catwalk or prêt-à-porter technologies. Lastly, we state how this research has vital implications for planning and enacting interventions and sequences of interactions with stakeholders and, crucially, in the planning of future research projects.
Article
While recent research on intelligent transportation systems including vehicular communication systems has focused on technical aspects, little research work has been conducted on drivers’ privacy perceptions and preferences. Understanding the driver’s privacy perceptions and preferences will allow researchers to design usable privacy and identity management systems offering user privacy choices and controls for intelligent transportation systems. We conducted in-depth semi-structured interviews with 17 Swedish drivers to analyse their privacy perceptions and preferences for intelligent transportation systems, particularly for user control and for privacy trade-offs with cost, safety and usability. We also compare our results from the interviews with Swedish drivers with results from interviews that we conducted previously with South African drivers. Our cross-cultural comparison shows that perceived privacy implications, the drivers’ willingness to share location information under certain conditions with other parties, as well as their appreciation of Privacy Enhancing Technologies differ significantly across drivers with different cultural backgrounds. We further discuss the cultural impact on privacy preferences, including those for privacy trade-offs, and the implications of our results for usable privacy-enhancing Identity Management for future vehicular communication systems. In particular, we provide recommendations for suitable pre-defined privacy options to be offered to users with different cultural backgrounds enabling them to easily make privacy-related control choices.
Article
The aim of this study was to investigate the use of mobile computing technologies for improving the mobility of Windhoek residents, through exploring the perceptions and attitudes of Windhoek taxi drivers and passengers subjected to a real-time experiment of an existing mobile taxi-hailing and dispatcher system, Taxi StartApp. The study used the Experience Sampling Method (ESM), where users simulated a taxi-hailing service that included a pick-up and drop-off in real time and then assessed their perceptions of the location-based technology and its service. The research further deployed the widely used System Usability Scale to evaluate the Taxi StartApp system usability, as a combination of effectiveness, efficiency and satisfaction. This research employed qualitative analysis through the adoption of ESM. Eighty participants were selected randomly among taxi drivers and passengers within the vicinity of Windhoek. The research falls into the technology adoption research category, and the lessons learned from the ESM will help developers to come up with demographically appropriate applications that will address the transportation and mobility challenges in Windhoek.
Chapter
We envision a unique social interaction system, ‘users-as-beacons’ built upon Bluetooth Low Energy (BLE) beacon technology, that could provide potential privacy benefits. It leverages BLE to employ the user devices to act as mobile beacons. Its potential applications include community-based social networking, localized advertising, and instant reviewing. To evaluate the potential for this system and inform design, we conducted an exploratory interview study of 27 participants of a hypothetical localized content-creating system. Using a design prototype and multiple scenarios as prompts, we asked questions regarding users’ perceptions of the potential benefits and challenges of a users-as-beacons system, focusing in particular on their privacy concerns and needs. Our results indicate that users do perceive the benefit of increased trustworthiness of user-beacons, but do not have expectations of greater location or behavioral tracking privacy. We highlight multiple design challenges of this system in supporting the trustworthy, relevant, and timely sharing of posts between people in a community.
Article
Full-text available
Passengers’ security perceptions of ride-hailing services influence their acceptance and adoption of such online-to-offline services. The current study aims to study factors that influence Chinese passengers’ security perceptions of ride-hailing services in general and for a specific ride. Based on a literature review and information sourced from focus groups, we identified three groups of issues influencing security perceptions: the perception of the risk, the situation of a specific ride, and individual differences. Through two rounds of surveys, involving 163 and 314 respondents respectively, we identified four risk factors and four situation factors. Regression analyses showed that general security perceptions of ride-hailing services were influenced by two risk factors, perceived possibility and familiarity of the risk. Situational security perceptions were only slightly influenced by the general perceived security, but more influenced by situation characteristics including interaction with the driver and the vehicle, trip companions, and trip duration. The online platform, however, had no significant influence. The differences between sexual assaults and other types of personal assaults, between ride-hailing services and traditional taxi services, and between female and male passengers were analyzed and discussed. The results provide implications on how to improve ride-hailing services to enhance passengers’ perceived security.
Article
Full-text available
Although privacy settings are important not only for data privacy, but also to prevent hacking attacks like social engineering that depend on leaked private data, most users do not care about them. Research has tried to help users in setting their privacy settings by using settings that the user has already adjusted, or individual factors like personality, to predict the remaining settings. But in some cases, neither is available. However, the user might have already configured privacy settings in another domain; for example, she may have already adjusted the privacy settings on her smartphone, but not on her social network account. In this article, we investigate with the example of four domains (social network posts, location sharing, smartphone app permission settings and data of an intelligent retail store) whether and how precisely privacy settings of one domain can be predicted across domains. We performed an exploratory study to examine which privacy settings of the aforementioned domains could be useful, and validated our findings in a validation study. Our results indicate that such an approach works with a prediction precision about 15%–20% better than random and than a prediction without input coefficients. We identified clusters of domains that allow model transfer between their members, and discuss which kind of privacy settings (general or context-based) leads to better prediction accuracy. Based on the results, we would like to conduct user studies to find out whether the prediction precision is perceived by users as a significant improvement over a “one-size-fits-all” solution, where every user is given the same privacy settings.
Article
Today, industry practitioners (e.g., data scientists, developers, product managers) rely on formal privacy reviews (a combination of user interviews, privacy risk assessments, etc.) in identifying potential customer acceptance issues with their organization’s data practices. However, this process is slow and expensive, and practitioners often have to make ad-hoc privacy-related decisions with little actual feedback from users. We introduce Lean Privacy Review (LPR), a fast, cheap, and easy-to-access method to help practitioners collect direct feedback from users through the proxy of crowd workers in the early stages of design. LPR takes a proposed data practice, quickly breaks it down into smaller parts, generates a set of questionnaire surveys, solicits users’ opinions, and summarizes those opinions in a compact form for practitioners to use. By doing so, LPR can help uncover the range and magnitude of different privacy concerns actual people have at a small fraction of the cost and wait-time for a formal review. We evaluated LPR using 12 real-world data practices with 240 crowd users and 24 data practitioners. Our results show that (1) the discovery of privacy concerns saturates as the number of evaluators exceeds 14 participants, which takes around 5.5 hours to complete (i.e., latency) and costs 3.7 hours of total crowd work ($80 in our experiments); and (2) LPR finds 89% of privacy concerns identified by data practitioners as well as 139% additional privacy concerns that practitioners are not aware of, at a 6% estimated false alarm rate.
Chapter
With the popularity of social media, researchers and designers must consider a wide variety of privacy concerns while optimizing for meaningful social interactions and connection. While much of the privacy literature has focused on information disclosures, the interpersonal dynamics associated with being on social media make it important for us to look beyond informational privacy concerns to view privacy as a form of interpersonal boundary regulation. In other words, attaining the right level of privacy on social media is a process of negotiating how much, how little, or when we desire to interact with others, as well as the types of information we choose to share with them or allow them to share about us. We propose a framework for how researchers and practitioners can think about privacy as a form of interpersonal boundary regulation on social media by introducing five boundary types (i.e., relational, network, territorial, disclosure, and interactional) social media users manage. We conclude by providing tools for assessing privacy concerns in social media, as well as noting several challenges that must be overcome to help people to engage more fully and stay on social media.
Chapter
This chapter introduces relevant privacy frameworks from academic literature that can be useful to practitioners and researchers who want to better understand privacy and how to apply it in their own contexts. We retrace the history of how networked privacy research first began by focusing on privacy as information disclosure. Privacy frameworks have since evolved into conceptualizing privacy as a process of interpersonal boundary regulation, appropriate information flows, design-based frameworks, and, finally, user-centered privacy that accounts for individual differences. These frameworks can be used to identify privacy needs and violations, as well as inform design. This chapter provides actionable guidelines for how these different frameworks can be applied in research, design, and product development.
Conference Paper
Recent research has shown that explanations serve as an important means to increase transparency in group recommendations, while also increasing users' privacy concerns. However, it is currently unclear what personal and contextual factors affect users' privacy concerns about various types of personal information. This paper studies the effect of users' personality traits and preference scenarios (having a majority or minority preference) on their privacy concerns regarding location and emotional information. To create natural scenarios of group decision-making where users can control the amount of information disclosed, we develop TouryBot, a chatbot agent that generates natural language explanations to help group members explain the arguments for their suggestions to the group in the tourism domain. We conducted a user study in which we instructed 541 participants to convince the group to either visit or skip a recommended place. Our results show that users generally have a larger concern regarding the disclosure of emotion than of location information. However, we found no evidence that personality traits or preference scenarios affect privacy concerns in our task. Further analyses revealed that task design (i.e., the pressure on users to convince the group) had an effect on participants' emotion-related privacy concerns. Our study also highlights the utility of providing users with the option of partial disclosure of personal information, which appeared to be popular among the participants.
Article
Full-text available
While there is increasing global attention to data privacy, most of the current theoretical understanding is based on research conducted in a few countries. Prior work argues that people's cultural backgrounds might shape their privacy concerns; thus, we could expect people from different world regions to conceptualize them in diverse ways. We collected and analyzed a large-scale dataset of tweets about the #CambridgeAnalytica scandal in Spanish and English to start exploring this hypothesis. We employed word embeddings and qualitative analysis to identify which information privacy concerns are present and to characterize language and regional differences in emphasis on these concerns. Our results suggest that related concepts, such as regulations, can be added to current information privacy frameworks. We also observe a greater emphasis on data collection in English than in Spanish. Additionally, data from North America exhibits a narrower focus on awareness compared to other regions under study. Our results call for more diverse sources of data and nuanced analysis of data privacy concerns around the globe.
Article
Full-text available
People are not always able to respond immediately to incoming messages on their mobile devices, either due to engagement in another task or simply because the moment is inconvenient for them. This delay in responding could affect social relationships, as there are often expectations associated with mobile messaging and people may experience a lingering pressure to attend to their messages. In this work, we investigate an approach for generating automated contextual responses on behalf of message recipients when they are not available to respond. We first identify several types of contextual information that can be obtained from a user's smartphone and explore whether those can be used to explain unavailability. We then assess users' perception of the usefulness of these sensor-based categories and their level of comfort with sharing such information through a Mechanical Turk survey study. Our results show emergent groups with varying preferences with regard to the usefulness of, and comfort in sharing, two types of contextual information: user state and device state. Further, we also observed a strong influence of message context (i.e., message urgency and social tie strength) on users' perceptions of these auto-generated messages. Our research provides an understanding of users' perceptions of sharing context through an autonomous agent, which can help in designing effective approaches to enabling communication awareness.
Article
Over the past two and a half years, COVID-19 has swept through the world, and new technologies for mitigating its spread, such as exposure notification applications and contact tracing, have been implemented in many countries. However, uptake has differed from country to country, and it has not been clear whether culture, death rates, or information dissemination have been factors in the adoption rate. These apps also introduce issues of trust and privacy protection, which can create challenges for adoption and daily use. In this paper we present the results of a cross-country survey study of potential barriers to the adoption of COVID-19 contact tracing apps in particular. We found that people's existing privacy concerns are inversely correlated with adoption behavior, but that geographical location, as well as other demographics such as age and gender, does not have a significant effect on either adoption of the app or privacy concerns. Instead, a better understanding of what data is collected through the apps leads to a higher level of adoption. We provide suggestions for how to approach the development and deployment of contact tracing apps and, more broadly, health tracking apps.
Article
Full-text available
Purpose: The paper assesses expectations regarding the amount of discount in insurance premium that could compensate the insured for the loss of privacy entailed when one's driving style is monitored. Design/Methodology/Approach: The analysis is carried out using data collected through a survey conducted on a sample of clients of insurance companies. As part of the analysis, Pearson's chi-square test is used. It is based on comparing empirical values with expected values, where the expected values are those that would occur if there were no relationship between the variables. Findings: The analysis showed that 75% of the respondents who would consider concluding a UBI-type contract at all would expect a discount of up to 30% of their current premium, while as many as 37.45% of respondents would expect a discount of at most 15%. Factors such as gender, age, education, and place of residence influence the level of expectations. Practical Implications: The paper conducts an empirical study of the impact of the loss of privacy on insurance-premium discounts; the topic may thus be of interest to insurance companies and to their clients making the final insurance purchase decision. Originality/Value: This is the first study in which UBI expectations are examined for Poland, one of the biggest insurance markets in the CEE region. The findings have both practical and scientific value. From a practical point of view, they give insurance professionals knowledge of the expected level of premium decrease that might intensify UBI uptake. From a scientific point of view, the study provides useful information for further research, especially on the factors that determine the level of expectations.
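The chi-square procedure described in that abstract (comparing observed survey counts against the counts expected under independence) can be sketched in pure Python. The contingency table below is hypothetical and only illustrates the mechanics, not data from the paper:

```python
# Hypothetical 2x2 contingency table: respondent gender vs. expected
# discount bracket (<=15% vs. >15% of current premium).
observed = [[30, 20],   # e.g., men
            [25, 25]]   # e.g., women

row_totals = [sum(row) for row in observed]
col_totals = [sum(col) for col in zip(*observed)]
grand_total = sum(row_totals)

# Expected counts under the null hypothesis of no relationship:
# E[i][j] = row_total[i] * col_total[j] / grand_total
expected = [[rt * ct / grand_total for ct in col_totals]
            for rt in row_totals]

# Pearson's chi-square statistic: sum of (O - E)^2 / E over all cells
chi2 = sum((o - e) ** 2 / e
           for obs_row, exp_row in zip(observed, expected)
           for o, e in zip(obs_row, exp_row))
```

The statistic would then be compared against a chi-square critical value with (rows - 1) * (cols - 1) degrees of freedom; in practice `scipy.stats.chi2_contingency` performs all of these steps in one call.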
Article
We are surrounded by digital images of personal lives posted online. Changes in information and communications technology have enabled widespread sharing of personal photos, increasing access to aspects of private life previously less observable. Most studies of privacy online explore differences in individual privacy preferences. Here we examine privacy perceptions of online photos, considering both social norms (collectively shared expectations of privacy) and individual preferences. We conducted an online factorial vignette study on Amazon's Mechanical Turk (n = 279). Our findings show that people share common expectations about the privacy of online images, and that these privacy norms are socially contingent and multidimensional. Use of digital technologies to share personal photos is influenced by social context as well as individual preferences, while such sharing can affect the social meaning of privacy.
Conference Paper
Full-text available
Although privacy is broadly recognized as a dominant concern for the development of novel interactive technologies, our ability to reason analytically about privacy in real settings is limited. A lack of conceptual interpretive frameworks makes it difficult to unpack interrelated privacy issues in settings where information technology is also present. Building on theory developed by social psychologist Irwin Altman, we outline a model of privacy as a dynamic, dialectic process. We discuss three tensions that govern interpersonal privacy management in everyday life, and use these to explore select technology case studies drawn from the research literature. These suggest new ways for thinking about privacy in socio-technical environments as a practical matter.
Article
Full-text available
Global multimedia communications is advancing the freedom of information and knowledge. However, as the amount and variety of multimedia data generated through these applications increases, so do the risks associated with widespread accessibility and utilization of such data. Specifically, data may be used in a manner which users regard as an invasion of their privacy. The relationship between multimedia data and privacy invasion has not yet been clearly described. The main problem is that current approaches to privacy define characteristics of the data, and thus information, rather than how it is perceived by the users (Davies, 1997). Three years of research within this field have, however, identified that previous approaches to privacy protection are not addressing the real problems in this field. Most multimedia invasions of privacy are not intentional or malicious; rather, the designers failed to anticipate how the data could be used, by whom, and how this might affect users (Adams, 1999a & b; Adams & Sasse, 1999a & b). Seeking to address this problem, a model of the user perspective on privacy in multimedia environments has been identified. The model helps to determine which information users regard as private, from whom, and in which context. Trade-offs users make, thus rendering some privacy risks acceptable, are also identified. The model can assist designers and organizations utilizing multimedia communications to assess privacy implications, and thus develop mechanisms for acceptable use of the technology. It has been argued that there are many inalienable privacy rights which should never be disregarded when developing systems (Davies, 1997). Similarly, it is also maintained that privacy experts understand potential privacy risks at a greater depth than users (Bennett, 1997).
Both these arguments have directed privacy research, and the identification of privacy requirements in system development, towards appraisals by privacy advocates. The problem with only taking this approach is that any expert may have a distorted perception of a situation and of potential privacy risks that does not reflect the perceptions of those whose privacy needs protecting. Inaccurate assumptions are a major cause of unintentional invasions of privacy (Adams, 1999b; Adams & Sasse, 1999a/b).
Conference Paper
Full-text available
We conducted a questionnaire-based study of the relative importance of two factors, inquirer and situation, in determining the preferred accuracy of personal information disclosed through a ubiquitous computing system. We found that privacy preferences varied by inquirer more than by situation. That is, individuals were more likely to apply the same privacy preferences to the same inquirer in different situations than to apply the same privacy preferences to different inquirers in the same situation. We are applying these results to the design of a user interface for managing everyday privacy in ubiquitous computing.
Conference Paper
Full-text available
Privacy is a difficult design issue that is becoming increasingly important as we push into ubiquitous computing environments. While there is a fair amount of theoretical work on designing for privacy, there are few practical methods for helping designers create applications that provide end-users with a reasonable level of privacy protection that is commensurate with the domain, with the community of users, and with the risks and benefits to all stakeholders in the intended system. Towards this end, we propose privacy risk models as a general method for refining privacy from an abstract concept into concrete issues for specific applications and prioritizing those issues. In this paper, we introduce a privacy risk model we have developed specifically for ubiquitous computing, and outline two case studies describing our use of this privacy risk model in the design of two ubiquitous computing applications.
Conference Paper
Full-text available
Privacy is the most often-cited criticism of ubiquitous computing, and may be the greatest barrier to its long-term success. However, developers currently have little support in designing software architectures and in creating interactions that are effective in helping end-users manage their privacy. To address this problem, we present Confab, a toolkit for facilitating the development of privacy-sensitive ubiquitous computing applications. The requirements for Confab were gathered through an analysis of privacy needs for both end-users and application developers. Confab provides basic support for building ubiquitous computing applications, providing a framework as well as several customizable privacy mechanisms. Confab also comes with extensions for managing location privacy. Combined, these features allow application developers and end-users to support a spectrum of trust levels and privacy needs.
Conference Paper
Full-text available
Context-aware computing often involves tracking people's location. Many studies and applications highlight the importance of keeping people's location information private. We discuss two types of location-based services: location-tracking services, which are based on other parties tracking the user's location, and position-aware services, which rely on the device's knowledge of its own location. We present an experimental case study that examines people's concern for location privacy and compare this to the use of location-based services. We find that even though the perceived usefulness of the two different types of services is the same, location-tracking services generate more concern for privacy than position-aware services. We conclude that development emphasis should be given to position-aware services, but that location-tracking services have a potential for success if users are given a simple option for turning the location tracking off.
Article
Full-text available
Traditional typologies of consumer privacy concern suggest that consumers fall into three distinct groups: One-fourth of consumers are not concerned about privacy, one-fourth are highly concerned, and half are pragmatic, in that their concerns about privacy depend on the situation presented. This study examines online users to determine whether types of privacy concern online mirror the offline environment. An e-mail survey of online users examined perceived privacy concerns of 15 different situations involving collection and usage of personally identifiable information. Results indicate that the vast majority of online users are pragmatic when it comes to privacy. Further analysis of the data suggested that online users can be segmented into four distinct groups, representing differing levels of privacy concern. Distinct demographic differences were seen. Persons with higher levels of education are more concerned about their privacy online than persons with less education. Additionally, persons over the age of 45 years tended to be either not at all concerned about privacy or highly concerned about privacy. Younger persons tended to be more pragmatic. Content and policy implications are provided.
Article
Full-text available
To participate in meaningful privacy practice in the context of technical systems, people require opportunities to understand the extent of the systems' alignment with relevant practice and to conduct discernible social action through intuitive or sensible engagement with the system. It is a significant challenge to design for such understanding and action through the feedback and control mechanisms of today's devices. To help designers meet this challenge, we describe five pitfalls to beware when designing interactive systems—on or off the desktop—with personal privacy implications. These pitfalls are: obscuring potential information flow, obscuring actual information flow, emphasizing configuration over action, lacking coarse-grained control, and inhibiting existing practice. They are based on a review of the literature, on analyses of existing privacy-affecting systems, and on our own experiences designing a prototypical user interface for managing privacy in ubiquitous computing. We illustrate how some existing research and commercial systems—our prototype included—fall into these pitfalls, and how some avoid them.
Article
Full-text available
Mobile technology requires new methods for studying its use under realistic conditions "in the field." Reflexively, mobile technology also creates new opportunities for data collection while participants are remotely located. We report on our experiences with a variation on the paper-based diary study technique, which we extend by using voice-mail paired with mobile and landline telephony to more easily collect data in natural situations. We discuss lessons learned from experiences with voice-mail diary studies in two investigations of different scope. We also present suggestions for tailoring the technique to different research objectives, garnering high subject participation, and configuring the voice-mail system for data collection.
Article
Volume I contains the lectures of Fall 1964 through Fall 1967, in which Sacks explores a great variety of topics, from suicide to children's games to Medieval Hell as a mnemonic device to pronouns and paradoxes. But two key issues emerge: rules of conversational sequencing, central to the articulation of interaction, and membership categorization devices, central to the social organization of knowledge. This volume culminates in the extensive and formal explication of turn-taking which Sacks delivered in Fall 1967. Volume II contains the lectures of Spring 1968 through Spring 1972. Again he touches on a wide range of subjects, such as the poetics of ordinary talk, the integrative function of public tragedy, and pauses in spelling out a word. He develops a major new theme: storytelling in conversation, with an attendant focus on topic. His investigation of conversational sequencing continues, and this volume culminates in the elegant dissertation on adjacency pairs which Sacks delivered in Spring 1972. © 1992, 1995 by The Estate of Harvey Sacks. All rights reserved.
Article
Experience-sampling procedures enable researchers to record the momentary thoughts, feelings, and actions of people in daily life. The authors explain how palmtop computers have expanded the repertoire of experience-sampling techniques and reduced or eliminated some traditional problems with pen-and-paper methods. As a running example, they illustrate the capabilities of the Experience Sampling Program (ESP), their configurable, freely distributable software environment for designing and running experience-sampling studies on Palm Pilots and Windows CE palmtops.
Conference Paper
Communication of one's location as part of a social discourse is common practice, and we use a variety of technologies to satisfy this need. This practice suggests a potentially useful capability that technology may support more directly. We present such a social location disclosure service, Reno, designed for use on a common mobile phone platform. We describe the guiding principles that dictate parameters for creating a usable, useful and ubiquitous service, and we report on a pilot study of use of Reno for a realistic social network. Our preliminary results reveal the competing factors for a system that facilitates both manual and automatic location disclosure, and the role social context plays in making such a lightweight communication solution work.
Conference Paper
This paper provides detail on two key components of the Houdini framework under development at Bell Labs, that enable context-aware and privacy-conscious user data sharing appropriate for mobile and/or ubiquitous computing. The framework includes an approach for integrating data from diverse sources, for gathering user preferences for what data to share and when to share it, and a policy management infrastructure in the network for enforcing those preferences. The current paper focuses on two components of this infrastructure that are essential for mobile and ubiquitous computing, namely the framework to support self-provisioning of preferences, and the performance of the underlying rules engine.
Article
Ubiquitous computing's overarching goal is for technology to disappear into the background yet remain useful to users. A technique from the field of psychology - the experience sampling method - could help researchers improve ubiquitous computing's evaluation process.
Article
Introduction: Agents are increasingly upon us. Although opposition is rare, "intelligent agents" have been attacked for user interface problems, and on larger social issues. Agent supporters have countered these arguments and raised doubts about alternative technologies. We place this in historical, social, and ethical contexts, noting the cyclic nature of such debates. One conclusion is that many problems with artificial agents arise from a poor understanding of social aspects of human agents. Historical Perspectives: The history of technology has seen many movements call for human-like systems, and use anthropomorphic terminology to generate understanding and support. Such movements often make excessive claims, perhaps misled by their own rhetoric or their (sometimes impressive) partial success. This raises unrealistic expectations, which often leads to disappointment, which is surprisingly often followed by rebirth with similar goals and somewhat improved terminology and technology.
Article
Privacy is a necessary concern in electronic commerce. It is difficult, if not impossible, to complete a transaction without revealing some personal data: a shipping address, billing information, or product preference. Users may be unwilling to provide this necessary information, or even to browse online, if they believe their privacy is invaded or threatened. Fortunately, there are technologies to help users protect their privacy. P3P (Platform for Privacy Preferences Project) from the World Wide Web Consortium is one such technology. However, there is a need to know more about the range of user concerns and preferences about privacy in order to build usable and effective interface mechanisms for P3P and other privacy technologies. Accordingly, we conducted a survey of 381 U.S. Net users, detailing a range of commerce scenarios and examining the participants' concerns and preferences about privacy. This paper presents both the findings from that study and their design implications.
Consumer Privacy Attitudes: A Major Shift Since 2000 and Why
P&AB, "Consumer Privacy Attitudes: A Major Shift Since 2000 and Why," Privacy & American Business Newsletter, Vol. 10, No. 6, Sep. 2003.