Article

Investigating Privacy Concerns Related to Mobile Augmented Reality Apps – A Vignette Based Online Experiment

Authors:

Abstract

Augmented reality (AR) gained much public attention after the success of Pokémon Go in 2016 and has since found application in online games, social media, interior design, and other services. AR depends heavily on various sensors gathering real-time, context-specific personal information about users, causing new and more severe privacy threats compared to other technologies. These threats have to be investigated while AR is still shapeable in order to ensure users’ privacy and foster market adoption of privacy-friendly AR systems. To provide viable recommendations regarding the design of privacy-friendly AR systems, we follow a user-centric approach and investigate the role and causes of privacy concerns within the context of mobile AR (MAR) apps. We design a vignette-based online experiment, adapting ideas from the framework of contextual integrity, to analyze drivers of privacy concerns related to MAR apps, such as characteristics of permissions, trust-evoking signals, and AR-related contextual factors. The results of the large-scale experiment with 1,100 participants indicate that privacy concerns are mainly determined by the sensitivity of app permissions (i.e., whether sensitive resources on the smartphone are accessed) and the number of prior app downloads. Furthermore, we derive detailed practical and theoretical implications for developers, regulatory authorities, and future research.
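To make the vignette logic concrete, the following minimal sketch shows how crossing experimental factors produces a pool of scenario texts; the factor names, levels, and template wording are illustrative assumptions, not the instrument used in the study.

```python
# Illustrative sketch of a full-factorial vignette design (factor levels are
# hypothetical, not the levels used in the study).
from itertools import product
import random

factors = {
    "permission_sensitivity": ["non-sensitive (flashlight)", "sensitive (camera, location)"],
    "prior_downloads": ["500+", "1,000,000+"],
    "trust_signal": ["no seal", "a privacy seal"],
    "ar_context": ["AR furniture preview", "AR face-filter game"],
}

TEMPLATE = (
    "Imagine a mobile AR app for {ar_context}. It has {prior_downloads} downloads, "
    "shows {trust_signal}, and requests {permission_sensitivity} permissions."
)

# Cross all factor levels to obtain the vignette universe (2 x 2 x 2 x 2 = 16 cells).
vignettes = [dict(zip(factors, levels)) for levels in product(*factors.values())]

# Each participant would typically rate only a random subset to keep the survey short.
for cell in random.sample(vignettes, k=3):
    print(TEMPLATE.format(**cell))
```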


... (i.e., higher income levels correspond to lower privacy concerns) [52]. We hypothesize for the demographic variables: smartphone experience and experience with the respective mobile operating system are included as controls for participants' technical experience by modeling these variables as antecedents of privacy concerns [26]. We argue that participants with more experience on both dimensions have higher privacy concerns, as they might have witnessed more privacy-related breaches and attacks on smartphones [3,11]. ...
Chapter
The SARS-CoV-2 pandemic is a pressing societal issue today. The German government promotes a contact tracing app named Corona-Warn-App (CWA), aiming to change citizens’ health behavior during the pandemic by raising awareness about potential infections and enable infection chain tracking. Technical implementations, citizens’ perceptions, and public debates around apps differ between countries, i.e., in Germany there has been a huge discussion on potential privacy issues of the app. Thus, we analyze effects of privacy concerns regarding the CWA, perceived CWA benefits, and trust in the German healthcare system to answer why citizens use the CWA. We use a sample with 1,752 actual users and non-users and find support for the privacy calculus theory, i.e., individuals weigh privacy concerns and benefits in their use decision. Thus, citizens’ privacy perceptions about health technologies (e.g., shaped by public debates) are crucial as they can hinder adoption and negatively affect future fights against pandemics.
... AR Marketing involves data technologies that accentuate the question of when marketing has gone too far (Finnegan et al., 2021). The drastic data demands required for AR Marketing to be effectively implemented place even greater emphasis on the need to understand and protect consumer privacy (Cowan et al., 2021; Harborth & Pape, 2021). Further, other technologies, such as motion pictures or videos, offered consumers some ability to escape reality. ...
Article
Full-text available
Augmented Reality (AR) has received increased attention over the last years, both from managers and scholars alike. Various studies in the marketing discipline have tackled fragmented aspects of AR, such as its impact on sales or brands. Yet, a holistic approach to AR remains scarce. Therefore, the authors define “Augmented Reality Marketing” as a novel, strategic, and potentially disruptive subdiscipline in marketing. In conjunction, they discuss a nuanced customer journey model for AR Marketing strategy and propose the BICK FOUR framework (branding, inspiring, convincing, and keeping) as a tool to organize corresponding goals. Another contribution is the introduction of several fundamental differences between AR Marketing and traditional digital marketing concepts, such as redefining the reality concept (reduced reality, normal reality, and augmented reality in a metaverse context). Insights from 127 managers further enhance the current and future practices of AR Marketing. Finally, a discussion of ethical and legal considerations completes the assessment.
Article
Augmented reality (AR) has found application in online games, social media, interior design, and other services since the success of the smartphone game Pokémon Go in 2016. With recent news on the metaverse and the AR cloud, the contexts in which the technology is used become more and more ubiquitous. This is problematic, since AR requires various different sensors gathering real-time, context-specific personal information about the users, causing more severe and new privacy threats compared to other technologies. These threats can have adverse consequences on information self-determination and the freedom of choice and, thus, need to be investigated as long as AR is still shapeable. This communication paper takes on a bird’s eye perspective and considers the ethical concept of autonomy as the core principle to derive recommendations and measures to ensure autonomy. These principles are supposed to guide future work on AR suggested in this article, which is strongly needed in order to end up with privacy-friendly AR technologies in the future.
Article
Location-Based Services (LBSs) and Augmented Reality (AR) technologies are extensively adopted in various contexts such as Location-Based Games (LBGs). However, those technologies could increase information privacy concerns and perceived risks for users. Thus, privacy protection mechanisms are important. This study aims to explore the direct or indirect effects of self-efficacy to protect information privacy, privacy knowledge, privacy concerns, and perceived risks on privacy protection behaviours of an LBG's players and to investigate the different effects among two-player groups (full-time students and full-time employees). Three types of privacy protection behaviours are explored: fabricate, seek, and refrain behaviours. Data are gathered from 259 Pokémon GO's players. Confirmatory Factor Analysis (CFA), Structural Equation Modeling (SEM), and Multi-group analysis are applied to test the research hypotheses. Privacy knowledge, self-efficacy, privacy concerns, and perceived risks are confirmed as salient factors directly or indirectly influencing the privacy protection behaviour of players one way or another.
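For readers unfamiliar with the CFA/SEM workflow such studies rely on, a minimal sketch in Python is shown below, assuming the open-source semopy package; the construct and indicator names are placeholders, not the instrument actually used in that study.

```python
# Minimal CFA/SEM sketch with semopy (pip install semopy).
# Construct and indicator names are placeholders for illustration only.
import pandas as pd
import semopy

model_desc = """
# measurement model (CFA part)
PrivacyConcern      =~ pc1 + pc2 + pc3
PerceivedRisk       =~ pr1 + pr2 + pr3
ProtectionBehaviour =~ pb1 + pb2 + pb3

# structural model
PerceivedRisk       ~ PrivacyConcern
ProtectionBehaviour ~ PrivacyConcern + PerceivedRisk
"""

survey = pd.read_csv("survey_items.csv")   # one column per indicator (pc1, pc2, ...)

model = semopy.Model(model_desc)
model.fit(survey)
print(model.inspect())           # loadings and path estimates
print(semopy.calc_stats(model))  # fit indices such as CFI, TLI, RMSEA
```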
Article
This work analyzes the influence of privacy concerns and different dimensions of currency-related trust on individuals’ willingness to use Central Bank Digital Currency (CBDC), specifically a digital euro. A quantitative survey with 1034 respondents was analyzed using partial least squares structural equation model (PLS-SEM). Empirical results indicate that multiple antecedents are associated with privacy concerns in the digital euro that in turn influence intention to adopt a digital euro. Especially soft trust factors such as credibility and image are found to influence both privacy concerns and the intention to adopt a digital euro. It contributes to the current literature by introducing trust as a second-order construct composed of hard and soft trust factors for digital currencies. The results provide valuable insights for researchers and practitioners aiming at designing and implementing CBDCs by demonstrating which factors need to be considered in order to achieve widespread adoption by citizens.
Article
A contact tracing app can positively support the requirement of social and physical distancing during a pandemic. However, there are aspects of the user’s intention to download the app that remain under-researched. To address this, we investigate the role of perceived privacy risks, social empowerment, perceived information transparency and control, and attitudes towards government, in influencing the intention to download the contact tracing app. Using fuzzy set qualitative comparative analysis (fsQCA), we found eight different configurations of asymmetrical relationships of conditions that lead to the presence or absence of an intention to download. In our study, social empowerment significantly influences the presence of an intention to download. We also found that perceived information transparency significantly influences the absence of an intention to download the app.
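The core fsQCA quantities can be illustrated with a few lines of arithmetic: consistency asks how reliably a configuration of conditions is a subset of the outcome, and coverage asks how much of the outcome the configuration accounts for. The sketch below uses invented fuzzy membership scores and deliberately skips the truth-table minimization step.

```python
# Illustrative fsQCA arithmetic on made-up fuzzy membership scores (0..1).
import numpy as np

# Membership of five respondents in two conditions and in the outcome.
social_empowerment = np.array([0.9, 0.8, 0.2, 0.7, 0.6])
info_transparency  = np.array([0.3, 0.4, 0.9, 0.2, 0.5])
intention_download = np.array([0.8, 0.9, 0.1, 0.7, 0.6])

# Conjunction of conditions = element-wise minimum; negation = 1 - membership.
config = np.minimum(social_empowerment, 1 - info_transparency)

consistency = np.minimum(config, intention_download).sum() / config.sum()
coverage    = np.minimum(config, intention_download).sum() / intention_download.sum()

print(f"consistency = {consistency:.2f}, coverage = {coverage:.2f}")
```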
Chapter
Augmented reality (AR) is considered one of the top technologies that will revolutionize the future of education. Real-time interaction, different formats of visualization, and the merge of the real and digital world may open up new opportunities for teaching and learning. Although AR is easily accessible via mobile phones, the extent to which this technology will be adopted greatly depends on the user experience. The user reviews of mobile applications or so-called “apps” are a potential source of information for designers, software developers, and scholars interested in understanding the user experience. This study investigates the current state of the user experience of augmented reality apps by extracting and classifying the information from reviews published in the Google Play Store. A set of 116 educational mobile AR apps were mined from the Google Play Store, and a total of 1,752 user reviews were retrieved and classified. Results suggest developers of educational mobile AR apps need to solve technical problems, improve certain features, and provide more explicit instructions to users. Regardless of these needs, users recognize that these apps have great potential as educational tools. Future developments should focus on tackling these shortcomings, expanding the use of AR apps to more fields of education, and targeting specific audiences to extend the technology adoption.
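A review-mining pipeline of this kind can be bootstrapped in a few lines; the sketch below assumes the third-party google-play-scraper package and a placeholder app ID, and covers only the retrieval step, not the subsequent classification of reviews.

```python
# Sketch of retrieving Google Play reviews for later classification.
# Assumes the third-party package google-play-scraper (pip install google-play-scraper);
# the app ID below is a placeholder.
from google_play_scraper import Sort, reviews

result, _continuation = reviews(
    "com.example.educational_ar_app",   # placeholder package name
    lang="en",
    country="us",
    sort=Sort.NEWEST,
    count=200,
)

for r in result[:5]:
    print(r["score"], r["content"][:80])
```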
Conference Paper
Full-text available
Augmented reality (AR) greatly diffused into the public consciousness in the last years, especially due to the success of mobile applications like Pokémon Go. However, only few people experienced different forms of augmented reality like head-mounted displays (HMDs). Thus, people have only a limited actual experience with AR and form attitudes and perceptions towards this technology only partially based on actual use experiences, but mainly based on hearsay and narratives of others, like the media or friends. Thus, it is highly difficult for developers and product managers of AR solutions to address the needs of potential users. Therefore, we disentangle the perceptions of individuals with a focus on their concerns about AR. Perceived concerns are an important factor for the acceptance of new technologies. We address this research topic based on twelve intensive interviews with laymen as well as AR experts and analyze them with a qualitative research method.
Conference Paper
Full-text available
Augmented reality (AR) gained much public attention since the success of Pokémon Go in 2016. Technology companies like Apple or Google are currently focusing primarily on mobile AR (MAR) technologies, i.e., applications on mobile devices like smartphones or tablets. Associated privacy issues have to be investigated early to foster market adoption. This is especially relevant since past research found several threats associated with the use of smartphone applications. Thus, we investigate two of the main privacy risks for MAR application users based on a sample of 19 of the most downloaded MAR applications for Android. First, we assess threats arising from bad privacy policies based on a machine-learning approach. Second, we investigate which smartphone data resources are accessed by the MAR applications. Third, we combine both approaches to evaluate whether privacy policies cover certain data accesses or not. We provide theoretical and practical implications and recommendations based on our results.
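The second step of such an analysis, inspecting which protected smartphone resources an app may access, can be approximated by static analysis of the APK manifest. A minimal sketch using androguard (3.x import path) is shown below; the APK path is a placeholder and the set of sensitive permissions is an illustrative subset, not the paper's coding scheme.

```python
# Sketch: list requested permissions of an Android app and flag sensitive ones.
# Assumes the androguard library, 3.x import path (pip install androguard);
# the APK path is a placeholder.
from androguard.core.bytecodes.apk import APK

SENSITIVE = {  # illustrative subset of Android "dangerous" permissions
    "android.permission.CAMERA",
    "android.permission.ACCESS_FINE_LOCATION",
    "android.permission.RECORD_AUDIO",
    "android.permission.READ_CONTACTS",
}

apk = APK("some_mar_app.apk")  # placeholder path
requested = set(apk.get_permissions())

print("Requested permissions:", sorted(requested))
print("Sensitive permissions:", sorted(requested & SENSITIVE))
```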
Conference Paper
Full-text available
Companies are experiencing more and more pressure to increase productivity and quality while cutting costs in the digital era. The integration of innovative new technologies in the work process is crucial when transforming businesses to cope with these increasing requirements. In this research article, we investigate the current integration of such an innovative technology - augmented reality (AR) - in the manufacturing industry. For that purpose, we conduct a systematic literature review as well as a practically oriented search for augmented reality use cases in the field of manufacturing. We contribute to the current literature on augmented reality and digital transformation by analysing and synthesizing 95 articles and use cases to identify the current and potential future role of augmented reality in the manufacturing industry and its impact on different work processes. We show that theoretical proof-of-concept articles mostly focus on improving production operations, especially assembly processes, while the majority of practical use cases of currently applied AR solutions involve maintenance and inspection processes. Based on these findings, relevant future work opportunities for researchers as well as practitioners are derived.
Article
Full-text available
Privacy decision making has been examined in the literature from alternative perspectives. A dominant “normative” perspective has focused on rational processes by which consumers with stable preferences for privacy weigh the expected benefits of privacy choices against their potential costs. More recently, a behavioral perspective has leveraged theories from decision research to construe privacy decision making as a process in which cognitive heuristics and biases predictably occur. In a series of experiments, we compare the predictive power of these two perspectives by evaluating the impact of changes in the objective risk of disclosure and the impact of changes in the relative perceptions of risk of disclosure on both hypothetical and actual consumer privacy choices. We find that both relative and objective risks can, in fact, influence consumer privacy decisions. However, and surprisingly, the impact of objective changes in risk diminishes between hypothetical and actual choice settings. Vice versa, the impact of relative risk becomes more pronounced going from hypothetical to actual choice settings. Our results suggest a way to integrate diverse streams of the information systems literature on privacy decision making: in hypothetical choice contexts, relative to actual choice contexts, consumers may both overestimate their response to normative factors and underestimate their response to behavioral factors.
Article
Full-text available
Mixed reality (MR) technology is now gaining ground due to advances in computer vision, sensor fusion, and realistic display technologies. With most research and development focused on delivering the promise of MR, only a few efforts address the privacy and security implications of this technology. This survey paper aims to bring these risks to light and to review the latest security and privacy work on MR. Specifically, we list and review the different protection approaches that have been proposed to ensure user and data security and privacy in MR. We extend the scope to include work on related technologies such as augmented reality (AR), virtual reality (VR), and human-computer interaction (HCI) as crucial components, if not the origins, of MR, as well as a body of work from the larger area of mobile devices, wearables, and the Internet of Things (IoT). We highlight the lack of investigation, implementation, and evaluation of data protection approaches in MR. Further challenges and directions for MR security and privacy are also discussed.
Article
Full-text available
It is commonplace for those who support less restrictive privacy regulation on the collection and use of personal information to point out a paradox: in survey after survey, respondents express deep concern for privacy, oppose growing surveillance and data practices, and object to online tracking and behavioral advertising. Yet when confronted with actual choices involving the capture or exchange of information, few people demonstrate restraint: we sign up for frequent flyer and frequent buyer programs; we are carefree in our use of social networks and mobile apps; and we blithely hop from one website to the next, filling out forms, providing feedback, and contributing ratings. Privacy skeptics suggest that actions should be considered a truer indicator than words. Even if people are honest in their positive valuation of privacy in surveys, in action and behavior, they reveal even greater valuation of those benefits that might come at a privacy cost. In other words, people care about privacy, but not that much. The inconsistencies between survey responses and observed behaviors that skeptics gleefully observe require a nuanced interpretation—one that we have offered through our studies. We argue that the disconnect between actions and survey findings is not because people do not care about privacy, but because individuals' actions are finely modulated to contextual variables. Questions in surveys that do not include such important contextual variables explicitly are highly ambiguous. A more nuanced view of privacy is able to explain away a great deal of what skeptics claim is a divergence of behavior from stated preference and opinion. People care about and value privacy—privacy defined as respecting the appropriate norms of information flow for a given context. When respondents are given a chance to offer more fine-grained judgments about specific information-sharing situations, these judgments are quite nuanced. This is problematic since public policy relies on survey measurements of privacy concerns—such as Alan Westin's measurement of individuals as privacy 'pragmatists' or 'unconcerned'—to drive privacy regulations. Specifically, Westin's categories give credence to the regulation of privacy based on Fair Information Practice Principles (FIPPs), which relies heavily on assuring individuals notice and choice. We examine two historically influential measurements of privacy that have shaped discussion about public views and sentiments as well as practices, regulations, and policies: (1) surveys of individuals' ratings of 'sensitive' information and (2) Alan Westin's privacy categorization of individuals as fundamentalists, pragmatists, and unconcerned. In addition to replicating key components in these two survey streams, we used a factorial vignette survey to identify important contextual elements driving privacy expectations. A sample of 569 respondents rated how a series of vignettes, in which contextual elements of data recipient and data use had been systematically varied, met their privacy expectations. We find, first, that how well sensitive information meets privacy expectations is highly dependent on these contextual elements. Second, Westin's privacy categories proved relatively unimportant in relation to contextual elements in privacy judgments. Even privacy 'unconcerned' respondents rated the vignettes as not meeting privacy expectations on average, and respondents across categories had a common vision of what constitutes a privacy violation.
This study has important implications for public policy and research. For public policy, these results suggest that relying on one dimension—sensitive information or Westin's privacy categorization of respondents—is limiting. In particular, focusing on differences in privacy expectations across consumers obscures the common vision of what is appropriate use of information for consumers. This paper has significant public policy implications for the reliance on consumer choice as a necessary approach to accommodate consumer variance: our results suggest consumers agree as to the inappropriate use of information. Our study has called privacy concepts into question by showing that 'sensitivity' of information and 'concern' about privacy are not stable in the face of confounding variables: privacy categories and sensitivity labels prove to be highly influenced by the context and use of the situation. Our work demonstrates the importance of teasing out confounding variables in these historically influential studies.
Article
Full-text available
Provides a nontechnical introduction to the partial least squares (PLS) approach. As a logical base for comparison, the PLS approach to structural path estimation is contrasted with the covariance-based approach. In so doing, a set of considerations is provided with the goal of helping the reader understand the conditions under which it might be reasonable or even more appropriate to employ this technique. The chapter builds up from simple two-latent-variable models to a more complex one. The formal PLS model is provided along with a discussion of the properties of its estimates. An empirical example serves as a basis for highlighting the various analytic considerations when using PLS and the set of tests that one can employ in assessing the validity of a PLS-based model.
Article
Full-text available
Although mobile apps are already an influential medium in the new media industry as a whole, these apps have received little academic attention within the communication and marketing literature. This study develops and tests a hypothesized model to explain antecedents affecting app usage among smartphone users. The analysis of the structural equation model determined a final model with four significant factors (perceived informative and entertaining usefulness, perceived ease of use, and user review). Cost-effectiveness, a key variable of this study due to the particularity of 99-cent app price, had no influence on app usage. This study not only includes marketing implications but also offers insight into various theoretical applications to the field of mobile communication research by suggesting a conceptual model for the acceptance of mobile apps.
Book
Full-text available
The methods toolkit for study, research, and practice. The classic on research methods – thoroughly revised in its 5th edition, didactically improved, and more up to date than ever! This book is a well-founded and reliable companion for students, researchers, and practitioners. Everything included… • Foundations: quantitative and qualitative social research, philosophy of science, scientific quality criteria, and research ethics. • Application: all phases of the research process, from defining the research topic, study design, and operationalization through sampling, data collection, and data analysis methods to the presentation of results. • Advanced topics: effect size, statistical power and optimal sample size, meta-analyses, structural equation models, evaluation research. … and fundamentally revised – what is new: • Clarity: improved structure of the chapters and of the book as a whole. • Currency: contributions on online methods, mixed-methods designs, and other recent developments. • Learner-friendliness: many figures, tables, definition boxes, cartoons, exercises, and learning quizzes with solutions. • Practical relevance: real study examples from various social science and human science disciplines (e.g., psychology, communication science, education, medicine, sociology). With companion website: learning tools for students and materials for instructors.
Conference Paper
Full-text available
In a series of experiments, we examined how the timing impacts the salience of smartphone app privacy notices. In a web survey and a field experiment, we isolated different timing conditions for displaying privacy notices: in the app store, when an app is started, during app use, and after app use. Participants installed and played a history quiz app, either virtually or on their phone. After a distraction or delay they were asked to recall the privacy notice's content. Recall was used as a proxy for the attention paid to and salience of the notice. Showing the notice during app use significantly increased recall rates over showing it in the app store. In a follow-up web survey, we tested alternative app store notices, which improved recall but did not perform as well as notices shown during app use. The results suggest that even if a notice contains information users care about, it is unlikely to be recalled if only shown in the app store.
Article
Full-text available
Trust plays an important role in many Information Systems (IS)-enabled situations. Most IS research employs trust as a measure of interpersonal or person-to-firm relations, such as trust in a Web vendor or a virtual team member. Although trust in other people is important, this paper suggests that trust in the information technology (IT) itself also plays a role in shaping IT-related beliefs and behavior. To advance trust and technology research, this paper presents a set of trust in technology construct definitions and measures. We also empirically examine these construct measures using tests of convergent, discriminant, and nomological validity. This study contributes to the literature by providing: a) a framework that differentiates trust in technology from trust in people, b) a theory-based set of definitions necessary for investigating different kinds of trust in technology, and c) validated trust in technology measures useful to research and practice.
Article
Full-text available
Due to the amount of data that smartphone applications can potentially access, platforms enforce permission systems that allow users to regulate how applications access protected resources. If users are asked to make security decisions too frequently and in benign situations, they may become habituated and approve all future requests without regard for the consequences. If they are asked to make too few security decisions, they may become concerned that the platform is revealing too much sensitive information. To explore this tradeoff, we instrumented the Android platform to collect data regarding how often and under what circumstances smartphone applications are accessing protected resources regulated by permissions. We performed a 36-person field study to explore the notion of "contextual integrity," that is, how often are applications accessing protected resources when users are not expecting it? Based on our collection of 27 million data points and exit interviews with participants, we examine the situations in which users would like the ability to deny applications access to protected resources. We found out that at least 80% of our participants would have preferred to prevent at least one permission request, and overall, they thought that over a third of requests were invasive and desired a mechanism to block them.
Conference Paper
Full-text available
Utility that modern smartphone technology provides to individuals is most often enabled by technical capabilities that are privacy-affecting by nature, i.e. smartphone apps are provided with access to a multiplicity of sensitive resources required to implement context-sensitivity or personalization. Due to the ineffectiveness of current privacy risk communication methods applied in smartphone ecosystems, individuals' risk assessments are biased and accompanied with uncertainty regarding the potential privacy-related consequences of long-term app usage. Warning theory suggests that an explicit communication of potential consequences can reduce uncertainty and enable individuals to make better-informed cost-benefit trade-off decisions. We extend this design theory to the field of information privacy warning design by experimentally investigating the effects of explicitness in privacy warnings on individuals' perceived risk and trustworthiness of smartphone apps. Our results suggest that explicitness leads to more accurate risk and trust perceptions and provides an improved foundation for informed decision-making.
Article
Full-text available
We describe a qualitative study investigating the acceptability of the Google Glass eyewear computer to people with Parkinson's disease (PD). We held a workshop with 5 PD patients and 2 carers exploring perceptions of Glass. This was followed by 5-day field trials of Glass with 4 PD patients, where participants wore the device during everyday activities at home and in public. We report generally positive responses to Glass as a device to instil confidence and safety for this potentially vulnerable group. We also raise concerns related to the potential for Glass to reaffirm dependency on others and stigmatise wearers.
Article
Full-text available
Discriminant validity assessment has become a generally accepted prerequisite for analyzing relationships between latent variables. For variance-based structural equation modeling, such as partial least squares, the Fornell-Larcker criterion and the examination of cross-loadings are the dominant approaches for evaluating discriminant validity. By means of a simulation study, we show that these approaches do not reliably detect the lack of discriminant validity in common research situations. We therefore propose an alternative approach, based on the multitrait-multimethod matrix, to assess discriminant validity: the heterotrait-monotrait ratio of correlations. We demonstrate its superior performance by means of a Monte Carlo simulation study, in which we compare the new approach to the Fornell-Larcker criterion and the assessment of (partial) cross-loadings. Finally, we provide guidelines on how to handle discriminant validity issues in variance-based structural equation modeling.
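To make the heterotrait-monotrait ratio concrete: HTMT for two constructs is the mean absolute correlation between their indicators, divided by the geometric mean of each construct's average within-construct indicator correlation. A minimal sketch, assuming a pandas DataFrame of indicator scores with placeholder column names:

```python
# Sketch of the HTMT ratio for two constructs; indicator column names are placeholders.
import numpy as np
import pandas as pd

def htmt(df: pd.DataFrame, items_a: list, items_b: list) -> float:
    corr = df[items_a + items_b].corr().abs()

    # Heterotrait-heteromethod: correlations between items of different constructs.
    hetero = corr.loc[items_a, items_b].to_numpy().mean()

    # Monotrait-heteromethod: average correlation among items of the same construct
    # (off-diagonal entries only).
    def mono(items):
        block = corr.loc[items, items].to_numpy()
        return block[~np.eye(len(items), dtype=bool)].mean()

    return hetero / np.sqrt(mono(items_a) * mono(items_b))

# Usage sketch: values above roughly 0.85-0.90 are commonly read as a
# discriminant validity problem.
# ratio = htmt(survey_df, ["pc1", "pc2", "pc3"], ["tr1", "tr2", "tr3"])
```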
Conference Paper
Full-text available
Smartphone security research has produced many useful tools to analyze the privacy-related behaviors of mobile apps. However, these automated tools cannot assess people's perceptions of whether a given action is legitimate, or how that action makes them feel with respect to privacy. For example, automated tools might detect that a blackjack game and a map app both use one's location information, but people would likely view the map's use of that data as more legitimate than the game. Our work introduces a new model for privacy, namely privacy as expectations. We report on the results of using crowdsourcing to capture users' expectations of what sensitive resources mobile apps use. We also report on a new privacy summary interface that prioritizes and highlights places where mobile apps break people's expectations. We conclude with a discussion of implications for employing crowdsourcing as a privacy evaluation technique.
Article
Full-text available
This study seeks to clarify the nature of control in the context of information privacy to generate insights into the effects of different privacy assurance approaches on context-specific concerns for information privacy. We theorize that such effects are exhibited through mediation by perceived control over personal information and develop arguments in support of the interaction effects involving different privacy assurance approaches (individual self-protection, industry self-regulation, and government legislation). We test the research model in the context of location-based services using data obtained from 178 individuals in Singapore. In general, the results support our core assertion that perceived control over personal information is a key factor affecting context-specific concerns for information privacy. In addition to enhancing our theoretical understanding of the link between control and privacy concerns, these findings have important implications for service providers and consumers as well as for regulatory bodies and technology developers.
Article
Full-text available
The use of mobile applications continues to experience exponential growth. Using mobile apps typically requires the disclosure of location data, which often accompanies requests for various other forms of private information. Existing research on information privacy has implied that consumers are willing to accept privacy risks for relatively negligible benefits, and the offerings of mobile apps based on location-based services (LBS) appear to be no different. However, until now, researchers have struggled to replicate realistic privacy risks within experimental methodologies designed to manipulate independent variables. Moreover, minimal research has successfully captured actual information disclosure over mobile devices based on realistic risk perceptions. The purpose of this study is to propose and test a more realistic experimental methodology designed to replicate real perceptions of privacy risk and capture the effects of actual information disclosure decisions. As with prior research, this study employs a theoretical lens based on privacy calculus. However, we draw more detailed and valid conclusions due to our use of improved methodological rigor. We report the results of a controlled experiment involving consumers (n=1025) in a range of ages, levels of education, and employment experience. Based on our methodology, we find that only a weak, albeit significant, relationship exists between information disclosure intentions and actual disclosure. In addition, this relationship is heavily moderated by the consumer practice of disclosing false data. We conclude by discussing the contributions of our methodology and the possibilities for extending it for additional mobile privacy research.
Article
Full-text available
We test the hypothesis that increasing individuals’ perceived control over the release and access of private information—even information that allows them to be personally identified––will increase their willingness to disclose sensitive information. If their willingness to divulge increases sufficiently, such an increase in control can, paradoxically, end up leaving them more vulnerable. Our findings highlight how, if people respond in a sufficiently offsetting fashion, technologies designed to protect them can end up exacerbating the risks they face.
Article
Full-text available
Online users often need to make adoption decisions without accurate information about the product values. An informational cascade occurs when it is optimal for an online user, having observed others' actions, to follow the adoption decision of the preceding individual without regard to his own information. Informational cascades are often rational for individual decision making; however, they may lead to adoption of inferior products. With easy availability of information about other users' choices, the Internet offers an ideal environment for informational cascades. In this paper, we empirically examine informational cascades in the context of online software adoption. We find user behavior in adopting software products is consistent with the predictions of the informational cascades literature. Our results demonstrate that online users' choices of software products exhibit distinct jumps and drops with changes in download ranking, as predicted by informational cascades theory. Furthermore, we find that user reviews have no impact on user adoption of the most popular product, while having an increasingly positive impact on the adoption of lower ranking products. The phenomenon persists after controlling for alternative explanations such as network effects, word-of-mouth effects, and product diffusion. Our results validate informational cascades as an important driver for decision making on the Internet. The finding also offers an explanation for the mixed results reported in prior studies with regard to the influence of online user reviews on product sales. We show that the mixed results could be due to the moderating effect of informational cascades.
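The underlying mechanism can be illustrated with the classic sequential-choice model of informational cascades: each user receives a noisy private signal about product quality, observes predecessors' adoption choices, and at some point the observed history outweighs any single private signal, so everyone herds. The simulation below is a textbook-style illustration with a simple counting rule, not the paper's empirical model.

```python
# Textbook-style information cascade simulation (not the paper's empirical model).
# Agents adopt if (adoptions observed - rejections observed + own signal) > 0,
# a simple counting rule under which herding emerges once the action history
# outweighs any single private signal.
import random

def simulate_cascade(n_agents=30, quality_good=True, signal_accuracy=0.7, seed=1):
    random.seed(seed)
    adoptions = rejections = 0
    history = []
    for _ in range(n_agents):
        # Private signal points to the true quality with probability signal_accuracy.
        signal = 1 if (random.random() < signal_accuracy) == quality_good else -1
        score = (adoptions - rejections) + signal
        adopt = score > 0 if score != 0 else (signal == 1)  # tie: follow own signal
        history.append(adopt)
        adoptions += adopt
        rejections += not adopt
    return history

history = simulate_cascade()
print("".join("A" if a else "R" for a in history))  # long runs indicate a cascade
```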
Article
Full-text available
Information Security (InfoSec) research is far reaching and includes many approaches to deal with protecting and mitigating threats to the information assets and technical resources available within computer based systems. Although a predominant weakness in properly securing information assets is the individual user within an organization, much of the focus of extant security research is on technical issues. The purpose of this paper is to highlight future directions for Behavioral InfoSec research, which is a newer, growing area of research. The ensuing paper presents information about challenges currently faced and future directions that Behavioral InfoSec researchers should explore. These areas include separating insider deviant behavior from insider misbehavior, approaches to understanding hackers, improving information security compliance, cross-cultural Behavioral InfoSec research, and data collection and measurement issues in Behavioral InfoSec research.
Article
Full-text available
Structural equation modeling (SEM) has become a quasi-standard in marketing and management research when it comes to analyzing the cause-effect relations between latent constructs. For most researchers, SEM is equivalent to carrying out covariance-based SEM (CB-SEM). While marketing researchers have a basic understanding of CB-SEM, most of them are only barely familiar with the other useful approach to SEM-partial least squares SEM (PLS-SEM). The current paper reviews PLS-SEM and its algorithm, and provides an overview of when it can be most appropriately applied, indicating its potential and limitations for future research. The authors conclude that PLS-SEM path modeling, if appropriately applied, is indeed a "silver bullet" for estimating causal models in many theoretical models and empirical data situations.
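The composite-based intuition behind PLS-SEM can be conveyed with a deliberately simplified sketch: standardize the indicators, aggregate them into construct scores, and regress the endogenous composite on its predictors to obtain standardized path coefficients. Real PLS-SEM additionally iterates the indicator weights (as in SmartPLS); the unit-weighted version below, with placeholder item names, only illustrates the logic.

```python
# Deliberately simplified, unit-weighted composite sketch of the PLS-SEM logic.
# Real PLS-SEM iteratively re-estimates indicator weights; item names are placeholders.
import numpy as np
import pandas as pd

def composite(df: pd.DataFrame, items: list) -> pd.Series:
    z = (df[items] - df[items].mean()) / df[items].std(ddof=0)  # standardize items
    score = z.mean(axis=1)                                      # unit-weighted composite
    return (score - score.mean()) / score.std(ddof=0)

def path_coefficients(y: pd.Series, X: pd.DataFrame) -> pd.Series:
    # OLS on standardized scores yields standardized path coefficients.
    beta, *_ = np.linalg.lstsq(X.to_numpy(), y.to_numpy(), rcond=None)
    return pd.Series(beta, index=X.columns)

# Usage sketch (survey_df holds the raw indicator columns):
# concerns = composite(survey_df, ["pc1", "pc2", "pc3"])
# trust    = composite(survey_df, ["tr1", "tr2", "tr3"])
# intent   = composite(survey_df, ["in1", "in2", "in3"])
# print(path_coefficients(intent, pd.DataFrame({"concerns": concerns, "trust": trust})))
```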
Article
Full-text available
To date, many important threads of information privacy research have developed, but these threads have not been woven together into a cohesive fabric. This paper provides an interdisciplinary review of privacy-related research in order to enable a more cohesive treatment. With a sample of 320 privacy articles and 128 books and book sections, we classify previous literature in two ways: (1) using an ethics-based nomenclature of normative, purely descriptive, and empirically descriptive, and (2) based on their level of analysis: individual, group, organizational, and societal. Based upon our analyses via these two classification approaches, we identify three major areas in which previous research contributions reside: the conceptualization of information privacy, the relationship between information privacy and other constructs, and the contextual nature of these relationships. As we consider these major areas, we draw three overarching conclusions. First, there are many theoretical developments in the body of normative and purely descriptive studies that have not been addressed in empirical research on privacy. Rigorous studies that either trace processes associated with, or test implied assertions from, these value-laden arguments could add great value. Second, some of the levels of analysis have received less attention in certain contexts than have others in the research to date. Future empirical studies — both positivist and interpretive — could profitably be targeted to these under-researched levels of analysis. Third, positivist empirical studies will add the greatest value if they focus on antecedents to privacy concerns and on actual outcomes. In that light, we recommend that researchers be alert to an overarching macro-model that we term APCO (Antecedents -> Privacy Concerns -> Outcomes).
Article
Recent privacy-related incidents of mobile services have shown that app stores and providers face the challenge of mobile users' information privacy concerns, which can prevent users from installing mobile apps or induce them to uninstall an app. In this paper, we investigate the role of app permission requests and compare the impact on privacy concerns with other antecedents of information privacy concerns, i.e., prior privacy experience, computer anxiety, and perceived control. To test these effects empirically, we conducted an online survey with 775 participants. Results of our structural equation modeling show that prior privacy experience, computer anxiety, and perceived control have significant effects on privacy concerns. However, concerns for app permission requests have approximately twice as much predictive value as the other factors put together in explaining mobile users' overall information privacy concerns. We expect that our findings can provide a theoretical contribution for future mobile privacy research as well as practical implications for app stores and providers.
Article
We shed light on a money-for-privacy trade-off in the market for smartphone applications (“apps”). Developers offer their apps at lower prices in return for greater access to personal information, and consumers choose between low prices and more privacy. We provide evidence for this pattern using data from 300,000 apps obtained from the Google Play Store (formerly Android Market) in 2012 and 2014. Our findings show that the market’s supply and demand sides both consider an app’s ability to collect private information, measured by the app’s use of privacy-sensitive permissions: (1) cheaper apps use more privacy-sensitive permissions; (2) given price and functionality, demand is lower for apps with sensitive permissions; and (3) the strength of this relationship depends on contextual factors, such as the targeted user group, the app’s previous success, and its category. Our results are robust and consistent across several robustness checks, including the use of panel data, a difference-in-differences analysis, “twin” pairs of apps, and various measures of privacy-sensitivity and app demand.
Chapter
We investigate privacy concerns and the privacy behavior of users of the AR smartphone game Pokémon Go. Pokémon Go accesses several functionalities of the smartphone and, in turn, collects a plethora of data of its users. For assessing the privacy concerns, we conduct an online study in Germany with 683 users of the game. The results indicate that the majority of the active players are concerned about the privacy practices of companies. This result hints towards the existence of a cognitive dissonance, i.e. the privacy paradox. Since this result is common in the privacy literature, we complement the first study with a second one with 199 users, which aims to assess the behavior of users with regard to which measures they undertake for protecting their privacy. The results are highly mixed and dependent on the measure, i.e. relatively many participants use privacy-preserving measures when interacting with their smartphone. This implies that many users know about risks and might take actions to protect their privacy, but deliberately trade-off their information privacy for the utility generated by playing the game.
Article
Please use my homepage to get access to this article: http://vous-etes-ici.net/wp-content/uploads/2018/04/BenthalletalTrends.pdf
Article
We shed light on a money-for-privacy trade-off in the market for smartphone applications ("apps"). Developers offer their apps cheaper in return for greater access to personal information, and consumers choose between lower prices and more privacy. We provide evidence for this pattern using data on 300,000 mobile applications which were obtained from the Android Market in 2012 and 2014. We augmented these data with information from Alexa.com and Amazon Mechanical Turk. Our findings show that both the market's supply and the demand side consider an app's ability to collect private information, measured by their use of privacy-sensitive permissions: (1) cheaper apps use more privacy-sensitive permissions; (2) installation numbers are lower for apps with sensitive permissions; (3) circumstantial factors, such as the reputation of app developers, mitigate the strength of this relationship. Our results emerge consistently across several robustness checks, including the use of panel data analysis, the use of selected matched "twin"-pairs of apps and the use of various alternative measures of privacy-sensitiveness.
Article
A Primer on Partial Least Squares Structural Equation Modeling (PLS-SEM), by Hair, Hult, Ringle, and Sarstedt, provides a concise yet very practical guide to understanding and using PLS structural equation modeling (PLS-SEM). PLS-SEM is evolving as a statistical modeling technique and its use has increased exponentially in recent years within a variety of disciplines, due to the recognition that PLS-SEM’s distinctive methodological features make it a viable alternative to the more popular covariance-based SEM approach. This text includes extensive examples on SmartPLS software, and is accompanied by multiple data sets that are available for download from the accompanying website (www.pls-sem.com).
Article
In the mobile age, protecting users' information from privacy-invasive apps becomes increasingly critical. To precaution users against possible privacy risks, a few Android app stores prominently disclose app permission requests on app download pages. Focusing on this emerging practice, this study investigates the effects of contextual cues (perceived permission sensitivity, permission justification and perceived app popularity) on Android users' privacy concerns, download intention, and their contingent effects dependent on users' mobile privacy victim experience. Drawing on Elaboration Likelihood Model, our empirical results suggest that perceived permission sensitivity makes users more concerned about privacy, while permission justification and perceived app popularity make them less concerned. Interestingly, users' mobile privacy victim experience negatively moderates the effect of permission justification. In particular, the provision of permission justification makes users less concerned about their privacy only for those with less mobile privacy victim experience. Results also reveal a positive effect of perceived app popularity and a negative effect of privacy concerns on download intention. This study provides a better understanding of Android users' information processing and the formation of their privacy concerns in the app download stage, and proposes and tests emerging privacy protection mechanisms including the prominent disclosure of app permission requests and the provision of permission justifications.
Article
Retail settings are being challenged to become smarter and provide greater value to both consumers and retailers. An increasingly recognised approach having potential for enabling smart retail is mobile augmented reality (MAR) apps. In this research, we seek to describe and discover how, why and to what extent MAR apps contribute to smart retail settings by creating additional value to customers as well as benefiting retailers. In particular, by adopting a retail customer experience perspective on value creation, analysing the content of MAR shopping apps currently available, and conducting large-scale surveys on United States smartphone users representing early technology adopters, we assess level of use, experiential benefits offered, and retail consequences. Our findings suggest that take-up is set to go mainstream as user satisfaction is relatively high and their use provides systematic experiential benefits along with advantages to retailers. Despite some drawbacks, their use is positively associated with multiple retail consequences. MAR apps are seen as changing consumer behaviour and are associated with increasingly high user valuations of retailers offering them. Implications for more effective use to enable smart retail settings are discussed.
Article
This research investigates the influence of core self-evaluations (CSE), stickiness, positive emotion, and trust on smartphone users’ intentions to download free social media apps. An online questionnaire was used to collect data, and 477 valid questionnaires were collected. The outcomes show that CSE and smartphone users’ stickiness significantly influence their positive emotion. Compared with CSE, stickiness plays the key role in affecting users’ emotions. Smartphone users’ emotions are found to positively influence their trust, which in turn positively influences their intentions to download free social media apps. The findings provide insights into how an app developer can improve users’ emotions and their associated behaviours.
Article
Modern smartphone platforms offer a multitude of useful features to their users but at the same time they are highly privacy affecting. However, smartphone platforms are not effective in properly communicating privacy risks to their users. Furthermore, common privacy risk communication approaches in smartphone app ecosystems do not consider the actual data-access behavior of individual apps in their risk assessments. Beyond privacy risks such as the leakage of single information (first-order privacy risk), we argue that privacy risk assessments and risk communication should also consider threats to user privacy coming from user-profiling and data-mining capabilities based on the long-term data-access behavior of apps (second-order privacy risk). In this paper, we introduce Styx, a novel privacy risk communication system for Android that provides users with privacy risk information based on the second-order privacy risk perspective. We discuss results from an experimental evaluation of Styx regarding its effectiveness in risk communication and its effects on user perceptions such as privacy concerns and the trustworthiness of a smartphone. Our results suggest that privacy risk information provided by Styx improves the comprehensibility of privacy risk information and helps the users in comparing different apps regarding their privacy properties. The results further suggest that an improved privacy risk communication on smartphones can increase trust towards a smartphone and reduce privacy concern.
Article
This Review summarizes and draws connections between diverse streams of empirical research on privacy behavior. We use three themes to connect insights from social and behavioral sciences: people's uncertainty about the consequences of privacy-related behaviors and their own preferences over those consequences; the context-dependence of people's concern, or lack thereof, about privacy; and the degree to which privacy concerns are malleable—manipulable by commercial and governmental interests. Organizing our discussion by these themes, we offer observations concerning the role of public policy in the protection of privacy in the information age. Copyright © 2015, American Association for the Advancement of Science.
Article
This paper studies Facebook users’ learning-based attitude formation and the relationship between member attitude and self-disclosure. Through the theoretical lens of learning theories, we recognize the key antecedents to member attitude toward a social networking site as stemming from classical conditioning, operant conditioning, and social learning-related factors. In addition, we explore the underlying process through which member attitude affects self-disclosure extent and theorize the mediating role of site usage rate on the relationship between attitude and self-disclosure extent. Analysis of 822 survey responses provides strong support for the role of learning theories in explaining Facebook members’ attitude development. The results also confirm a significant, partial mediating effect of site usage rate. A series of post-hoc analyses on gender differences further reveals that attitude formation mechanisms remain constant between male and female Facebook users; gender differences exist in the association between attitude and self-disclosure extent and in the association between site usage rate and self-disclosure extent; and the mediating effect of site usage rate exists in the male user group only. Our research, therefore, contributes to the literature on social networking sites, as well as providing behavioral analysis useful to the service providers of these sites.
Article
In Apple's iOS 6, when an app requires access to a protected resource (e.g., location or photos), the user is prompted with a permission request that she can allow or deny. These permission request dialogs include space for developers to optionally include strings of text to explain to the user why access to the resource is needed. We examine how app developers are using this mechanism and the effect that it has on user behavior. Through an online survey of 772 smartphone users, we show that permission requests that include explanations are significantly more likely to be approved. At the same time, our analysis of 4,400 iOS apps shows that the adoption rate of this feature by developers is relatively small: Around 19% of permission requests include developer-specified explanations. Finally, we surveyed 30 iOS developers to better understand why they do or do not use this feature.
Article
Augmented reality (AR) devices are poised to enter the market. It is unclear how the properties of these devices will affect individuals' privacy. In this study, we investigate the privacy perspectives of individuals when they are bystanders around AR devices. We conducted 12 field sessions in cafés and interviewed 31 bystanders regarding their reactions to a co-located AR device. Participants were predominantly split between having indifferent and negative reactions to the device. Participants who expressed that AR devices change the bystander experience attributed this difference to subtleness, ease of recording, and the technology's lack of prevalence. Additionally, participants surfaced a variety of factors that make recording more or less acceptable, including what they are doing when the recording is being taken. Participants expressed interest in being asked permission before being recorded and in recording-blocking devices. We use the interview results to guide an exploration of design directions for privacy-mediating technologies.
Conference Paper
Smartphones have unprecedented access to sensitive personal information. While users report having privacy concerns, they may not actively consider privacy while downloading apps from smartphone application marketplaces. Currently, Android users have only the Android permissions display, which appears after they have selected an app to download, to help them understand how applications access their information. We investigate how permissions and privacy could play a more active role in app-selection decisions. We designed a short "Privacy Facts" display, which we tested in a 20-participant lab study and a 366-participant online experiment. We found that by bringing privacy information to the user when they were making the decision and by presenting it in a clearer fashion, we could assist users in choosing applications that request fewer permissions.
Conference Paper
Perceptual, "context-aware" applications that observe their environment and interact with users via cameras and other sensors are becoming ubiquitous on personal computers, mobile phones, gaming platforms, household robots, and augmented-reality devices. This raises new privacy risks. We describe the design and implementation of DARKLY, a practical privacy protection system for the increasingly common scenario where an untrusted, third-party perceptual application is running on a trusted device. DARKLY is integrated with OpenCV, a popular computer vision library used by such applications to access visual inputs. It deploys multiple privacy protection mechanisms, including access control, algorithmic privacy transforms, and user audit. We evaluate DARKLY on 20 perceptual applications that perform diverse tasks such as image recognition, object tracking, security surveillance, and face detection. These applications run on DARKLY unmodified or with very few modifications and minimal performance overheads vs. native OpenCV. In most cases, privacy enforcement does not reduce the applications' functionality or accuracy. For the rest, we quantify the tradeoff between privacy and utility and demonstrate that utility remains acceptable even with strong privacy protection.
Article
This paper aims to predict consumer acceptance of e-commerce by proposing a set of key drivers for engaging consumers in on-line transactions. The primary constructs for capturing consumer acceptance of e-commerce are intention to transact and on-line transaction behavior. Following the theory of reasoned action (TRA) as applied to a technology-driven environment, technology acceptance model (TAM) variables (perceived usefulness and ease of use) are posited as key drivers of e-commerce acceptance. The practical utility of TAM stems from the fact that e-commerce is technology-driven. The proposed model integrates trust and perceived risk, which are incorporated given the implicit uncertainty of the e-commerce environment. The proposed integration of the hypothesized independent variables is justified by placing all the variables under the nomological TRA structure and proposing their interrelationships. The resulting research model is tested using data from two empirical studies. The first, exploratory study comprises three experiential scenarios with 103 students. The second, confirmatory study uses a sample of 155 on-line consumers. Both studies strongly support the e-commerce acceptance model by validating the proposed hypotheses. The paper discusses the implications for e-commerce theory, research, and practice, and makes several suggestions for future research.
Article
Reluctance to provide personal health information could impede the success of web-based healthcare services. This paper focuses on the role of personal dispositions in disclosing health information online. The conceptual model argues that individuals' intention to disclose such information depends on their trust, privacy concern, and information sensitivity, which are determined by personal dispositions—personality traits, information sensitivity, health status, prior privacy invasions, risk beliefs, and experience—acting as intrinsic antecedents of trust. The data (collected via a lab experiment) and the analysis shed light on the role of personal dispositions. This could assist in enhancing healthcare websites and increase the success of online delivery of health services.