Article

Investigating Privacy Concerns Related to Mobile Augmented Reality Apps – A Vignette Based Online Experiment

Authors:
  • David Harborth (Capgemini Invent)
  • Sebastian Pape (Continental Automotive Technologies GmbH)
To read the full-text of this research, you can request a copy directly from the authors.

Abstract

Augmented reality (AR) gained much public attention after the success of Pokémon Go in 2016, and has since found application in online games, social media, interior design, and other services. AR depends heavily on a variety of sensors that gather real-time, context-specific personal information about users, causing new and more severe privacy threats than other technologies. These threats must be investigated while AR is still malleable in order to ensure users’ privacy and foster market adoption of privacy-friendly AR systems. To provide viable recommendations for the design of privacy-friendly AR systems, we follow a user-centric approach and investigate the role and causes of privacy concerns within the context of mobile AR (MAR) apps. We design a vignette-based online experiment, adapting ideas from the framework of contextual integrity, to analyze drivers of privacy concerns related to MAR apps, such as characteristics of permissions, trust-evoking signals, and AR-related contextual factors. The results of the large-scale experiment with 1,100 participants indicate that privacy concerns are mainly determined by the sensitivity of app permissions (i.e., whether sensitive resources on the smartphone are accessed) and the number of prior app downloads. Furthermore, we derive detailed practical and theoretical implications for developers, regulatory authorities, and future research.
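The full-factorial vignette logic behind such an experiment can be sketched in a few lines. Note that the factor names and levels below are illustrative assumptions for exposition, not the study's actual design:

```python
from itertools import product

# Hypothetical vignette factors (illustrative only; the study's actual
# factors and levels may differ).
factors = {
    "permission": ["camera", "location", "contacts"],   # sensitivity of the accessed resource
    "justification": ["none", "purpose explained"],     # transmission-principle manipulation
    "downloads": ["500+", "1,000,000+"],                # trust-evoking signal
}

def build_vignettes(factors):
    """Return one vignette dict per cell of the full-factorial design."""
    names = list(factors)
    return [dict(zip(names, levels)) for levels in product(*factors.values())]

vignettes = build_vignettes(factors)
print(len(vignettes))  # 3 * 2 * 2 = 12 experimental cells
```

Each participant is then shown one (or a subset) of these cells, so that the influence of each factor on reported privacy concerns can be estimated independently.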


... The privacy calculus theory (PCT) posits that individuals evaluate the perceived risks and benefits of privacy before deciding to disclose personal information (Dienlin & Metzger, 2016). While AR-powered experiences have proven useful for making purchase decisions (Anifa & Sanaji, 2022), consumers make privacy-related decisions by weighing the potential benefits of sharing their information against the risks associated with user data disclosure (Harborth & Pape, 2021). In the context of AR-powered advertising, the tension between delivering personalized experiences and safeguarding users' privacy is heightened: users are encouraged to share personal data to access personalized recommendations and retail benefits (Barth et al., 2022), while privacy concerns about AR may also hinder future behavioral intentions (Chang et al., 2024; Cowan et al., 2021). ...
... Drawing from Privacy Calculus Theory (PCT), consumers may weigh the immersive benefits against heightened privacy risks when biometric and environmental data are collected (Acquisti et al., 2014; Boerman et al., 2017). Unlike traditional personalization methods, AR introduces distinct privacy concerns as it integrates with AI to enable real-time customization and personalized recommendations through spatial and personal data analysis (Harborth & Pape, 2021). This literature review examines how AR's immersive benefits compete with intrusiveness concerns in shaping consumer attitudes and behaviors. ...
... Bonnin (2020) found that individuals more familiar with AR technology tend to trust it more for making purchase decisions. Additionally, privacy concerns have been identified as a moderator of the persuasiveness of AR-powered experiences (Feng & Xie, 2019; Harborth & Pape, 2021), with individuals with higher privacy concerns perceiving AR as more intrusive. To address potential confounding factors, we controlled participants' familiarity with AR technology using three semantic differential items ("Unfamiliar - Familiar," "Inexperienced - Experienced," and "Not knowledgeable - Knowledgeable"). ...
Article
Full-text available
In e-commerce, Augmented Reality (AR) employs computer vision and artificial intelligence (AI) techniques to enhance the shopping experience through personalized recommendations based on users’ physical data. However, concerns regarding privacy and perceived intrusiveness can undermine the persuasive appeal of personalized AR for e-commerce experiences. Can overtly communicating information transparency mitigate such concerns? Two between-subjects online experiments were conducted, revealing that consumers were drawn to the AR-powered e-commerce site due to its heightened perceived immersion and usefulness but also considered it more intrusive than the non-AR site. However, when both websites were AR-powered, the one with high information transparency significantly enhanced perceived privacy protection, reducing perceived intrusiveness. Perceived transparency positively mediated the effects, particularly among individuals with lower pre-existing trust in algorithms. These findings have implications for theory pertaining to the use of AR in strategic communications and the design of information transparency for AR-powered e-commerce.
... These sensors are required for the functioning of the technologies, but represent risks for users' privacy, as discussed for each technology in isolation [7]. [Table: overview of related user studies ([9] 2018; Rauschnabel et al. [10] 2018; Adams et al. [11] 2018; Maloney et al. [12] 2020; Cowan et al. [13] 2021; Harborth and Pape [14] 2021, N = 1,100; Abraham et al. [15] 2022; Sykownik et al. [16] 2022; O'Hagan et al. [17] 2022), with columns for year, covered technologies, sample size, and study type.] ...
... Lastly, most user studies on privacy perceptions [8,9,10,11,12,13,14,16,17,18,19,20] focus on a single technology in XR, i.e., either MAR, HMD-based AR/MR, or VR. To the best of our knowledge, only two studies [15,21] gather privacy and security insights on XR as a whole. ...
... Need of Cameras to Track Planes in XR. P4, P8, P13, P14, P17, and P19 do not understand why the MAR variant requests access to the camera. "I could have had exactly the same experience without the camera being on. ...
Article
Full-text available
Extended Reality (XR) devices are becoming available in all shapes and forms. They are foreseen to be the backbone of the Metaverse, which is expected to increasingly lead toward more interconnected XR experiences in the future. However, these devices include a large number of sensors that collect sensitive data about users and their surroundings, thus posing threats to their privacy. Until now, research on how users perceive these threats has focused on either Mobile Augmented Reality (MAR), Mixed Reality (MR), or Virtual Reality (VR). Still, adopting a global vision including all these technologies, i.e., XR, is necessary to understand the potential differences in privacy between users that future cross-platform experiences may cause. This understanding is needed to bring usable Privacy-Enhancing Technologies (PETs) to XR users. In this paper, we therefore consider different XR technologies together and analyze users’ related privacy perceptions. By doing so, we observe differences and similarities between these technologies by comparing them against each other. In our study, 20 participants visited a virtual house guided by a real-estate agent, using a cross-platform application that we developed for (1) Android (MAR), (2) Microsoft Hololens (MR), and (3) Meta Quest 2 (VR). Each participant tested our application with two of these devices. We then conducted semi-structured interviews to gather comparisons and insights on their experience with both technologies, including permission requests, sensor data collection, and privacy perceptions. Our findings suggest that our participants are more concerned about MAR and MR than VR. They were less aware of the use of camera and eye tracking data than of microphone data in the context of our application. In addition, half of our participants were more concerned about XR than more common technologies (i.e., computers, smartphones), despite overall low concern about XR and low awareness of biometric data sensitivity. These insights underline aspects that must be developed further to raise XR users’ awareness and help them better control their privacy, such as more adapted sets of permissions to track surfaces in XR.
... Second, it is acknowledged that the most significant factors in limiting technological behavior adoption are privacy and physical risks. According to Harborth and Pape (2021), studies have shown that individuals worry about their privacy when using AR. Privacy concerns encompass issues such as the potential for individuals to be recorded by AR devices without their consent, the involuntary distribution of personal data, and the increased surveillance that may result from the use of these devices (Dacko, 2017; Harborth and Pape, 2018). ...
... Malhotra et al. (2004) defined privacy risk as a factor that reflects consumers' natural worries about the possible exposure of personal data when using a certain technology or media. Privacy concerns emerge about AR applications, and research reveals that users express apprehension regarding their privacy while utilizing AR (Harborth and Pape, 2021). The worries revolve around the potential for being recorded by AR devices as bystanders, having personal data shared without consent, and being subjected to surveillance because of using such devices (Dacko, 2017; Rauschnabel et al., 2018). ...
... Privacy concerns have been observed to decrease users' willingness to embrace a technology. Previous studies (Cagiltay et al., 2015; Faqih, 2022; Harborth and Pape, 2021; Oliveira et al., 2014; Rauschnabel et al., 2018; Thongmak, 2019) have confirmed that privacy concerns negatively affect the adoption of AR technologies. Based on the above, the following hypothesis can be presented. ...
Article
Purpose: Due to augmented reality (AR) technology improvements, the retail industry has embraced smart retailing as its primary business model. Organizations must therefore comprehend the intricacies of AR adoption to persuade clients to adopt this revolutionary technology effectively. Thus, the current study proposes and evaluates a comprehensive model that includes the unified theory of acceptance and use of technology (UTAUT2), privacy concerns, physical risks and technological anxiety to predict customers’ intention to use AR apps in the retail industry in the Egyptian context.
Design/methodology/approach: The current study examines 398 responses from Egyptian shoppers using partial least squares structural equation modeling (PLS-SEM). Snowball sampling was employed, and participants were selected using a “self-selection” strategy, in which participants take part voluntarily.
Findings: Consumers’ intentions to use AR apps in retail settings are positively impacted by task-technology fit, performance expectation, effort expectancy, social influence, facilitating conditions and hedonic motivation. Conversely, privacy and physical risks negatively affect customers’ intention to use AR apps in retail. Furthermore, technological anxiety serves as a moderating factor in these connections.
Originality/value: To the best of our knowledge, the current study is the first to test the role of UTAUT2, privacy and physical risks on users' behavioral intentions toward adopting AR apps in retail. It also examines technological anxiety as a moderator in the retail setting.
... The transmission principle covers a wide variety of possible factors (Nissenbaum, 2010). For example, Harborth et al. (2021) manipulated the transmission principle by introducing a permission justification as a way to introduce transparency and provide a valid purpose for data collection. According to CI theory, it is crucial to determine the values of all five parameters to assess the privacy impact of any information flow practice (Nissenbaum, 2004). ...
... CI has been applied in various studies and different contexts (e.g., Apthorpe et al., 2018; Grodzinsky et al., 2011; Norval et al., 2017; Zhang et al., 2022), such as augmented reality apps (Harborth et al., 2021), photo posting online (Hoyle et al., 2020), smart homes (Apthorpe et al., 2018), COVID-19 vaccination certificates (Zhang et al., 2022), and accessing public records (Martin & Nissenbaum, 2016). Apthorpe et al. (2018), who investigated the appropriateness of information flows in smart homes, provided actionable recommendations for device manufacturers based on discovered privacy norms. ...
... Transmission principles in CI refer to the constraints on the flow of information, which can be defined broadly or narrowly. For this study, we decided to consider a purpose factor, following Gilbert et al. (2021) and Harborth et al. (2021). The purposes were related to the secondary use of the collected data. ...
Article
Full-text available
People are increasingly open to sharing personal data collected by wearables, while concerns have emerged about how companies, governments and organisations process this data. This paper applies Nissenbaum’s theory of contextual integrity to explore the perceived appropriateness of information flows linked to wearables. A vignette study was conducted (N = 500) to examine the influence of the type of data shared, its purpose, and the sender on the appropriateness of different wearables’ information flow scenarios. Results revealed a significant impact of information type, sharing purpose, and sender on the perceived appropriateness of data sharing. Notably, data collected for research purposes or to develop new functionalities was deemed most appropriate, while data used for advertising was viewed unfavourably. Further, user-controlled sharing received higher appropriateness ratings. This research underscores the need for meaningful consent in data sharing and suggests that manufacturers of wearable devices should utilise user agency to supplement information flow automation based on societal and contextual privacy norms.
... Adams et al. [2] revealed that VR users' privacy concerns often originate from "always on" sensors and the low reputation of device manufacturers. Harborth and Pape [28] suggested that AR users' privacy concerns are directly driven by perceived permission sensitivity (i.e., perceived sensitivity of the information to which they give permissions), trust in the application, and the general feeling of being a victim of privacy invasion online. Similarly, O'Brien [50] revealed that AR users might be comfortable with data collection depending on perceived data sensitivity and the purpose of use. ...
... Although previous work has explored privacy issues in XR environments, little is known about privacy concerns from the perspectives of XR users. In addition, user-centred XR privacy studies have primarily focused on the requirements of specific XR systems (e.g., AR [28,50,52], VR [2,22,43]), making it necessary to investigate other devices that fall on the Reality-Virtuality Spectrum. Our research lays the groundwork and helps to understand the baseline of user privacy concerns and factors contributing to their comfort with data collection in XR environments. ...
... We aim to determine whether these factors remained influential on user privacy concerns across various XR systems, and we validate experts' assumptions of privacy issues within XR from users' perspectives [1]. Given that privacy norms are context-dependent [49], we used a scenario-based approach inspired by prior work [28,37,48]. As the first attempt to seek essential insights into users' privacy concerns in XR, we narrow our scope by focusing on personal XR systems in single-user scenarios. ...
Conference Paper
Full-text available
Extended Reality (XR) technology is changing online interactions, but its granular data collection sensors may be more invasive to user privacy than web, mobile, and Internet of Things technologies. Despite an increased interest in studying developers' concerns about XR device privacy, user perceptions have rarely been addressed. We surveyed 464 XR users to assess their awareness, concerns, and coping strategies around XR data in 18 scenarios. Our findings demonstrate that many factors, such as data types and sensitivity, affect users' perceptions of privacy in XR. However, users' limited awareness of XR sensors' granular data collection capabilities, such as involuntary body signals of emotional responses, restricted the range of privacy-protective strategies they used. Our results highlight a need to enhance users' awareness of data privacy threats in XR, design privacy-choice interfaces tailored to XR environments, and develop transparent XR data practices.
CCS Concepts: • Human-centered computing → Empirical studies in HCI; • Security and privacy → Human and societal aspects of security and privacy; • Computing methodologies → Perception; Virtual reality; Mixed/augmented reality.
... Last but not least, there is a notable research gap in the existing literature regarding the effects of privacy concerns on the use of MAR apps (Gong et al., 2022). Given the unique characteristics and risks associated with MAR apps, it becomes imperative to investigate how privacy concerns may manifest in the context of these innovative technologies (Harborth and Pape, 2021). ...
... However, it is important to acknowledge that they gather and retain a substantial amount of private user information to cater to individualized user requirements, for instance, users' real-time geolocation, personal identification information, mobile device information, and visual and audio recordings of the user's surroundings. While this data collection is intended to improve app performance, it simultaneously raises significant privacy concerns (Harborth and Pape, 2021). ...
... According to the privacy calculus theory, users' behavior is influenced by a deliberate evaluation of the competing positive and negative beliefs (e.g., benefits and privacy concerns) affecting users' willingness to engage with the technology (Laufer and Wolfe, 1977). Previous studies have explored the privacy calculus as a determinant of individuals' behavioral responses in a variety of contexts, including MAR apps (Harborth and Pape, 2021). However, it is important to note that this particular study focused solely on its impact on users' download intention, rather than examining its influence on continuous use and recommendation intention, which are the primary response factors addressed in our study. ...
Article
Purpose: The application of mobile augmented reality (MAR) for enhancing user experiences and consumer patronizing intention has been the focus of recent MAR literature, yet few studies examine the differences between apps. This study fills the research gap by examining how consumers assess their experiences with different MAR applications and how their decision-making process is performed, particularly in the setting of smartphones.
Design/methodology/approach: A web-based online survey was administered to collect data on consumers' perceptions of two different MAR apps: a utilitarian and a hedonic app. Reliability and validity of the measurement scales, non-response bias, and common method bias were assessed. With the support of the measurement model, partial least squares (PLS) was employed to test the research hypotheses.
Findings: This study reveals that the technological attributes of augmented reality (AR) apps, including interactivity, visual quality, service quality, technicality and aesthetics, have significant effects on consumer perceptions of their utilitarian and hedonic benefits. Moreover, this study shows that consumers of hedonic apps place more importance on their enjoyment of the MAR app, while consumers of utilitarian apps focus more on the accrued functional values. The findings provide practical insights for retailers in AR marketing and application development in the MAR environment.
Originality/value: This study provides a comprehensive viewpoint for analyzing ongoing use and purchase intentions simultaneously in a unified theoretical framework. In addition, it compares different types of MAR apps: hedonic and utilitarian. Furthermore, it is one of the first few studies attempting to provide a comprehensive understanding of the predictive role of MAR technologies by incorporating privacy concerns into a research model based on the uses and gratifications framework.
... As Theodorou et al. [37] note, "What is effectively transparent varies by who the observer is, and what their goals and obligations are." Lack of transparency hampers an individual's ability to make informed decisions about systems in their environment [38], [39] and can lead to a lack of trust in those systems [40]. Transparency with respect to mobile AR applications has been addressed in the context of permission settings, for example, whether to allow or decline object or face recognition [39]. ...
... Lack of transparency hampers an individual's ability to make informed decisions about systems in their environment [38], [39] and can lead to a lack of trust in those systems [40]. Transparency with respect to mobile AR applications has been addressed in the context of permission settings, for example, whether to allow or decline object or face recognition [39]. Existing evidence shows that users' trust in technology increases when they "understand the capabilities of the system, see how well it is performing and forecast future behaviour" [41], [42] from [43]. ...
Article
Full-text available
Augmented reality (AR) glasses are likely to become omnipresent, providing a continuous and ubiquitous experience of computer-mediated reality. This new Pervasive AR will lead to perceptual, acceptance, and ethical issues which are increasingly discussed in the literature. However, given such Pervasive AR prototypes are currently not commercially available, little is known about potential end-users’ input into this discussion. To address this, we developed a Pervasive AR (PAR) prototype serving as a technology probe and conducted an empirical study in a semi-public space involving 54 participants. We collected data from focus groups, questionnaires, and observations of users and bystanders. Extending concerns with existing technology, like smartphones and augmented reality, PAR exposes privacy and security breaches with its unprompted, all-seeing capability, has a higher potential to cause societal fractures and divisions, and raises new questions on information transparency and trust with significant implications for the design of future PAR systems.
... Prior work capturing user attitudes towards privacy has used comfort as a proxy for how participants feel about a given topic [36,46,60], including AR [34,54,86]. Some of this work [34] focused on developing scales that measure how much benefits of using AR outweigh privacy concerns or vice versa. ...
... Prior work capturing user attitudes towards privacy has used comfort as a proxy for how participants feel about a given topic [36,46,60], including AR [34,54,86]. Some of this work [34] focused on developing scales that measure how much benefits of using AR outweigh privacy concerns or vice versa. In contrast, our study uses qualitative methods to elicit more and richer detail than such scales could capture about potential privacy concerns, through the lens of participants' feelings of comfort or acceptability and discomfort or non-acceptability of AR glasses data collection or use. ...
Article
Full-text available
As technology companies develop mass market augmented reality (AR) glasses that are increasingly sensor-laden and affordable, uses of such devices pose potential privacy and security problems. Though prior work has broadly addressed some of these problems, our work specifically addresses the potential data collection of 15 data types by AR glasses and five potential data uses. Via semi-structured interviews, we explored the attitudes and concerns of 21 current AR technology users regarding potential data collection and data use by hypothetical consumer-grade AR glasses. Participants expressed diverse concerns and suggested potential limits to AR data collection and use, evoking privacy concepts and informational norms. We discuss how participants’ attitudes and reservations about data collection and use, like definitions of privacy, are varying and context-dependent, and make recommendations for designers and policy makers, including customizable and multidimensional privacy solutions.
... Sophisticated AR apps have gained popularity and can now run on standard consumer mobile devices [27]. However, compared with traditional mobile apps, mobile AR apps request access to cameras, microphones, and other sensors, making them more vulnerable to potential risks to users' information [28]. Regrettably, a significant number of users are oblivious to these potential dangers [29,30]. ...
... The very existence and adoption of these apps by users highlight the danger inherent in incentive-based mechanisms for information sharing. There is a scarcity of studies that investigate the attitudes and privacy apprehensions of end-users towards AR technologies [28]. Although the limited empirical evidence available suggests that AR raises privacy concerns among users (such as being unintentionally recorded by AR devices as bystanders [38], having their data involuntarily shared, and being subject to surveillance through the use of these devices [39,40]), none of these studies delve into the underlying causes of these concerns. ...
Article
Full-text available
This research studied people’s responses to requests that ask for accessing their personal information when using augmented reality (AR) technology. AR is a new technology that superimposes digital information onto the real world, creating a unique user experience. As such, AR is often associated with the collection and use of personal information, which may lead to significant privacy concerns. To investigate these potential concerns, we adopted an experimental approach and examined people’s actual responses to real-world requests for various types of personal information while using a designated AR application on their personal smartphones. Our results indicate that the majority (57%) of people are willing to share sensitive personal information with an unknown third party without any compensation other than using the application. Moreover, there is variability in the individuals’ willingness to allow access to various kinds of personal information. For example, while 75% of participants were open to granting access to their microphone, only 35% of participants agreed to allow access to their contacts. Lastly, monetary compensation is linked with an increased willingness to share personal information. When no compensation was offered, only 35% of the participants agreed to grant access to their contacts, but when a low compensation was offered, 57.5% of the participants agreed. These findings combine to suggest several practical implications for the development and distribution of AR technologies.
... This issue is especially relevant for AR/VR applications, which gather a broad range of sensitive data, including biometrics and environmental details [28], [30]. As AR/VR technologies grow, ensuring their privacy policies are transparent and compliant with regulations is crucial for user trust [23], [24]. However, these policies are often overly complex and lengthy, discouraging user engagement [28], [29]. ...
Preprint
Full-text available
This paper comprehensively analyzes privacy policies in AR/VR applications, leveraging BERT, a state-of-the-art text classification model, to evaluate the clarity and thoroughness of these policies. By comparing the privacy policies of AR/VR applications with those of free and premium websites, this study provides a broad perspective on the current state of privacy practices within the AR/VR industry. Our findings indicate that AR/VR applications generally offer a higher percentage of positive segments than free content but lower than premium websites. The analysis of highlighted segments and words revealed that AR/VR applications strategically emphasize critical privacy practices and key terms. This enhances privacy policies' clarity and effectiveness.
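The segment-level scoring idea can be illustrated with a runnable sketch. A real pipeline would use a fine-tuned BERT classifier; here a hypothetical keyword heuristic and cue list stand in so the flow of splitting a policy into segments and comparing "positive segment" shares can be shown end to end:

```python
# Hypothetical stand-in for a BERT segment classifier: a keyword heuristic.
# The cue list below is illustrative, not taken from the paper.
POSITIVE_CUES = ("you can opt out", "we do not sell", "delete your data")

def classify_segment(segment: str) -> str:
    """Label a policy segment 'positive' if it contains a privacy-friendly cue."""
    s = segment.lower()
    return "positive" if any(cue in s for cue in POSITIVE_CUES) else "other"

def positive_share(segments):
    """Fraction of policy segments labeled positive; the metric compared across apps/sites."""
    labels = [classify_segment(seg) for seg in segments]
    return labels.count("positive") / len(labels)

policy = [
    "We do not sell your personal information.",
    "Data may be shared with advertising partners.",
    "You can opt out of analytics at any time.",
]
print(positive_share(policy))  # 2 of 3 segments are positive, about 0.67
```

Swapping `classify_segment` for a trained model call leaves the comparison logic unchanged, which is the relevant point for reproducing the paper's app-versus-website comparison.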
... AR systems must scan and map the surrounding scenario and identify and track objects and actions to identify users' movements and interactions with the 3D surroundings. When scanning the environment, such applications may capture sensitive information of the user or bystanders [15]. For instance, head-mounted cameras may inadvertently capture sensitive information displayed on screens, while mobile device cameras might record facial images of bystanders without their consent. ...
Chapter
Augmented Reality applications overlay our physical world with digital components in an interactive 3D space. These applications generally capture information about the physical world around the user through cameras and sensors, which can identify user movements and interactions with objects in the real world. In recent years, Location-Based Augmented Reality Games (LBARGs) have been used in several contexts, such as entertainment, tourism, and education. However, by capturing information about the environment, AR applications can lead to failures in maintaining user and bystander privacy. This paper addresses the identification and protection of sensitive data in LBARGs. We introduce LootAR, a location-based mobile AR game, and the SafeARUnity library, a real-time image processing middleware that acts as a layer between the AR application and the device’s camera, identifying and sanitizing sensitive data prior to rendering. Implementation aspects are discussed, involving Unity Sentis, a toolkit for running machine learning models in Unity, and YOLO, a fast single-stage object detector optimized for real-time applications. We also demonstrate the integration of SafeARUnity in mobile games, using LootAR as a case study.
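The sanitize-before-render idea described above can be sketched independently of Unity: detected sensitive regions are redacted in the camera frame before the AR layer receives it. The nested-list frame and the bounding-box format below are simplified stand-ins for a real frame buffer and a YOLO-style detector output:

```python
# Minimal sketch of a sanitizing layer between camera and AR renderer.
# Frames are row-major pixel grids; detections are (x, y, w, h) boxes,
# a hypothetical stand-in for real object-detector output.

def redact(frame, boxes, fill=0):
    """Overwrite each detected bounding box with a fill value before rendering."""
    for x, y, w, h in boxes:
        for row in frame[y:y + h]:
            row[x:x + w] = [fill] * len(row[x:x + w])
    return frame

frame = [[255] * 6 for _ in range(4)]   # stand-in for a 4x6 camera frame
detections = [(1, 1, 2, 2)]            # e.g., one detected face region
sanitized = redact(frame, detections)
print(sanitized[1])  # [255, 0, 0, 255, 255, 255]
```

In a production pipeline the same step would blur rather than zero the pixels and run on the GPU, but the architectural point is identical: the application only ever sees the sanitized frame.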
... However, trust is also emphasised as a signal of people's positive attitudes towards a technology and of their intention to use it (Kaur and Arora, 2020). In addition, Harborth and Pape (2021) argued that trust in AR technology is mostly influenced by pre-existing attitudes as well as institutional and environmental variables. ...
Article
Purpose With an emphasis on the moderating impact of trust, this study examines determinants influencing the purchase intentions of young consumers in augmented reality (AR) shopping platforms. This research study aims to pinpoint essential elements, including the enjoyment dimensions (i.e. entertainment, visual appeal and hedonic component) and practicality dimensions [i.e. informativeness, navigation and perceived usefulness (PU)], that are imperative in influencing young consumers’ purchase intentions in AR-based shopping platforms. Design/methodology/approach The present study used a quantitative approach grounded in the stimulus-organism-response model and the extended technology acceptance model, which analysed purchase intention among the youth using AR shopping platforms. One hundred seventy-two samples were gathered through self-administered questionnaires and underwent partial least squares structural equation modelling analysis to predict the relationships between the proposed variables. Findings The results of the current study suggested the independent variables (e.g. entertainment, visual appeal, informativeness and navigation) had a significant impact on hedonic components and PU. Furthermore, both hedonic components and PU had a significant and positive influence on purchase intention. Nevertheless, it is noteworthy that hedonic components and PU in relation to purchase intention were not moderated by trust. Originality/value The developed research framework is significant for understanding the perceptions of shopping behaviour among young consumers in the Borneo region of Malaysia. This is one of the few studies that explored the interplay between enjoyment and practicality’s dimensions on purchase intention via AR shopping platforms in the less explored region of Malaysia. Hence, this study plays a pivotal role in contributing to the existing marketing and technology management literature. 
Moreover, it holds practical importance for business operators and marketers as it aids in decision-making and strategic planning for the future direction of businesses in the young consumer market.
... There is great excitement about using AR technology in marketing; however, there is a little concern about the privacy issues. This study proposes to explore, how privacy issues of consumers affect the relationship between the perceived usefulness of AR and brand experience (Harborth & Pape, 2021;Rauschnabel et al. 2018). Therefore, this study aims to address the issue of limited empirical evidence for the moderating effect of consumer privacy concerns as a contingency factor for the relationship between the perceived usefulness of AR and brand experience. ...
Conference Paper
Full-text available
This paper examines the impact of social customer relationship management (SCRM) and customer engagement on profitability in the hospitality industry amidst the COVID-19 crisis. It highlights the need for hoteliers to retain employees and maintain service assurance. The study identifies hedonic value as predominant in hospitality services but notes a potential customer shift to lower-tier service providers due to reduced incomes. It explores the introduction of new value propositions focusing on customised, sustainable experiences. The research, involving surveys of hotel managers, employees, and customers in Mauritius, finds that brand loyalty and positive word of mouth (PWOM) are significantly influenced by customer factors, while management factors heavily influence customer engagement. The study emphasises a multi-stakeholder approach for effective SCRM implementation.
... Acts of surveillance are a recurring concern in the literature, whether it be player-driven [45] or by the very nature of the technology [40,88]. ...
... Additionally, a second control condition would, at minimum, have required another 50+ participants (2-h sessions each), which was beyond the human (e.g., participant subject pool availability) and financial resources of the current study. Medium or smaller sample sizes are a common limitation of lab-based educational technology research, which also tends to use either a single control group or no control group (Cutumisu & Lou, 2020; Dever et al., 2021; Dietrich et al., 2021; Geden et al., 2021; Harborth & Pape, 2021; Lajoie et al., 2021; Li et al., 2021; Taub et al., 2018). In summary, while our sample size was appropriate for the analyses we conducted, adding a second control condition would have greatly limited the statistical power of our models and yielded findings of little practical or scientific use. ...
Article
Minority history education can support perspective-taking which is linked to decreasing stereotypes and prejudice. A pre-test post-test randomized control trial study with 114 pre-service teachers was conducted to examine the role of queer history instruction to improve learners' self-reported perspective-taking toward LGBTQ+ minorities and knowledge of queer history. Participants in the Edmonton Queer History App (vs. control) condition learned significantly more and reported higher levels of perspective-taking towards both sexual orientation (SO) and gender identity (GI) minority members. Mediation analysis showed that learning outcome explained the effect of the app condition on the increase of perspective-taking towards SO (but not GI).
... Older established models, such as the global information privacy concern (GIPC) and the concern for information privacy (CFIP) (Smith et al., 1996) scale focus on the offline or organizational context and are therefore not transferable to this work. The APCO model by Smith et al. (2011) studies antecedents, privacy concerns and outcomes in the digital world and has been applied and adapted to differing contexts, such as augmented reality (Harborth & Pape, 2021) or CBDC . ...
Article
Full-text available
Central Bank Digital Currencies (CBDC) are being researched in academia and piloted by central banks around the world. Initial research highlights the importance of privacy concerns on adoption intention in CBDC. We took one step further and investigated the link between privacy concerns and adoption using the Chinese CBDC, the digitalized version of the Yuan, the e-CNY. We integrated and applied the established Antecedent Privacy Concerns and Outcomes (APCO) model with the Task-Technology Fit model in a quantitative online questionnaire with 682 Chinese participants to study the influence of privacy concerns on CBDC usage. The data was analyzed using partial least squares structural equation modeling (PLS-SEM) to identify significant path coefficients and effects in the developed model. The findings demonstrated that several antecedents significantly influenced privacy concerns, which in turn influenced e-CNY usage. In particular, perceived vulnerabilities impacted privacy concerns, while soft and hard trust factors were found to impact neither concerns nor usage. When compared to prior research, the distinction between intention to use and usage of CBDC, under consideration of privacy concerns, seemed to be negligible. The often discussed 'privacy paradox' could not be observed for CBDC. Observed differences in antecedents and other factors may have been due to cultural, political, and demographic factors, as well as different CBDC design choices. For practitioners, the results further emphasized the need for a privacy-friendly implementation of retail CBDC, one that efficiently communicates user benefits while rebutting perceived vulnerabilities.
... The results of this work may not directly be transferable to other countries, citizens and CBDC, as this study was performed with solely German-speaking individuals to control for the differences in CBDC design and cultural factors. Defining privacy norms is not an easy task and often relies on assumptions and simplifications (Harborth & Pape, 2021). Actual information flows could therefore differ from the possibilities surveyed in this work in terms of recipients and data transferred. ...
Conference Paper
Full-text available
Central Bank Digital Currency (CBDC) are a rapidly evolving payment technology, with privacy being a crucial factor. Research on privacy in CBDC is limited and focuses mainly on technical considerations and its link to adoption intention. This paper presents a first step towards understanding privacy norms in digital euro transactions for German citizens. The study employs a large-scale questionnaire, based on contextual integrity theory, to investigate acceptable flows of information and privacy parameters for CBDC and other digital payment methods. We conduct a pretest with 127 respondents, followed by a main study with 1064 respondents to measure and compare acceptability of various information flows. The results reveal the importance of (un)acceptable recipients of transaction- and identity-related information and the influence of different transmission principles. The findings can be used by central banks and policymakers to design and implement CBDC that corresponds to individuals' privacy norms. https://scholarspace.manoa.hawaii.edu/server/api/core/bitstreams/8a2c8566-1616-494b-93c1-0356cfb94e14/content
... But so far it had not been applied to a PET such as an anonymization service. There is a major difference between PETs and other services, i.e., apps [30,35,53] or games [24,33], regarding the application of the IUIPC instrument: the other services have a certain use for their customers (a primary use), and users' privacy concerns were investigated with respect to the use of that service. ...
Chapter
Full-text available
This chapter provides information about acceptance factors of privacy-enhancing technologies (PETs) based on our research why users are using Tor and JonDonym, respectively. For that purpose, we surveyed 124 Tor users (Harborth and Pape 2020) and 142 JonDonym users (Harborth and Pape 2020) and did a quantitative evaluation (PLS-SEM) on different user acceptance factors. We investigated trust in the PET and perceived anonymity (Harborth et al. 2021; Harborth et al. 2020; Harborth and Pape 2018), privacy concerns, and risk and trust beliefs (Harborth and Pape 2019) based on Internet Users Information Privacy Concerns (IUIPC) and privacy literacy (Harborth and Pape 2020). The result was that trust in the PET seems to be the major driver. Furthermore, we investigated the users’ willingness to pay or donate for/to the service (Harborth et al. 2019). In this case, risk propensity and the frequency of perceived improper invasions of users’ privacy were relevant factors besides trust in the PET. While these results were new in terms of the application of acceptance factors to PETs, none of the identified factors was surprising. To identify new factors and learn about differences in users’ perceptions between the two PETs, we also did a qualitative analysis of the questions if users have any concerns about using the PET, when they would be willing to pay or donate, which features they would like to have and why they would (not) recommend the PET (Harborth et al. 2021; Harborth et al. 2020). To also investigate the perspective of companies, we additionally interviewed 12 experts and managers dealing with privacy and PETs in their daily business and identified incentives and hindrances to implement PETs from a business perspective (Harborth et al. 2018).
... As privacy norms and privacy concerns can differ substantially between countries and cultures, the findings obtained here through interviews and surveys with mainly German citizens may not be applicable to other countries or CBDC. Indeed, the problem of defining norms for specific research contexts, such as digital payments, is well known and simplifying assumptions are often made in empirical research on CI to overcome this issue [14]. In comparison to other digital payment methods, it should be noted that digital euro transactions could potentially include more information types that could be transferred to similar recipients with central banks as additional recipients. ...
Conference Paper
Privacy is regarded as a crucial factor in the development of Central Bank Digital Currency (CBDC), particularly for the digital euro in Europe. Currently, research on privacy in CBDC is scarce and focuses largely on its technical implementation or its influence on technology adoption. This work aims to act as a first step towards uncovering privacy norms in digital euro transactions for German citizens. To this end, we investigate privacy parameters and acceptable flows of information for digital euro transactions using an exploratory mixed-method approach based on contextual integrity theory. The privacy parameters, derived through the analysis of 21 qualitative interviews of experts and non-experts, are used to measure acceptability of various information flows in digital euro transactions for 129 respondents in a first quantitative evaluation. The results demonstrate the importance of acceptable and unacceptable recipients of transaction- and identity-related information as well as different transmission principles. The contributions of this work, the creation of a contextual integrity framework and the evaluation of first privacy norms in digital euro transactions, can be used by central banks and policy makers to design and implement CBDC that does not violate individuals' privacy norms. CCS CONCEPTS • Applied computing → Digital cash; • Security and privacy → Social aspects of security and privacy.
... In addition to star ratings (Molina, 2019), quantitative social proof cues have been measured through number of downloads (Harborth & Pape, 2021;Klump et al., 2020;Roethke et al., 2020), app reviews (Kim & Gambino, 2016), engagement metrics (Xu, 2013) and percentage of users who made a particular choice (Sundar et al., 2020). Research has consistently found a positive relationship between the presence of quantitative social proof and the behavior of interest (e.g., Shengli & Fang, 2019). ...
Article
Background Despite extensive research into technology users’ privacy concerns, a critical gap remains in understanding why individuals adopt different standards for data protection across contexts. The rise of advanced technologies such as the Internet of Things (IoT), artificial intelligence (AI), augmented reality (AR), and big data has created rapidly evolving and complex privacy landscapes. However, privacy is often treated as a static construct, failing to reflect the fluid, context-dependent nature of user concerns. This oversimplification has led to fragmented research, inconsistent findings, and limited capacity to address the nuanced challenges posed by these technologies. Understanding these dynamics is especially crucial in fields such as digital health and informatics, where sensitive data and user trust are central to adoption and ethical innovation. Objective This study synthesized existing research on privacy behaviors in emerging technologies, focusing on IoT, AI, AR, and big data. Its primary objectives were to identify the psychological antecedents, outcomes, and theoretical frameworks explaining privacy behavior, and to assess whether insights from traditional online privacy literature, such as e-commerce and social networking, apply to these advanced technologies. It also advocates a context-dependent approach to understanding privacy. Methods A systematic review of 179 studies synthesized psychological antecedents, outcomes, and theoretical frameworks related to privacy behaviors in emerging technologies. Following established guidelines and using leading research databases such as ScienceDirect (Elsevier), SAGE, and EBSCO, studies were screened for relevance to privacy behaviors, focus on emerging technologies, and empirical grounding. Methodological details were analyzed to assess the applicability of traditional privacy findings from e-commerce and social networking to today’s advanced technologies. 
Results The systematic review revealed key gaps in the privacy literature on emerging technologies, such as IoT, AI, AR, and big data. Contextual factors, such as data sensitivity, recipient transparency, and transmission principles, were often overlooked, despite their critical role in shaping privacy concerns and behaviors. The findings also showed that theories developed for traditional technologies often fall short in addressing the complexities of modern contexts. By synthesizing psychological antecedents, behavioral outcomes, and theoretical frameworks, this study underscores the need for a context-contingent approach to privacy research. Conclusions This study advances understanding of user privacy by emphasizing the critical role of context in data sharing, particularly amid ubiquitous and emerging health technologies. The findings challenge static views of privacy and highlight the need for tailored frameworks that reflect dynamic, context-dependent behaviors. Practical implications include guiding health care providers, policy makers, and technology developers toward context-sensitive strategies that build trust, enhance data protection, and support ethical digital health innovation. Trial Registration PROSPERO CRD420251037954; https://www.crd.york.ac.uk/PROSPERO/view/CRD420251037954
Article
Purpose The purpose of the study is to have an understanding about the impact of augmented reality (AR) on user experience in case of a makeup app. This article tries to explore how personalisation, an AR process, impacts the various aspects of user experience (pragmatic quality, hedonic quality by stimulation, hedonic quality by identification and attractiveness). This study also evaluates the moderating role of privacy concern on the relationship of personalisation and user experience. Methodology This research empirically analyses data from an experiment conducted in a controlled lab setting with 200 valid responses from users of a makeup app, which incorporates AR technology. SPSS and SmartPLS4 were used for the analysis. Findings The results show that personalisation significantly impacts the user experience, particularly in terms of enhancing pragmatic quality. However, the results did not show a moderating effect of privacy concerns on the relationship of personalisation and user experience. Implications This research offers marketers a foundation on leveraging AR technology in enhancing the app experience. It also contributes to the AR literature by understanding the interplay of personalisation, privacy concern and user experience. Originality/Value This study examines how personalisation in AR distinctly shapes the user experience. It also addresses the contemporary dilemma of privacy concerns, investigating whether marketers should prioritise enhancing personalisation or exercise caution to uphold user privacy. While previous studies related to AR and user experience have been conducted in Western contexts, this study is unique in its kind in India.
Article
Purpose Blockchain technology has been labeled as the most disruptive technological innovation of the current decade due to its impact on almost every major industry. Based on privacy calculus theory and prior adoption literature on emerging technologies, this research investigates the impact of blockchain technology in the consumer technology segment. It elaborated on the mechanism through which blockchain technology influences users’ willingness to share information with technology products enabled by blockchain. Design/methodology/approach Taking a heterogeneous pool of users, this study conducted multiple experiments with the application of blockchain (vs. regular database) technology to high (vs. low) sensitive data to study the impact of blockchain perception on users’ information-sharing tendencies. Findings Through a mediated moderation analysis, the result shows that the use of blockchain technology enhances the sense of security among users. However, the impact of this heightened sense of security only develops a higher willingness to share information when the data is highly sensitive. Practical implications The research reflects on the perception of blockchain technology and the leading impact on willingness to share information with firms. This could be a critical criterion for determining investment in blockchain technologies for consumer products, particularly based on the sensitivity of the data the consumer is sharing. Originality/value This research focuses on the perception of blockchain technology among consumers and its impact on consumers’ decision-making related to their data sharing. People have a higher sense of safety when it comes to blockchain-enabled products. However, we find that it would not be the same for all contexts, and the sensitivity of the data collected would have an impact on this relationship and consumers’ data-sharing decisions.
Chapter
Full-text available
This chapter presents a research perspective that explores the transformative impact of blockchain technology on Behavioral and Experimental Economics. It addresses critical digital challenges such as subject identity verification and privacy, trust in researchers, and the design of experimental incentives. By advocating for a blockchain-integrated framework, the chapter aims to enhance data authenticity, privacy, and incentivization through decentralized mechanisms and smart contracts, thereby ensuring research that is transparent, tamper-proof, and practical. Additionally, the chapter proposes a paradigm shift toward a “play to learn” model, which bridges decentralized science with the realm of gaming finance to advance research and development. This integration signals a new era of interdisciplinary research, offering profound insights into human behavior within the digital economy and illuminating new research pathways that connect Web2 to Web3 environments.
Conference Paper
Full-text available
Smart home cameras (SHCs) offer convenience and security to users, but also cause greater privacy concerns than other sensors due to constant collection and processing of sensitive data. Moreover, privacy perceptions may differ between primary users and other users at home. To address these issues, we developed three physical cover prototypes for SHCs: Manual, Hybrid, and Automatic, based on design criteria of observability, understandability, and tangibility. With 90 SHC users, we ran an online survey using video vignettes of the prototypes. We evaluated how the physical covers alleviated privacy concerns by measuring perceived creepiness and trustworthiness. Our results show that the physical covers were well received, even though primary SHC users valued always-on surveillance. We advocate for the integration of physical covers into future SHCs, emphasizing their potential to establish a shared understanding of surveillance status. Additionally, we provide design recommendations to support this proposition.
Article
Full-text available
Purpose: The integration of Marketing Technology (MarTech) in Mobile Banking (MB) apps gains recognition in marketing automation, previous research lacks a comprehensive framework for understanding customer behavior. This study addresses this gap by proposing a new model within Financial Technology (FinTech), incorporating customer characteristics. Design/methodology/approach: The Integrated MarTech Usage Behavior Model (IMTUBM) triangulates three theories to explore the MarTech landscape in a longitudinal survey of 400 MB app users in Sri Lanka, utilizing Partial Least Squares Structural Equation Modeling (PLS-SEM) with Smart-PLS software. Findings: The resultant Integrated MB App Usage Behavior Model (IMBUBM) provides a foundational understanding of customer characteristics in the MarTech domain. Notably, this study conceptualizes awareness, elucidating that experiential aspects are shaped by both previous and pre-experience. Originality: This study introduces the concept of current pre-experience as a moderator in the MarTech landscape within FinTech, arguing for its deeper exploration compared to previous experience. Implications: These findings not only suggest avenues for future research in MarTech but also provide managerial insights, encouraging refinement of strategies based on heightened customer awareness. Additionally, the study emphasizes the importance of current pre-experience in bridging the gap between customer intention and behavior in MarTech usage.
Article
Purpose The purpose of this study is to consolidate the fragmented research on augmented reality (AR) as a marketing tool and provide a comprehensive understanding of its possible marketing applications. Design/methodology/approach The study conducted a systematic review and bibliometric analysis of 103 papers on AR-marketing to identify the most prevalent topics and conceptual frameworks. Performance analysis and science mapping were utilized to examine the key marketing domains influenced by AR. Findings The analysis revealed that AR has had the biggest impact on marketing domains such as consumer acceptability, customer interactivity, retail, and destination marketing. Practical implications The results of this study provide organizations with insights into the current state of AR-marketing, enabling them to successfully use AR to improve their marketing strategies. Furthermore, the study highlights potential areas for further research and development in AR for marketing. Originality/value This research offers a valuable, comprehensive overview of AR’s role in marketing by systematically reviewing and analyzing the existing literature. The findings open doors for organizations and researchers to explore AR’s potential applications in marketing strategies and future research opportunities.
Article
Full-text available
This comprehensive systematic study aimed to review the contemporary influential factors in augmented reality (AR) applications. The selection of relevant articles was conducted following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines, with databases, such as IEEE Xplore, Scopus, Wiley Online Library, Emerald Insight, and ScienceDirect, being utilized. A total of 69 articles were analyzed, resulting in the identification and analysis of 207 unique factors. The findings revealed that in 2018, AR researchers were particularly interested in passion and competition, while in 2019, their focus shifted toward awareness, expectations, and technology. The year 2020 highlighted the prominence of the visual component compared to other factors. Notably, in 2021, the factor of experience gained significant popularity amidst the widespread impact of the COVID-19 pandemic. The analysis of existing theoretical models demonstrated that the combination of enjoyment, interactivity, system quality, escapism, innovativeness, and facilitating conditions yielded the most advantageous outcomes. Additionally, the study identified 23 association rules that underscored the relationships between the investigated factors. Hypothesis testing confirmed 489 hypotheses, while 133 assumptions were disproven. Enjoyment emerged as a reliable independent variable for various outcomes, including satisfaction, attitude, and utility. Moreover, usability, engagement, satisfaction, and flow were found to be potential indicators of enjoyment. Satisfaction, frequently employed as an independent variable, was predicted and validated by factors, such as benefits, education, entertainment, visual appeal, quality, and satisfaction itself. This study provides valuable insights for scholars as a basis for further research in the field of augmented reality or as a reference to validate their own findings. Link: https://rdcu.be/dlKDK
Chapter
Full-text available
One way to reduce privacy risks for consumers when using the internet is to inform them better about the privacy practices they will encounter. Tailored privacy information provision could outperform the current practice where information system providers do not much more than posting unwieldy privacy notices. Paradoxically, this would require additional collection of data about consumers’ privacy preferences—which constitute themselves sensitive information so that sharing them may expose consumers to additional privacy risks. This chapter presents insights on how this paradoxical interplay can be outmaneuvered. We discuss different approaches for privacy preference elicitation, the data required, and how to best protect the sensitive data inevitably to be shared with technical privacy-preserving mechanisms. The key takeaway of this chapter is that we should put more thought into what we are building and using our systems for to allow for privacy through human-centered design instead of static, predefined solutions which do not meet consumer needs.
Chapter
Full-text available
Mobile computing devices have become ubiquitous; however, they are prone to observation and reconstruction attacks. In particular, shoulder surfing, where an adversary observes another user’s interaction without prior consent, remains a significant unresolved problem. In the past, researchers have primarily focused their research on making authentication more robust against shoulder surfing—with less emphasis on understanding the attacker or their behavior. Nonetheless, understanding these attacks is crucial for protecting smartphone users’ privacy. This chapter aims to bring more attention to research that promotes a deeper understanding of shoulder surfing attacks. While shoulder surfing attacks are difficult to study under natural conditions, researchers have proposed different approaches to overcome this challenge. We compare and discuss these approaches and extract lessons learned. Furthermore, we discuss different mitigation strategies of shoulder surfing attacks and cover algorithmic detection of attacks and proposed threat models as well. Finally, we conclude with an outlook of potential next steps for shoulder surfing research.
Chapter
Full-text available
Users should always play a central role in the development of (software) solutions. The human-centered design (HCD) process in the ISO 9241-210 standard proposes a procedure for systematically involving users. However, due to its abstraction level, the HCD process provides little guidance for how it should be implemented in practice. In this chapter, we propose three concrete practical methods that enable the reader to develop usable security and privacy (USP) solutions using the HCD process. This chapter equips the reader with the procedural knowledge and recommendations to: (1) derive mental models with regard to security and privacy, (2) analyze USP needs and privacy-related requirements, and (3) collect user characteristics on privacy and structure them by user group profiles and into privacy personas. Together, these approaches help to design measures for a user-friendly implementation of security and privacy measures based on a firm understanding of the key stakeholders.
Chapter
Full-text available
A variety of methods and techniques are used in usable privacy and security (UPS) to study users’ experiences and behaviors. When applying empirical methods, researchers in UPS face specific challenges, for instance, to represent risk to research participants. This chapter provides an overview of the empirical research methods used in UPS and highlights associated opportunities and challenges. This chapter also draws attention to important ethical considerations in UPS research with human participants and highlights possible biases in study design.
Article
The SARS-CoV-2 pandemic is a pressing societal issue today. The German government promotes a contact tracing app named Corona-Warn-App (CWA), which aims to change citizens' health behaviors during the pandemic by raising awareness of potential infections and enabling the tracking of infection chains. Technical implementations, citizens' perceptions, and public debates around such apps differ between countries; in Germany, for example, there has been extensive discussion of the app's potential privacy issues. We therefore analyze the effects of privacy concerns regarding the CWA, perceived CWA benefits, and trust in the German healthcare system to answer why citizens use the CWA. In our initial conference publication at ICT Systems Security and Privacy Protection - 37th IFIP TC 11 International Conference, SEC 2022, we used a sample of 1,752 actual users and non-users of the CWA and found support for the privacy calculus theory, i.e., individuals weigh privacy concerns against benefits in their use decision. Thus, citizens' privacy perceptions about health technologies (e.g., shaped by public debates) are crucial, as they can hinder adoption and negatively affect future fights against pandemics. In this special issue, we extend our previous work by conducting a second survey 10 months after our initial study with the same pool of participants (830 participants from the first study participated in the second survey). The goal of this longitudinal study is to assess changes in the perceptions of users and non-users over time and to evaluate the influence of the significantly lower hospitalization and death rates observed during the second survey on use behavior. Our results show that the privacy calculus is relatively stable over time. The only relationship that changes significantly over time is the effect of privacy concerns on use behavior, which decreases significantly, i.e., privacy concerns have a weaker negative effect on CWA use, indicating that they played a less important role in the use decision at this later point in the pandemic. We contribute to the literature by presenting one of the rare longitudinal analyses focusing on the privacy calculus and changes over time in the relevant constructs as well as in the relationships between the calculus constructs and target variables (in our case, use behavior of a contact tracing app). We can see that the explanatory power of the privacy calculus model is relatively stable over time even when strong externalities might affect individual perceptions related to the model.
Article
Full-text available
"Paid" digital services have been touted as straightforward alternatives to the ostensibly "free" model, in which users actually face a high price in the form of personal data, with limited awareness of the real cost incurred and little ability to manage their privacy preferences. Yet, the actual privacy behavior of paid services, and consumer expectations about that behavior, remain largely unknown. This Article addresses that gap. It presents empirical data both comparing the true cost of "paid" services as compared to their so-called "free" counterparts, and documenting consumer expectations about the relative behaviors of each. We first present an empirical study that documents and compares the privacy behaviors of 5,877 Android apps that are offered both as free and paid versions. The sophisticated analysis tool we employed, AppCensus, allowed us to detect exactly which sensitive user data is accessed by each app and with whom it is shared. Our results show that paid apps often share the same implementation characteristics and resulting behaviors as their free counterparts. Thus, if users opt to pay for apps to avoid privacy costs, in many instances they do not receive the benefit of the bargain. Worse, we find that there are no obvious cues that consumers can use to determine when the paid version of a free app offers better privacy protections than its free counterpart. We complement this data with a second study: we surveyed 1,000 Android mobile app users as to their perceptions of the privacy behaviors of paid and free app versions. Participants indicated that consumers are more likely to expect the paid version to engage in privacy-protective practices, to demonstrate transparency with regard to its data collection and sharing behaviors, and to offer more granular control over the collection of user data in that context. 
Together, these studies identify ways in which the actual behavior of apps fails to comport with users' expectations, and the way that representations of an app as "paid" or "ad-free" can mislead users. They also raise questions about the salience of those expectations for consumer choices. In light of this combined research, we then explore three sets of ramifications for policy and practice. First, our findings that paid services often conduct equally extensive levels of data collection and sale as free ones challenge understandings about how the "pay for privacy" model operates in practice, its promise as a privacy-protective alternative, and the legality of paid app behavior. Second, our findings offer important insights for legal approaches to privacy protection, undermining the legitimacy of legal regimes relying on fictive "notice" and "consent" that do not reflect user understandings as bases for the collection, sale, and processing of information. They fortify demands for a privacy law that focuses on vindicating actual consumer expectations and prohibiting practices that exploit them, and strengthen the argument for ex ante regulation of exploitative data practices where consumers are offered no opportunity for meaningful choice or consent. Third, our work provides technical tools for offering transparency about app behaviors, empowering consumers and regulators, law enforcement, consumer protections organizations, and private parties seeking to remedy undesirable or illegal privacy behavior in the most dominant example of a free vs. paid market-mobile apps-where there turns out to be no real privacy-protective option.
Conference Paper
Full-text available
Augmented reality (AR) has greatly diffused into the public consciousness in recent years, especially due to the success of mobile applications like Pokémon Go. However, only few people have experienced other forms of augmented reality, such as head-mounted displays (HMDs). Thus, people have only limited actual experience with AR and form attitudes and perceptions towards this technology only partially based on actual use experiences, but mainly based on hearsay and the narratives of others, such as the media or friends. This makes it highly difficult for developers and product managers of AR solutions to address the needs of potential users. Therefore, we disentangle the perceptions of individuals with a focus on their concerns about AR. Perceived concerns are an important factor for the acceptance of new technologies. We address this research topic based on twelve intensive interviews with laymen as well as AR experts and analyze them with a qualitative research method.
Conference Paper
Full-text available
Augmented reality (AR) has gained much public attention since the success of Pokémon Go in 2016. Technology companies like Apple or Google are currently focusing primarily on mobile AR (MAR) technologies, i.e., applications on mobile devices like smartphones or tablets. Associated privacy issues have to be investigated early to foster market adoption. This is especially relevant since past research found several threats associated with the use of smartphone applications. Thus, we investigate two of the main privacy risks for MAR application users based on a sample of 19 of the most downloaded MAR applications for Android. First, we assess threats arising from bad privacy policies based on a machine-learning approach. Second, we investigate which smartphone data resources are accessed by the MAR applications. Third, we combine both approaches to evaluate whether privacy policies cover certain data accesses or not. We provide theoretical and practical implications and recommendations based on our results.
Conference Paper
Full-text available
Companies are experiencing more and more pressure to increase productivity and quality while cutting costs in the digital era. The integration of innovative new technologies into the work process is crucial when transforming businesses to cope with these increasing requirements. In this research article, we investigate the current integration of one such innovative technology, augmented reality (AR), in the manufacturing industry. For that purpose, we conduct a systematic literature review as well as a practically oriented search for augmented reality use cases in the field of manufacturing. We contribute to the current literature on augmented reality and digital transformation by analysing and synthesizing 95 articles and use cases to identify the current and potential future role of augmented reality in the manufacturing industry and its impact on different work processes. We show that theoretical proof-of-concept articles mostly focus on improving production operations, especially assembly processes, while the majority of practical use cases of currently applied AR solutions involve maintenance and inspection processes. Based on these findings, relevant future work opportunities for researchers as well as practitioners are derived.
Article
Full-text available
Privacy decision making has been examined in the literature from alternative perspectives. A dominant “normative” perspective has focused on rational processes by which consumers with stable preferences for privacy weigh the expected benefits of privacy choices against their potential costs. More recently, a behavioral perspective has leveraged theories from decision research to construe privacy decision making as a process in which cognitive heuristics and biases predictably occur. In a series of experiments, we compare the predictive power of these two perspectives by evaluating the impact of changes in the objective risk of disclosure and the impact of changes in the relative perceptions of risk of disclosure on both hypothetical and actual consumer privacy choices. We find that both relative and objective risks can, in fact, influence consumer privacy decisions. However, and surprisingly, the impact of objective changes in risk diminishes between hypothetical and actual choice settings. Vice versa, the impact of relative risk becomes more pronounced going from hypothetical to actual choice settings. Our results suggest a way to integrate diverse streams of the information systems literature on privacy decision making: in hypothetical choice contexts, relative to actual choice contexts, consumers may both overestimate their response to normative factors and underestimate their response to behavioral factors.
Article
Full-text available
Mixed reality (MR) technology is now gaining ground due to advances in computer vision, sensor fusion, and realistic display technologies. While most research and development is focused on delivering the promise of MR, only a few efforts address the privacy and security implications of this technology. This survey paper aims to bring these risks to light and to review the latest security and privacy work on MR. Specifically, we list and review the different protection approaches that have been proposed to ensure user and data security and privacy in MR. We extend the scope to include work on related technologies such as augmented reality (AR), virtual reality (VR), and human-computer interaction (HCI) as crucial components, if not the origins, of MR, as well as work from the larger area of mobile devices, wearables, and the Internet of Things (IoT). We highlight the lack of investigation, implementation, and evaluation of data protection approaches in MR. Further challenges and directions for MR security and privacy are also discussed.
Book
Full-text available
Publisher's blurb: The classic on research methods, completely revised, didactically improved, and more up to date than ever! This book is a well-grounded and reliable companion for students, researchers, and professionals, covering everything: Fundamentals: philosophy of science, quality criteria, and ethical aspects. Application: all phases of the research process, from defining the research topic, study design, and operationalization, through sampling and methods of data collection and analysis, to the presentation of results. Advanced topics: effect sizes, meta-analyses, structural equation models, and evaluation research. The 5th edition has been thoroughly revised: Clarity: improved structure of the individual chapters and of the book as a whole. Currency: sections on online methods, mixed-methods designs, and other recent developments. Learner-friendliness: many figures, tables, definition boxes, cartoons, exercises, and learning quizzes with solutions. Practical relevance: real study examples from various social and human science disciplines (e.g., psychology, communication science, educational science, medicine, sociology). A companion website offers learning tools for students and materials for instructors: http://lehrbuch-psychologie.springer.com/forschungsmethoden-und-evaluation-den-sozial-und-humanwissenschaften
Article
Full-text available
It is commonplace for those who support less restrictive privacy regulation on the collection and use of personal information to point out a paradox: in survey after survey, respondents express deep concern for privacy, oppose growing surveillance and data practices, and object to online tracking and behavioral advertising. Yet when confronted with actual choices involving the capture or exchange of information, few people demonstrate restraint: we sign up for frequent flyer and frequent buyer programs; we are carefree in our use of social networks and mobile apps; and we blithely hop from one website to the next, filling out forms, providing feedback, and contributing ratings. Privacy skeptics suggest that actions should be considered a truer indicator than words. Even if people are honest in their positive valuation of privacy in surveys, in action and behavior they reveal even greater valuation of those benefits that might come at a privacy cost. In other words, people care about privacy, but not that much. The inconsistencies between survey responses and observed behaviors that skeptics gleefully observe require a nuanced interpretation, one that we have offered through our studies. We argue that the disconnect between actions and survey findings arises not because people do not care about privacy, but because individuals' actions are finely modulated to contextual variables. Questions in surveys that do not explicitly include such important contextual variables are highly ambiguous. A more nuanced view of privacy is able to explain away a great deal of what skeptics claim is a divergence of behavior from stated preference and opinion. People care about and value privacy, with privacy defined as respecting the appropriate norms of information flow for a given context. When respondents are given a chance to offer more fine-grained judgments about specific information-sharing situations, these judgments are quite nuanced.
This is problematic since public policy relies on survey measurements of privacy concerns, such as Alan Westin's categorization of individuals as privacy 'pragmatists' or 'unconcerned', to drive privacy regulations. Specifically, Westin's categories give credence to the regulation of privacy based on the Fair Information Practice Principles (FIPPs), which rely heavily on assuring individuals notice and choice. We examine two historically influential measurements of privacy that have shaped discussion about public views and sentiments as well as practices, regulations, and policies: (1) surveys of individuals' ratings of 'sensitive' information and (2) Alan Westin's privacy categorization of individuals as fundamentalists, pragmatists, and unconcerned. In addition to replicating key components of these two survey streams, we used a factorial vignette survey to identify important contextual elements driving privacy expectations. A sample of 569 respondents rated how well a series of vignettes, in which the contextual elements of data recipient and data use had been systematically varied, met their privacy expectations. We find, first, that how well sensitive information meets privacy expectations is highly dependent on these contextual elements. Second, Westin's privacy categories proved relatively unimportant in relation to contextual elements in privacy judgments. Even privacy 'unconcerned' respondents rated the vignettes as not meeting privacy expectations on average, and respondents across categories had a common vision of what constitutes a privacy violation. This study has important implications for public policy and research. For public policy, these results suggest that relying on one dimension, whether sensitive information or Westin's privacy categorization of respondents, is limiting. In particular, focusing on differences in privacy expectations across consumers obscures the common vision of what constitutes appropriate use of information.
This paper has significant public policy implications for the reliance on consumer choice as a necessary approach to accommodate consumer variance: our results suggest consumers agree as to the inappropriate use of information. Our study has called privacy concepts into question by showing that 'sensitivity' of information and 'concern' about privacy are not stable in the face of confounding variables: privacy categories and sensitivity labels prove to be highly influenced by the context and use of the situation. Our work demonstrates the importance of teasing out confounding variables in these historically influential studies.
Article
Full-text available
Provides a nontechnical introduction to the partial least squares (PLS) approach. As a logical base for comparison, the PLS approach for structural path estimation is contrasted with the covariance-based approach. A set of considerations is then provided with the goal of helping the reader understand the conditions under which it might be reasonable, or even more appropriate, to employ this technique. The chapter builds up from various simple two-latent-variable models to a more complex one. The formal PLS model is provided along with a discussion of the properties of its estimates. An empirical example is provided as a basis for highlighting the various analytic considerations when using PLS and the set of tests one can employ in assessing the validity of a PLS-based model.
Article
Full-text available
Although mobile apps are already an influential medium in the new media industry as a whole, these apps have received little academic attention within the communication and marketing literature. This study develops and tests a hypothesized model to explain antecedents affecting app usage among smartphone users. The analysis of the structural equation model determined a final model with four significant factors (perceived informative and entertaining usefulness, perceived ease of use, and user review). Cost-effectiveness, a key variable of this study due to the particularity of the 99-cent app price, had no influence on app usage. This study not only includes marketing implications but also offers insight into various theoretical applications to the field of mobile communication research by suggesting a conceptual model for the acceptance of mobile apps.
Article
Full-text available
This research investigates the influence of core self-evaluations (CSE), stickiness, positive emotion, and trust on smartphone users' intentions to download free social media apps. An online questionnaire was used to collect data, and 477 valid questionnaires were collected. The outcomes show that CSE and smartphone users' stickiness significantly influence their positive emotion. Compared with CSE, stickiness plays the key role in affecting users' emotion. Smartphone users' emotions are found to positively influence their trust, which in turn positively influences their intentions to download free social media apps. The findings provide insights into how an app developer can improve users' emotions and their associated behaviours.
Conference Paper
Full-text available
In a series of experiments, we examined how the timing impacts the salience of smartphone app privacy notices. In a web survey and a field experiment, we isolated different timing conditions for displaying privacy notices: in the app store, when an app is started, during app use, and after app use. Participants installed and played a history quiz app, either virtually or on their phone. After a distraction or delay they were asked to recall the privacy notice's content. Recall was used as a proxy for the attention paid to and salience of the notice. Showing the notice during app use significantly increased recall rates over showing it in the app store. In a follow-up web survey, we tested alternative app store notices, which improved recall but did not perform as well as notices shown during app use. The results suggest that even if a notice contains information users care about, it is unlikely to be recalled if only shown in the app store.
Article
Full-text available
Trust plays an important role in many Information Systems (IS)-enabled situations. Most IS research employs trust as a measure of interpersonal or person-to-firm relations, such as trust in a Web vendor or a virtual team member. Although trust in other people is important, this paper suggests that trust in the information technology (IT) itself also plays a role in shaping IT-related beliefs and behavior. To advance trust and technology research, this paper presents a set of trust in technology construct definitions and measures. We also empirically examine these construct measures using tests of convergent, discriminant, and nomological validity. This study contributes to the literature by providing: a) a framework that differentiates trust in technology from trust in people, b) a theory-based set of definitions necessary for investigating different kinds of trust in technology, and c) validated trust in technology measures useful to research and practice.
Article
Full-text available
Due to the amount of data that smartphone applications can potentially access, platforms enforce permission systems that allow users to regulate how applications access protected resources. If users are asked to make security decisions too frequently and in benign situations, they may become habituated and approve all future requests without regard for the consequences. If they are asked to make too few security decisions, they may become concerned that the platform is revealing too much sensitive information. To explore this tradeoff, we instrumented the Android platform to collect data regarding how often and under what circumstances smartphone applications access protected resources regulated by permissions. We performed a 36-person field study to explore the notion of "contextual integrity," that is, how often applications access protected resources when users are not expecting it. Based on our collection of 27 million data points and exit interviews with participants, we examine the situations in which users would like the ability to deny applications access to protected resources. We found that at least 80% of our participants would have preferred to prevent at least one permission request, and overall, they thought that over a third of requests were invasive and desired a mechanism to block them.
Conference Paper
Full-text available
The utility that modern smartphone technology provides to individuals is most often enabled by technical capabilities that are privacy-affecting by nature; i.e., smartphone apps are granted access to a multiplicity of sensitive resources required to implement context sensitivity or personalization. Due to the ineffectiveness of current privacy risk communication methods applied in smartphone ecosystems, individuals' risk assessments are biased and accompanied by uncertainty regarding the potential privacy-related consequences of long-term app usage. Warning theory suggests that an explicit communication of potential consequences can reduce uncertainty and enable individuals to make better-informed cost-benefit trade-off decisions. We extend this design theory to the field of information privacy warning design by experimentally investigating the effects of explicitness in privacy warnings on individuals' perceived risk and trustworthiness of smartphone apps. Our results suggest that explicitness leads to more accurate risk and trust perceptions and provides an improved foundation for informed decision-making.
Article
Full-text available
In Apple's iOS 6, when an app requires access to a protected resource (e.g., location or photos), the user is prompted with a permission request that she can allow or deny. These permission request dialogs include space for developers to optionally include strings of text to explain to the user why access to the resource is needed. We examine how app developers are using this mechanism and the effect that it has on user behavior. Through an online survey of 772 smartphone users, we show that permission requests that include explanations are significantly more likely to be approved. At the same time, our analysis of 4,400 iOS apps shows that the adoption rate of this feature by developers is relatively small: Around 19% of permission requests include developer-specified explanations. Finally, we surveyed 30 iOS developers to better understand why they do or do not use this feature.
Article
Full-text available
We describe a qualitative study investigating the acceptability of the Google Glass eyewear computer to people with Parkinson's disease (PD). We held a workshop with 5 PD patients and 2 carers exploring perceptions of Glass. This was followed by 5-day field trials of Glass with 4 PD patients, where participants wore the device during everyday activities at home and in public. We report generally positive responses to Glass as a device to instil confidence and safety for this potentially vulnerable group. We also raise concerns related to the potential for Glass to reaffirm dependency on others and stigmatise wearers.
Article
Full-text available
Discriminant validity assessment has become a generally accepted prerequisite for analyzing relationships between latent variables. For variance-based structural equation modeling, such as partial least squares, the Fornell-Larcker criterion and the examination of cross-loadings are the dominant approaches for evaluating discriminant validity. By means of a simulation study, we show that these approaches do not reliably detect the lack of discriminant validity in common research situations. We therefore propose an alternative approach, based on the multitrait-multimethod matrix, to assess discriminant validity: the heterotrait-monotrait ratio of correlations. We demonstrate its superior performance by means of a Monte Carlo simulation study, in which we compare the new approach to the Fornell-Larcker criterion and the assessment of (partial) cross-loadings. Finally, we provide guidelines on how to handle discriminant validity issues in variance-based structural equation modeling.
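The heterotrait-monotrait (HTMT) ratio described above can be computed directly from an item correlation matrix: the mean correlation between the items of two different constructs, divided by the geometric mean of the average within-construct item correlations. The sketch below is our own illustration of that definition, not code from the paper; the function name and the example data are assumptions.

```python
import numpy as np

def htmt(corr, items_a, items_b):
    """Heterotrait-monotrait ratio of correlations for two constructs.

    corr: full item correlation matrix; items_a, items_b: index lists of
    the items measuring construct A and construct B, respectively.
    """
    corr = np.abs(np.asarray(corr, dtype=float))
    # heterotrait-heteromethod: mean correlation across the two item blocks
    hetero = corr[np.ix_(items_a, items_b)].mean()

    def mono(items):
        # monotrait-heteromethod: mean off-diagonal correlation within a block
        block = corr[np.ix_(items, items)]
        return block[np.triu_indices_from(block, k=1)].mean()

    return hetero / np.sqrt(mono(items_a) * mono(items_b))

# Hypothetical example: two constructs measured by two items each,
# within-construct correlations 0.8, cross-construct correlations 0.4.
corr = [[1.0, 0.8, 0.4, 0.4],
        [0.8, 1.0, 0.4, 0.4],
        [0.4, 0.4, 1.0, 0.8],
        [0.4, 0.4, 0.8, 1.0]]
print(htmt(corr, [0, 1], [2, 3]))  # 0.4 / sqrt(0.8 * 0.8) = 0.5
```

Commonly cited rules of thumb treat HTMT values above roughly 0.85 (or 0.90 for conceptually similar constructs) as indicating a potential lack of discriminant validity.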
Conference Paper
Full-text available
Smartphones have unprecedented access to sensitive personal information. While users report having privacy concerns, they may not actively consider privacy while downloading apps from smartphone application marketplaces. Currently, Android users have only the Android permissions display, which appears after they have selected an app to download, to help them understand how applications access their information. We investigate how permissions and privacy could play a more active role in app-selection decisions. We designed a short "Privacy Facts" display, which we tested in a 20-participant lab study and a 366-participant online experiment. We found that by bringing privacy information to the user when they were making the decision and by presenting it in a clearer fashion, we could assist users in choosing applications that request fewer permissions.
Conference Paper
Full-text available
Smartphone security research has produced many useful tools to analyze the privacy-related behaviors of mobile apps. However, these automated tools cannot assess people's perceptions of whether a given action is legitimate, or how that action makes them feel with respect to privacy. For example, automated tools might detect that a blackjack game and a map app both use one's location information, but people would likely view the map's use of that data as more legitimate than the game's. Our work introduces a new model for privacy, namely privacy as expectations. We report on the results of using crowdsourcing to capture users' expectations of what sensitive resources mobile apps use. We also report on a new privacy summary interface that prioritizes and highlights places where mobile apps break people's expectations. We conclude with a discussion of implications for employing crowdsourcing as a privacy evaluation technique.
Article
Full-text available
This study seeks to clarify the nature of control in the context of information privacy to generate insights into the effects of different privacy assurance approaches on context-specific concerns for information privacy. We theorize that such effects are exhibited through mediation by perceived control over personal information and develop arguments in support of the interaction effects involving different privacy assurance approaches (individual self-protection, industry self-regulation, and government legislation). We test the research model in the context of location-based services using data obtained from 178 individuals in Singapore. In general, the results support our core assertion that perceived control over personal information is a key factor affecting context-specific concerns for information privacy. In addition to enhancing our theoretical understanding of the link between control and privacy concerns, these findings have important implications for service providers and consumers as well as for regulatory bodies and technology developers.
Article
Full-text available
The use of mobile applications continues to experience exponential growth. Using mobile apps typically requires the disclosure of location data, which often accompanies requests for various other forms of private information. Existing research on information privacy has implied that consumers are willing to accept privacy risks for relatively negligible benefits, and the offerings of mobile apps based on location-based services (LBS) appear to be no different. However, until now, researchers have struggled to replicate realistic privacy risks within experimental methodologies designed to manipulate independent variables. Moreover, minimal research has successfully captured actual information disclosure over mobile devices based on realistic risk perceptions. The purpose of this study is to propose and test a more realistic experimental methodology designed to replicate real perceptions of privacy risk and capture the effects of actual information disclosure decisions. As with prior research, this study employs a theoretical lens based on privacy calculus. However, we draw more detailed and valid conclusions due to our use of improved methodological rigor. We report the results of a controlled experiment involving consumers (n=1025) in a range of ages, levels of education, and employment experience. Based on our methodology, we find that only a weak, albeit significant, relationship exists between information disclosure intentions and actual disclosure. In addition, this relationship is heavily moderated by the consumer practice of disclosing false data. We conclude by discussing the contributions of our methodology and the possibilities for extending it for additional mobile privacy research.
Article
Full-text available
We test the hypothesis that increasing individuals’ perceived control over the release and access of private information—even information that allows them to be personally identified––will increase their willingness to disclose sensitive information. If their willingness to divulge increases sufficiently, such an increase in control can, paradoxically, end up leaving them more vulnerable. Our findings highlight how, if people respond in a sufficiently offsetting fashion, technologies designed to protect them can end up exacerbating the risks they face.
Article
Full-text available
Online users often need to make adoption decisions without accurate information about product values. An informational cascade occurs when it is optimal for an online user, having observed others' actions, to follow the adoption decision of the preceding individual without regard to his own information. Informational cascades are often rational for individual decision making; however, they may lead to the adoption of inferior products. With easy availability of information about other users' choices, the Internet offers an ideal environment for informational cascades. In this paper, we empirically examine informational cascades in the context of online software adoption. We find that user behavior in adopting software products is consistent with the predictions of the informational cascades literature. Our results demonstrate that online users' choices of software products exhibit distinct jumps and drops with changes in download ranking, as predicted by informational cascades theory. Furthermore, we find that user reviews have no impact on user adoption of the most popular product, while having an increasingly positive impact on the adoption of lower-ranking products. The phenomenon persists after controlling for alternative explanations such as network effects, word-of-mouth effects, and product diffusion. Our results validate informational cascades as an important driver of decision making on the Internet. The finding also offers an explanation for the mixed results reported in prior studies with regard to the influence of online user reviews on product sales. We show that the mixed results could be due to the moderating effect of informational cascades.
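The cascade mechanism described in this abstract can be illustrated with a stylized simulation (a simplified, hypothetical sketch of the classic sequential-adoption model, not the paper's empirical method): each agent receives a noisy private signal about a product and observes all earlier adoption decisions; once the public lead of adoptions over rejections can outweigh any single private signal, agents rationally ignore their own signal and a cascade locks in.

```python
import random

def simulate_cascade(n_agents=50, signal_accuracy=0.7, true_value=1, seed=42):
    """Stylized informational cascade: each agent gets a noisy private
    signal, observes all prior adoption decisions, and follows the public
    lead once it is large enough to outweigh any single private signal."""
    rng = random.Random(seed)
    decisions = []
    for _ in range(n_agents):
        # Private signal is correct with probability `signal_accuracy`.
        signal = true_value if rng.random() < signal_accuracy else 1 - true_value
        adopt_votes = sum(decisions)                 # observed prior adoptions
        reject_votes = len(decisions) - adopt_votes  # observed prior rejections
        lead = adopt_votes - reject_votes
        if lead > 1:
            decision = 1       # cascade: adopt regardless of own signal
        elif lead < -1:
            decision = 0       # cascade: reject regardless of own signal
        else:
            decision = signal  # otherwise follow the private signal
        decisions.append(decision)
    return decisions

decisions = simulate_cascade()
# Once the adoption lead exceeds one, every later agent ignores their own
# signal, so an early run of identical choices can fix the outcome.
```

Note the key property this sketch makes visible: after the lead passes the threshold, later decisions stop revealing private information, which is why cascades can settle on an inferior product.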
Article
Full-text available
Information Security (InfoSec) research is far reaching and includes many approaches to deal with protecting and mitigating threats to the information assets and technical resources available within computer-based systems. Although a predominant weakness in properly securing information assets is the individual user within an organization, much of the focus of extant security research is on technical issues. The purpose of this paper is to highlight future directions for Behavioral InfoSec research, which is a newer, growing area of research. The ensuing paper presents information about challenges currently faced and future directions that Behavioral InfoSec researchers should explore. These areas include separating insider deviant behavior from insider misbehavior, approaches to understanding hackers, improving information security compliance, cross-cultural Behavioral InfoSec research, and data collection and measurement issues in Behavioral InfoSec research.
Article
Full-text available
Structural equation modeling (SEM) has become a quasi-standard in marketing and management research when it comes to analyzing the cause-effect relations between latent constructs. For most researchers, SEM is equivalent to carrying out covariance-based SEM (CB-SEM). While marketing researchers have a basic understanding of CB-SEM, most of them are only barely familiar with the other useful approach to SEM-partial least squares SEM (PLS-SEM). The current paper reviews PLS-SEM and its algorithm, and provides an overview of when it can be most appropriately applied, indicating its potential and limitations for future research. The authors conclude that PLS-SEM path modeling, if appropriately applied, is indeed a "silver bullet" for estimating causal models in many theoretical models and empirical data situations.
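As a companion to this overview, the following is a minimal didactic sketch of the iterative PLS-SEM estimation loop for a single structural path between two latent constructs (Mode A outer estimation, centroid inner weighting), written in plain NumPy. All names and the two-block setup are illustrative simplifications, not the algorithm as implemented in full tools such as SmartPLS.

```python
import numpy as np

def _std(a):
    # Column-wise standardization (zero mean, unit variance).
    return (a - a.mean(axis=0)) / a.std(axis=0)

def pls_two_blocks(X1, X2, max_iter=300, tol=1e-6):
    """Toy PLS path model xi1 -> xi2 with Mode A outer estimation and a
    centroid inner weighting scheme. X1, X2 are (n, k) indicator matrices."""
    X1, X2 = _std(X1), _std(X2)
    n = X1.shape[0]
    w1, w2 = np.ones(X1.shape[1]), np.ones(X2.shape[1])
    for _ in range(max_iter):
        # Outer estimation: construct scores as weighted indicator sums.
        y1, y2 = _std(X1 @ w1), _std(X2 @ w2)
        # Inner estimation (centroid scheme): each construct's inner proxy is
        # its neighbor's score, weighted by the sign of their correlation.
        e = np.sign(np.corrcoef(y1, y2)[0, 1])
        z1, z2 = e * y2, e * y1
        # Outer update (Mode A): indicator-proxy covariances become weights.
        w1_new, w2_new = X1.T @ z1 / n, X2.T @ z2 / n
        converged = max(np.abs(w1_new - w1).max(),
                        np.abs(w2_new - w2).max()) < tol
        w1, w2 = w1_new, w2_new
        if converged:
            break
    y1, y2 = _std(X1 @ w1), _std(X2 @ w2)
    # With a single structural path, the OLS path coefficient reduces to
    # the correlation between the two construct scores.
    beta = float(np.corrcoef(y1, y2)[0, 1])
    return w1, w2, beta
```

The sketch shows the alternation the abstract alludes to: outer weights produce construct scores, the inner scheme produces proxies from neighboring constructs, and the proxies feed back into new outer weights until convergence.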
Article
Recent privacy-related incidents of mobile services have shown that app stores and providers face the challenge of mobile users' information privacy concerns, which can prevent users from installing mobile apps or induce them to uninstall an app. In this paper, we investigate the role of app permission requests and compare their impact on privacy concerns with other antecedents of information privacy concerns, i.e., prior privacy experience, computer anxiety, and perceived control. To test these effects empirically, we conducted an online survey with 775 participants. Results of our structural equation modeling show that prior privacy experience, computer anxiety, and perceived control have significant effects on privacy concerns. However, concerns about app permission requests have approximately twice as much predictive value as the other factors put together in explaining mobile users' overall information privacy concerns. We expect that our findings can provide a theoretical contribution for future mobile privacy research as well as practical implications for app stores and providers.
Article
We shed light on a money-for-privacy trade-off in the market for smartphone applications (“apps”). Developers offer their apps at lower prices in return for greater access to personal information, and consumers choose between low prices and more privacy. We provide evidence for this pattern using data from 300,000 apps obtained from the Google Play Store (formerly Android Market) in 2012 and 2014. Our findings show that the market’s supply and demand sides both consider an app’s ability to collect private information, measured by the app’s use of privacy-sensitive permissions: (1) cheaper apps use more privacy-sensitive permissions; (2) given price and functionality, demand is lower for apps with sensitive permissions; and (3) the strength of this relationship depends on contextual factors, such as the targeted user group, the app’s previous success, and its category. Our results are robust and consistent across several robustness checks, including the use of panel data, a difference-in-differences analysis, “twin” pairs of apps, and various measures of privacy-sensitivity and app demand. This paper was accepted by Anandhi Bharadwaj, information systems.
Chapter
We investigate privacy concerns and the privacy behavior of users of the AR smartphone game Pokémon Go. Pokémon Go accesses several functionalities of the smartphone and, in turn, collects a plethora of data about its users. To assess privacy concerns, we conduct an online study in Germany with 683 users of the game. The results indicate that the majority of the active players are concerned about the privacy practices of companies. This result hints at the existence of cognitive dissonance, i.e., the privacy paradox. Since this result is common in the privacy literature, we complement the first study with a second one with 199 users, which assesses which measures users undertake to protect their privacy. The results are highly mixed and depend on the measure, i.e., relatively many participants use privacy-preserving measures when interacting with their smartphone. This implies that many users know about the risks and might take actions to protect their privacy, but deliberately trade off their information privacy for the utility generated by playing the game.
Article
Please use my homepage to get access to this article: http://vous-etes-ici.net/wp-content/uploads/2018/04/BenthalletalTrends.pdf
Article
We shed light on a money-for-privacy trade-off in the market for smartphone applications ("apps"). Developers offer their apps at lower prices in return for greater access to personal information, and consumers choose between lower prices and more privacy. We provide evidence for this pattern using data on 300,000 mobile applications that were obtained from the Android Market in 2012 and 2014. We augmented these data with information from Alexa.com and Amazon Mechanical Turk. Our findings show that both the supply and demand sides of the market consider an app's ability to collect private information, measured by its use of privacy-sensitive permissions: (1) cheaper apps use more privacy-sensitive permissions; (2) installation numbers are lower for apps with sensitive permissions; (3) circumstantial factors, such as the reputation of app developers, mitigate the strength of this relationship. Our results emerge consistently across several robustness checks, including the use of panel data analysis, the use of selected matched "twin" pairs of apps, and the use of various alternative measures of privacy-sensitivity.
Article
A Primer on Partial Least Squares Structural Equation Modeling (PLS-SEM), by Hair, Hult, Ringle, and Sarstedt, provides a concise yet very practical guide to understanding and using PLS structural equation modeling (PLS-SEM). PLS-SEM is evolving as a statistical modeling technique and its use has increased exponentially in recent years within a variety of disciplines, due to the recognition that PLS-SEM’s distinctive methodological features make it a viable alternative to the more popular covariance-based SEM approach. This text includes extensive examples on SmartPLS software, and is accompanied by multiple data sets that are available for download from the accompanying website (www.pls-sem.com).
Article
In the mobile age, protecting users' information from privacy-invasive apps becomes increasingly critical. To forewarn users about possible privacy risks, a few Android app stores prominently disclose app permission requests on app download pages. Focusing on this emerging practice, this study investigates the effects of contextual cues (perceived permission sensitivity, permission justification, and perceived app popularity) on Android users' privacy concerns and download intention, as well as the contingency of these effects on users' mobile privacy victim experience. Drawing on the Elaboration Likelihood Model, our empirical results suggest that perceived permission sensitivity makes users more concerned about privacy, while permission justification and perceived app popularity make them less concerned. Interestingly, users' mobile privacy victim experience negatively moderates the effect of permission justification. In particular, the provision of permission justification makes users less concerned about their privacy only for those with less mobile privacy victim experience. Results also reveal a positive effect of perceived app popularity and a negative effect of privacy concerns on download intention. This study provides a better understanding of Android users' information processing and the formation of their privacy concerns in the app download stage, and proposes and tests emerging privacy protection mechanisms, including the prominent disclosure of app permission requests and the provision of permission justifications.
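The moderation result summarized above (victim experience weakening the concern-reducing effect of justification) is the kind of relationship typically tested with an interaction term in a regression. The following self-contained sketch, with entirely hypothetical variable names and simulated data rather than the study's actual analysis, shows the mechanics of estimating such a moderated effect.

```python
import numpy as np

def moderated_ols(justification, victim_exp, concern):
    """Estimate concern ~ b0 + b1*justification + b2*victim_exp
    + b3*(justification * victim_exp) by ordinary least squares.
    The interaction coefficient b3 captures the moderation: how the
    effect of justification changes with victim experience."""
    X = np.column_stack([
        np.ones_like(justification, dtype=float),
        justification,
        victim_exp,
        justification * victim_exp,   # the moderation (interaction) term
    ])
    coef, *_ = np.linalg.lstsq(X, concern, rcond=None)
    return coef

# Simulated illustration: justification lowers concern (b1 < 0), but the
# benefit shrinks as victim experience rises (b3 > 0 pulls the net effect
# of justification back toward zero).
rng = np.random.default_rng(1)
n = 2000
justification = rng.integers(0, 2, n).astype(float)   # shown vs. not shown
victim_exp = rng.uniform(0.0, 1.0, n)                  # prior victim experience
concern = (5.0 - 1.0 * justification + 0.5 * victim_exp
           + 0.8 * justification * victim_exp + 0.2 * rng.standard_normal(n))
b = moderated_ols(justification, victim_exp, concern)
```

On this simulated data the recovered coefficients approximate the generating values, and the positive interaction coefficient mirrors the pattern the abstract reports: justification reduces concern mainly for users with little victim experience.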
Article
Retail settings are being challenged to become smarter and provide greater value to both consumers and retailers. An increasingly recognised approach having potential for enabling smart retail is mobile augmented reality (MAR) apps. In this research, we seek to describe and discover how, why and to what extent MAR apps contribute to smart retail settings by creating additional value to customers as well as benefiting retailers. In particular, by adopting a retail customer experience perspective on value creation, analysing the content of MAR shopping apps currently available, and conducting large-scale surveys on United States smartphone users representing early technology adopters, we assess level of use, experiential benefits offered, and retail consequences. Our findings suggest that take-up is set to go mainstream as user satisfaction is relatively high and their use provides systematic experiential benefits along with advantages to retailers. Despite some drawbacks, their use is positively associated with multiple retail consequences. MAR apps are seen as changing consumer behaviour and are associated with increasingly high user valuations of retailers offering them. Implications for more effective use to enable smart retail settings are discussed.
Article
Modern smartphone platforms offer a multitude of useful features to their users, but at the same time they are highly privacy-affecting. However, smartphone platforms are not effective in properly communicating privacy risks to their users. Furthermore, common privacy risk communication approaches in smartphone app ecosystems do not consider the actual data-access behavior of individual apps in their risk assessments. Beyond privacy risks such as the leakage of individual pieces of information (first-order privacy risks), we argue that privacy risk assessments and risk communication should also consider threats to user privacy coming from user-profiling and data-mining capabilities based on the long-term data-access behavior of apps (second-order privacy risks). In this paper, we introduce Styx, a novel privacy risk communication system for Android that provides users with privacy risk information based on the second-order privacy risk perspective. We discuss results from an experimental evaluation of Styx regarding its effectiveness in risk communication and its effects on user perceptions such as privacy concerns and the trustworthiness of a smartphone. Our results suggest that the privacy risk information provided by Styx improves the comprehensibility of privacy risk information and helps users compare different apps regarding their privacy properties. The results further suggest that improved privacy risk communication on smartphones can increase trust towards a smartphone and reduce privacy concerns.
Article
This Review summarizes and draws connections between diverse streams of empirical research on privacy behavior. We use three themes to connect insights from social and behavioral sciences: people's uncertainty about the consequences of privacy-related behaviors and their own preferences over those consequences; the context-dependence of people's concern, or lack thereof, about privacy; and the degree to which privacy concerns are malleable—manipulable by commercial and governmental interests. Organizing our discussion by these themes, we offer observations concerning the role of public policy in the protection of privacy in the information age. Copyright © 2015, American Association for the Advancement of Science.
Article
This paper studies Facebook users' learning-based attitude formation and the relationship between member attitude and self-disclosure. Through the theoretical lens of learning theories, we recognize the key antecedents to member attitude toward a social networking site as stemming from classical conditioning, operant conditioning, and social learning-related factors. In addition, we explore the underlying process through which member attitude affects self-disclosure extent and theorize the mediating role of site usage rate in the relationship between attitude and self-disclosure extent. Analysis of 822 survey responses provides strong support for the role of learning theories in explaining Facebook members' attitude development. The results also confirm a significant, partial mediating effect of site usage rate. A series of post-hoc analyses on gender differences further reveals that attitude formation mechanisms remain constant between male and female Facebook users; gender differences exist in the association between attitude and self-disclosure extent and in the association between site usage rate and self-disclosure extent; and the mediating effect of site usage rate exists in the male user group only. Our research, therefore, contributes to the literature on social networking sites, as well as providing behavioral analysis useful to the service providers of these sites.
Article
Augmented reality (AR) devices are poised to enter the market. It is unclear how the properties of these devices will affect individuals' privacy. In this study, we investigate the privacy perspectives of individuals when they are bystanders around AR devices. We conducted 12 field sessions in cafés and interviewed 31 bystanders regarding their reactions to a co-located AR device. Participants were predominantly split between having indifferent and negative reactions to the device. Participants who expressed that AR devices change the bystander experience attributed this difference to subtleness, ease of recording, and the technology's lack of prevalence. Additionally, participants surfaced a variety of factors that make recording more or less acceptable, including what they are doing when the recording is being taken. Participants expressed interest in being asked permission before being recorded and in recording-blocking devices. We use the interview results to guide an exploration of design directions for privacy-mediating technologies.
Conference Paper
Perceptual, "context-aware" applications that observe their environment and interact with users via cameras and other sensors are becoming ubiquitous on personal computers, mobile phones, gaming platforms, household robots, and augmented-reality devices. This raises new privacy risks. We describe the design and implementation of DARKLY, a practical privacy protection system for the increasingly common scenario where an untrusted, third-party perceptual application is running on a trusted device. DARKLY is integrated with OpenCV, a popular computer vision library used by such applications to access visual inputs. It deploys multiple privacy protection mechanisms, including access control, algorithmic privacy transforms, and user audit. We evaluate DARKLY on 20 perceptual applications that perform diverse tasks such as image recognition, object tracking, security surveillance, and face detection. These applications run on DARKLY unmodified or with very few modifications and minimal performance overheads vs. native OpenCV. In most cases, privacy enforcement does not reduce the applications' functionality or accuracy. For the rest, we quantify the tradeoff between privacy and utility and demonstrate that utility remains acceptable even with strong privacy protection.
Article
This paper aims to predict consumer acceptance of e-commerce by proposing a set of key drivers for engaging consumers in on-line transactions. The primary constructs for capturing consumer acceptance of e-commerce are intention to transact and on-line transaction behavior. Following the theory of reasoned action (TRA) as applied to a technology-driven environment, technology acceptance model (TAM) variables (perceived usefulness and ease of use) are posited as key drivers of e-commerce acceptance. The practical utility of TAM stems from the fact that e-commerce is technology-driven. The proposed model integrates trust and perceived risk, which are incorporated given the implicit uncertainty of the e-commerce environment. The proposed integration of the hypothesized independent variables is justified by placing all the variables under the nomological TRA structure and proposing their interrelationships. The resulting research model is tested using data from two empirical studies. The first, exploratory study comprises three experiential scenarios with 103 students. The second, confirmatory study uses a sample of 155 on-line consumers. Both studies strongly support the e-commerce acceptance model by validating the proposed hypotheses. The paper discusses the implications for e-commerce theory, research, and practice, and makes several suggestions for future research.
Article
Reluctance to provide personal health information could impede the success of web-based healthcare services. This paper focuses on the role of personal dispositions in disclosing health information online. The conceptual model argues that individuals' intention to disclose such information depends on their trust, privacy concern, and information sensitivity, which are determined by personal dispositions—personality traits, information sensitivity, health status, prior privacy invasions, risk beliefs, and experience—acting as intrinsic antecedents of trust. The data (collected via a lab experiment) and the analysis shed light on the role of personal dispositions. This could assist in enhancing healthcare websites and increase the success of online delivery of health services.