Article

Open Materials Discourse: Re-Evaluating Internet Users' Information Privacy Concerns: The Case in Japan

Author affiliations:
  • Continental Automotive Technologies GmbH
  • Capgemini Invent

Abstract

This paper provides the survey materials used to collect the data for the conceptual replication of the Internet Users' Information Privacy Concerns (IUIPC) model by Malhotra et al. (2004). The replication paper (Pape et al., 2020) used awareness, collection, and control as first-order constructs of the second-order IUIPC construct, along with risk beliefs and trusting beliefs from the original paper. Instead of intended behavior, the self-developed construct of willingness to share was used. Altogether, more than 9,000 data points were collected. This paper provides additional materials and details on the participants, as well as the Japanese survey questions together with an English translation for readers unfamiliar with Japanese. We hope that this additional information, and in particular the Japanese questions, provides useful background on our study, allows others to better understand our research, and enables them to reuse the questions.
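To illustrate the model structure described above, the sketch below specifies the second-order IUIPC model in lavaan-style syntax using the Python package semopy. The replication itself was evaluated with PLS-SEM, so this covariance-based specification is only an approximation, and all item names (AW1, CO1, ...) and the file name are hypothetical placeholders rather than the actual survey items.

```python
# A minimal, hypothetical sketch of the second-order IUIPC model
# (Malhotra et al., 2004) in lavaan-style syntax via the semopy package.
# Item names (AW1..WS2) and the CSV file are placeholders, not the actual
# survey instrument; the replication itself was estimated with PLS-SEM.
import pandas as pd
from semopy import Model

MODEL_DESC = """
# first-order measurement model (reflective Likert-scale items)
awareness  =~ AW1 + AW2 + AW3
collection =~ CO1 + CO2 + CO3 + CO4
control    =~ CT1 + CT2 + CT3
trust      =~ TR1 + TR2 + TR3
risk       =~ RI1 + RI2 + RI3
share      =~ WS1 + WS2

# second-order construct
IUIPC =~ awareness + collection + control

# structural paths as in the replicated model
trust ~ IUIPC
risk  ~ IUIPC + trust
share ~ trust + risk
"""

df = pd.read_csv("survey_responses.csv")  # one row per respondent (hypothetical)
model = Model(MODEL_DESC)
model.fit(df)
print(model.inspect())  # loadings, path coefficients, p-values
```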


... The IUIPC construct has been used in various contexts, such as Internet of Things [51], Internet transactions [39], and mobile apps [59]. Furthermore, it has recently been re-evaluated in several studies [54,55]. But so far it had not been applied to a PET such as an anonymization service. ...
Chapter
Full-text available
This chapter provides information about acceptance factors of privacy-enhancing technologies (PETs), based on our research on why users use Tor and JonDonym, respectively. For that purpose, we surveyed 124 Tor users (Harborth and Pape 2020) and 142 JonDonym users (Harborth and Pape 2020) and conducted a quantitative evaluation (PLS-SEM) of different user acceptance factors. We investigated trust in the PET and perceived anonymity (Harborth et al. 2021; Harborth et al. 2020; Harborth and Pape 2018), privacy concerns, and risk and trust beliefs (Harborth and Pape 2019) based on Internet Users' Information Privacy Concerns (IUIPC), as well as privacy literacy (Harborth and Pape 2020). The result was that trust in the PET seems to be the major driver. Furthermore, we investigated the users' willingness to pay for or donate to the service (Harborth et al. 2019). In this case, risk propensity and the frequency of perceived improper invasions of users' privacy were relevant factors besides trust in the PET. While these results were new in terms of the application of acceptance factors to PETs, none of the identified factors was surprising. To identify new factors and learn about differences in users' perceptions between the two PETs, we also conducted a qualitative analysis of the questions of whether users have any concerns about using the PET, when they would be willing to pay or donate, which features they would like to have, and why they would (or would not) recommend the PET (Harborth et al. 2021; Harborth et al. 2020). To investigate the perspective of companies as well, we additionally interviewed 12 experts and managers dealing with privacy and PETs in their daily business and identified incentives for and hindrances to implementing PETs from a business perspective (Harborth et al. 2018).
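As a rough illustration of the kind of quantitative evaluation described above, the following sketch approximates the idea with unweighted composite scores and ordinary least squares. This is not PLS-SEM (which weights indicators iteratively and bootstraps significance), and all column and construct names are hypothetical placeholders.

```python
# A crude, hypothetical approximation of an acceptance-factor analysis.
# Real PLS-SEM weights indicators iteratively and bootstraps significance;
# this sketch simply averages each construct's Likert items into a
# composite score and runs OLS. All column names are placeholders.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("pet_user_survey.csv")  # hypothetical survey export

# composite scores: unweighted means of each construct's items
constructs = {
    "trust_in_pet":    ["TP1", "TP2", "TP3"],
    "perceived_anon":  ["PA1", "PA2", "PA3"],
    "privacy_concern": ["PC1", "PC2", "PC3"],
    "use_intention":   ["UI1", "UI2"],
}
scores = pd.DataFrame({name: df[items].mean(axis=1)
                       for name, items in constructs.items()})

# regress intention to use the PET on the candidate acceptance factors
X = sm.add_constant(scores[["trust_in_pet", "perceived_anon", "privacy_concern"]])
result = sm.OLS(scores["use_intention"], X).fit()
print(result.summary())  # coefficient sizes hint at the main drivers
```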
... That changed with a series of papers investigating reasons for the (non-)adoption of Tor [20] and JonDonym [17]. Based on the construct of internet users' information privacy concerns [42,43], Harborth and Pape found that trust beliefs in the anonymization service played a major role in adoption [18,19]. Further work [21] indicates that the provider's reputation, i.e., trust in the provider, also played a major role in users' willingness to pay for or donate to these services. ...
Article
Full-text available
Users report that they have regretted accidentally sharing personal information on social media. There have been proposals to help protect the privacy of these users by providing tools that analyze text or images and detect personal information or privacy disclosure, with the objective of alerting the user to a privacy risk and transforming the content. However, these proposals rely on having access to users' data, and users have reported that they have privacy concerns about the tools themselves. In this study, we investigate whether these privacy concerns are unique to privacy tools or whether they are comparable to privacy concerns about non-privacy tools that also process personal information. We conduct a user experiment to compare the level of privacy concern towards privacy tools and non-privacy tools for text and image content, qualitatively analyze the reasons for those privacy concerns, and evaluate which assurances are perceived to reduce that concern. The results show privacy tools are at a disadvantage: participants have a higher level of privacy concern about being surveilled by the privacy tools, and the same level of concern about intrusion and secondary use of their personal information compared to non-privacy tools. In addition, the reasons for these concerns and the assurances that are perceived to reduce privacy concern are also similar. We discuss what these results mean for the development of privacy tools that process user content.
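The comparison of concern levels between tool types described above could, for ordinal Likert-type ratings, be tested non-parametrically. The following minimal sketch uses a Mann-Whitney U test on invented example data; it illustrates the general approach, not the study's actual analysis.

```python
# A minimal sketch of comparing reported privacy-concern levels between a
# privacy tool and a non-privacy tool. Likert-type ratings are ordinal, so
# a non-parametric test is a common choice. The ratings below are invented
# for illustration and are not the study's data.
from scipy.stats import mannwhitneyu

# 1-7 Likert ratings of concern about being surveilled by the tool
privacy_tool_concern = [6, 5, 7, 6, 4, 6, 5, 7, 6, 5]
non_privacy_tool_concern = [4, 3, 5, 4, 4, 3, 5, 4, 2, 4]

stat, p = mannwhitneyu(privacy_tool_concern, non_privacy_tool_concern,
                       alternative="two-sided")
print(f"U = {stat}, p = {p:.4f}")  # a small p suggests the levels differ
```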
... For that purpose, we used the IUIPC construct [125,159,160]. The IUIPC construct has been used in various contexts, such as internet of things [136], internet transactions [95], and mobile apps [174], but so far it had not been applied to a PET such as an anonymization service. ...
Thesis
Full-text available
In order to address security and privacy problems in practice, it is very important to have a solid elicitation of requirements before trying to address the problem. In this thesis, specific challenges in the areas of social engineering, security management, and privacy-enhancing technologies are analyzed: Social Engineering: An overview of existing tools usable for social engineering is provided, and defenses against social engineering are analyzed. Serious games are proposed as a more pleasant way to raise employees' awareness and to train them. Security Management: Specific requirements for small and medium-sized energy providers are analyzed, and a set of tools to support them in assessing security risks and improving their security is proposed. Larger enterprises are supported with a method to collect security key performance indicators across subsidiaries and with a risk assessment method for apps on mobile devices. Furthermore, a method to select a secure cloud provider, currently the most popular form of outsourcing, is provided. Privacy Enhancing Technologies: Relevant factors for users' adoption of privacy-enhancing technologies are identified, and economic incentives and hindrances for companies are discussed. Privacy by design is applied to integrate privacy into the use cases of e-commerce and the Internet of Things.
Chapter
Full-text available
One way to reduce privacy risks for consumers when using the internet is to inform them better about the privacy practices they will encounter. Tailored privacy information provision could outperform the current practice, where information system providers do little more than post unwieldy privacy notices. Paradoxically, this would require additional collection of data about consumers' privacy preferences, which themselves constitute sensitive information, so that sharing them may expose consumers to additional privacy risks. This chapter presents insights on how this paradoxical interplay can be outmaneuvered. We discuss different approaches to privacy preference elicitation, the data required, and how best to protect the sensitive data that inevitably has to be shared using technical privacy-preserving mechanisms. The key takeaway of this chapter is that we should put more thought into what we are building and using our systems for, to allow for privacy through human-centered design instead of static, predefined solutions that do not meet consumer needs.
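One classic technical privacy-preserving mechanism that would fit the preference-elicitation setting described above is randomized response, a simple form of local differential privacy. The chapter does not prescribe this particular mechanism, so the following is only an illustrative sketch.

```python
# An illustrative sketch (not the chapter's own proposal) of randomized
# response, a simple local-differential-privacy mechanism: each consumer
# reports a binary privacy preference, but flips it with probability
# 1/(e^eps + 1), so no single report reveals the true preference.
import math
import random

def randomized_response(true_pref: bool, eps: float) -> bool:
    """Report the true preference with probability e^eps / (e^eps + 1)."""
    p_truth = math.exp(eps) / (math.exp(eps) + 1)
    return true_pref if random.random() < p_truth else not true_pref

def estimate_share(reports: list[bool], eps: float) -> float:
    """Unbiased estimate of the true share of 'True' preferences."""
    p = math.exp(eps) / (math.exp(eps) + 1)
    observed = sum(reports) / len(reports)
    return (observed + p - 1) / (2 * p - 1)

# Example: 10,000 consumers, 30% of whom would opt in to tailored notices
random.seed(0)
true_prefs = [random.random() < 0.3 for _ in range(10_000)]
reports = [randomized_response(t, eps=1.0) for t in true_prefs]
print(f"estimated opt-in share: {estimate_share(reports, 1.0):.3f}")
```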
Chapter
Full-text available
Mobile computing devices have become ubiquitous; however, they are prone to observation and reconstruction attacks. In particular, shoulder surfing, where an adversary observes another user's interaction without prior consent, remains a significant unresolved problem. In the past, researchers have primarily focused on making authentication more robust against shoulder surfing, with less emphasis on understanding the attacker or their behavior. Nonetheless, understanding these attacks is crucial for protecting smartphone users' privacy. This chapter aims to bring more attention to research that promotes a deeper understanding of shoulder surfing attacks. While shoulder surfing attacks are difficult to study under natural conditions, researchers have proposed different approaches to overcome this challenge. We compare and discuss these approaches and extract lessons learned. Furthermore, we discuss different mitigation strategies for shoulder surfing attacks and cover algorithmic detection of attacks and proposed threat models as well. Finally, we conclude with an outlook on potential next steps for shoulder surfing research.
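As an illustration of the algorithmic detection mentioned above, a common baseline in the literature is counting the faces visible to the device's front camera and warning when more than one appears. The following toy sketch assumes OpenCV and is not a method proposed in the chapter.

```python
# A toy sketch of one baseline for algorithmic shoulder-surfing detection:
# count faces visible to the front camera and warn when more than one
# appears. This is an illustrative heuristic, not the chapter's method;
# real systems must handle gaze direction, distance, and false positives.
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def shoulder_surfer_suspected(frame) -> bool:
    """Return True if more than one face is visible in the frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    return len(faces) > 1

cap = cv2.VideoCapture(0)  # front/default camera
ok, frame = cap.read()
if ok and shoulder_surfer_suspected(frame):
    print("Warning: a second face is visible, possible shoulder surfer.")
cap.release()
```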
Chapter
Full-text available
Users should always play a central role in the development of (software) solutions. The human-centered design (HCD) process in the ISO 9241-210 standard proposes a procedure for systematically involving users. However, due to its abstraction level, the HCD process provides little guidance on how it should be implemented in practice. In this chapter, we propose three concrete practical methods that enable the reader to develop usable security and privacy (USP) solutions using the HCD process. This chapter equips the reader with the procedural knowledge and recommendations to: (1) derive mental models with regard to security and privacy, (2) analyze USP needs and privacy-related requirements, and (3) collect user characteristics on privacy and structure them by user group profiles and into privacy personas. Together, these approaches help to design user-friendly implementations of security and privacy measures based on a firm understanding of the key stakeholders.
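For the third approach, structuring user characteristics into privacy personas, one plausible operationalization is clustering questionnaire scores. The following sketch is a hypothetical illustration using k-means; the feature names, scale, and number of clusters are assumptions, not the chapter's prescribed procedure.

```python
# A hypothetical sketch of structuring users into privacy personas by
# clustering questionnaire scores (e.g., concern, literacy, willingness
# to share). Feature names, the 1-7 scale, and k=3 are assumptions for
# illustration, not the chapter's prescribed procedure.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# One row per user: [privacy concern, privacy literacy, willingness to share]
scores = np.array([
    [6.5, 2.0, 1.5], [6.0, 2.5, 2.0],   # highly concerned, low literacy
    [5.5, 6.0, 3.0], [6.0, 6.5, 2.5],   # concerned and knowledgeable
    [2.0, 3.0, 6.0], [2.5, 2.5, 6.5],   # unconcerned sharers
])

standardized = StandardScaler().fit_transform(scores)
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(standardized)
for persona_id, center in enumerate(kmeans.cluster_centers_):
    print(f"persona {persona_id}: standardized center {center.round(2)}")
```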
Chapter
Full-text available
A variety of methods and techniques are used in usable privacy and security (UPS) to study users' experiences and behaviors. When applying empirical methods, researchers in UPS face specific challenges, for instance, how to represent risk realistically to research participants. This chapter provides an overview of the empirical research methods used in UPS and highlights the associated opportunities and challenges. It also draws attention to important ethical considerations in UPS research with human participants and highlights possible biases in study design.