Conference Paper

Leech: Let's Expose Evidently bad data Collecting Habits - Towards a Serious Game on Understanding Privacy Policies (Abstract)

Most privacy policies are incomprehensible and largely unreadable. As a consequence, most users do not bother to read them. We propose Leech, a serious game developed in a student project for learning about the contents and structure of privacy policies, so that users gain a rough understanding of what to expect in them. Leech is an adventure game in which the player has to solve quests to complete the game. Two of the tasks are implemented as mini games to allow for more complexity. Two pre-tests led to promising results, and as a next step we intend to quantitatively evaluate the game by investigating players' online privacy literacy, demographics, views on privacy policies, actions within the game, and their in-game experience.


Serious games seem to be a good alternative to traditional training, since they are supposed to be more entertaining and engaging. However, serious games also create specific challenges: they should not only be adapted to specific target groups, but also be capable of addressing recent attacks. Furthermore, evaluating serious games turns out to be challenging. While this already holds for serious games in general, it is even more difficult for serious games on security and privacy awareness: on the one hand, because security and privacy awareness are hard to measure; on the other hand, because both topics currently receive frequent mainstream media coverage, so it must be ensured that a measured change really results from the game session. This paper briefly introduces three serious games to counter social engineering attacks and one serious game to raise privacy awareness. Based on the introduced games, the raised challenges are discussed and partially existing solutions are presented.
Providers of new information and communication technologies collect increasing amounts of personal data, much of which is user generated. Unless use policies are privacy-friendly, this leaves users vulnerable to privacy risks such as exposure through public data visibility or intrusive commercialisation of their data through secondary data use. Due to complex privacy policies, many users of online services unwillingly agree to privacy-intruding practices. To give users more control over their privacy, scholars and regulators have pushed for short, simple, and prominent privacy policies. The premise has been that users will see and comprehend such policies, and then rationally adjust their disclosure behaviour. In this paper, using a social network site as a use case, we show that this premise does not hold. We invited 214 regular Facebook users to join a new fictitious social network. We experimentally manipulated the privacy-friendliness of an unavoidable and simple privacy policy. Half of our participants miscomprehended even this transparent privacy policy. When privacy threats of secondary data use were present, users remembered the policies as more privacy-friendly than they actually were and unwittingly uploaded more data. To mitigate such behavioural pitfalls we present design recommendations to improve the quality of informed consent.
Context: Recent studies have shown that, despite serious concerns regarding online privacy, users usually share their personal information online, which may in turn be used by third-party companies and possibly by strangers or social engineers. There is a discrepancy between what people say and what they actually do regarding their privacy; this behavior is known as the "privacy paradox." Excessive online information disclosure is one of the reasons for proliferating privacy concerns and also helps social engineers easily gather information about their targets. Objective: The objectives of this study are to: i) gather user privacy concerns reported in the literature and categorize them by themes or codes; ii) design a serious game using the privacy concerns identified; iii) evaluate the resulting game to educate participants about the dangers associated with excessive online disclosure of personal information. Method: To achieve the objectives and answer the research questions, we adopted two research methods. First, we performed a literature review (109+ studies) to extract user privacy concerns reported in the literature. Second, using these privacy concerns, a serious game was designed, developed, and empirically (preliminarily) evaluated for participants' awareness of the dangers associated with excessive online information disclosure. Result: From the results of our study we can summarize that: i) privacy concerns tend to have a positive long-run impact on users' tendency to become educated about the dangers associated with excessive information disclosure, and this awareness spills over into controlled online information sharing. In the short run, by contrast, social rewards/incentives make users share their information online, dominating the effects of privacy concerns; however, these rewards have only short-run impacts, and in the long run users come to understand the associated dangers. This realization spills over into controlled online information sharing and strengthens the effect of increased privacy concerns; ii) the proposed serious game has, in its initial phases, shown positive results in its aim to make participants aware of the dangers associated with excessive online disclosure.
In this paper, we address the problem of enhancing young people's awareness of the mechanisms involving privacy in online social networks by presenting an innovative approach based on gamification. In particular, we propose a web application that allows kids and teenagers to experience the typical dynamics of information spread through a realistic interactive simulation. Under the supervision of the teacher, the students are placed in a small artificial social graph and, through the different stages of the game, can post sentences with different levels of sensitivity, and "like" or share messages published by friends. At the end of a game session, the application calculates multiple behavioral scores that the teacher can use to raise the curiosity of the students and stimulate discussion. Moreover, a complete interactive report is generated to analyze every individual action of the terminated game sessions. Our educational tool has been employed in an extensive experimental study involving more than 450 kids and 22 teachers in seven Italian primary schools. The results show that our approach is stimulating and supports teachers in helping kids discover and recognize potential privacy risks in social network activities.
Internet privacy policies describe an organization's practices on data collection, use, and disclosure. These privacy policies both protect the organization and signal a commitment to integrity to site visitors. Consumers use the stated website policies to guide browsing and transaction decisions. This paper compares the classes of privacy protection goals (which express desired protection of consumer privacy rights) and vulnerabilities (which potentially threaten consumer privacy) with consumer privacy values. For this study, we looked at privacy policies from nearly 50 websites and surveyed over 1000 Internet users. We examined Internet users' major expectations about website privacy and revealed a notable discrepancy between what privacy policies currently state and what users deem most significant. Our findings suggest several implications for privacy managers and software project managers. Results from this study can help managers determine the kinds of policies needed to both satisfy user values and ensure privacy-aware website development efforts.
With the continuing growth of the Internet landscape, users share large amounts of personal, sometimes privacy-sensitive, data. When doing so, users often have little or no clear knowledge of what service providers do with the trails of personal data they leave on the Internet. While regulations impose rather strict requirements that service providers should abide by, the de facto approach seems to be communicating data processing practices through privacy policies. However, privacy policies are too long and complex for users to read and understand, thus failing in their very objective of informing users about the promised data processing behaviors of service providers. To address this pertinent issue, we propose a machine learning based approach to summarize rather long privacy policies into short, condensed notes, following a risk-based approach and using aspects of the European Union (EU) General Data Protection Regulation (GDPR) as assessment criteria. The results are promising and indicate that our tool can summarize lengthy privacy policies in a short period of time, thus supporting users in taking informed decisions regarding their information disclosure behaviors.
Vinayshekhar Bannihatti Kumar, Roger Iyengar, Namita Nisal, Yuanyuan Feng, Hana Habib, Peter Story, Sushain Cherivirala, Margaret Hagan, Lorrie Cranor, Shomir Wilson, et al. Finding a choice in a haystack: Automatic extraction of opt-out statements from privacy policy text. In Proceedings of The Web Conference 2020, pages 1943-1954, 2020.
Erlend Berger and Torjus Hansen Saethre. Privacity: A chatbot serious game to raise the privacy awareness of teenagers. Master's thesis, NTNU, 2018.
Alexandra Cetto, Michael Netter, Günther Pernul, Christian Richthammer, Moritz Riesner, Christian Roth, and Johannes Sänger. Friend inspector: A serious game to enhance privacy awareness in social networks. CoRR, abs/1402.5878, 2014.
LimeSurvey GmbH. LimeSurvey: An open source survey tool, 2021.
Kashmir Hill. Zynga's PrivacyVille - it's not fun, but it gets the job done. kashmirhill/2011/07/08/zyngas-privacyvilleits-not-fun-but-it-gets-the-job-done/, 2011.
Wijnand A. IJsselsteijn, Yvonne A.W. de Kort, and Karolien Poels. The game experience questionnaire. Eindhoven: Technische Universiteit Eindhoven, 46(1), 2013.
Sabine Trepte, Doris Teutsch, Philipp K. Masur, Carolin Eicher, Mona Fischer, Alisa Hennhöfer, and Fabienne Lind. Do people know about privacy and data protection strategies? Towards the "Online Privacy Literacy Scale" (OPLIS). In Reforming European Data Protection Law, pages 333-365. Springer, 2015.