Conference Paper



Abstract

The use of persuasion has become ubiquitous in interactive computing systems with the intent of increasing revenue, gathering user information and maximizing user engagement. However, it has been argued that users' autonomy is an ethical concern within persuasive UX design. This paper argues that there is a need to integrate ethics education within UX pedagogy and practice. Focusing on the topic of persuasive UX design, the authors propose a framework to educate design students about the ethics of persuasion. We introduced the framework to a user experience class of 22 students to integrate ethical perspectives within their design education. Through a pre/post activity, we show that the proposed framework increased students' critical attitudes towards persuasive designs. We also conducted in-depth qualitative interviews with five students to understand their experiences of the ethics education and its impact on their understanding of ethics. Through these interviews, we identified needs and gaps in ethics education, including the difficulties design students face in understanding and incorporating ethics in their design process. Based on the findings from this study, we discuss future directions for integrating ethics education in persuasive user experience design.
Keywords: persuasive technology; user experience design; ethics; autonomy; design education; ethics education
Persuasive technologies refer to interactive computing systems designed to influence people’s attitudes and
behavior (Fogg 2002). Recent years have seen the emergence of dark persuasive strategies, also known as
‘dark patterns’ (Brignull 2010). The term dark patterns refers to interface designs which intend to manipulate
or coerce users into acting in ways not desired by them or not in their best interests. Dark patterns are widely
used in interaction design and have been systematically identified in different contexts such as e-commerce,
games and privacy management (Bösch et al. 2016; Mathur et al. 2019; Zagal et al. 2013). The widespread
use of dark patterns and their potential harms have led to significant criticism. Brignull (2010) created a Hall
of Shame to criticize the use of dark patterns by various technology companies. On social media forums,
dark patterns have been met with public condemnation (Gray et al. 2020). Dark patterns have also received
the attention of policymakers, especially in privacy contexts, with several ongoing attempts to regulate them
(Warner & Fischer 2019). Despite these attempts, many application areas of dark patterns currently remain beyond the purview of regulation. It has therefore been argued that there is a need to better educate designers about the ethical aspects of persuasive design. Gray et al. (2018) argued that to address the problem of dark patterns, there is a need to integrate ethics education within UX pedagogy and practice. However, there are few reports in the literature of integrating persuasive design ethics within design education courses.
To address this gap, we developed and tested an educational framework for persuasive UX design ethics
at our university. We integrated the proposed ethics framework within a graduate user experience class. We
investigated the impact of the framework on students’ critical attitudes towards persuasive designs through an
in-class pre/post activity. We further conducted in-depth qualitative interviews with five students to
understand the impact of ethics education, and to identify needs and gaps which further need to be addressed.
Our findings suggest that the proposed ethics framework may enhance students' critical evaluation of persuasive UX
design. The findings from the interviews suggest that ethics education is necessary to create a space of
awareness and articulation of ethical concerns. It also promotes ethics understanding as well as ethical
reflection. However, we observed tensions between students needs as designers to create the most effective
designs, and their developing ethical inclinations provoked by the class. Through these observations, we
provide future directions for integrating ethics education in university-level interaction design curricula.
Persuasive design is an important aspect of UX design, and is often taught within interaction design courses.
Various theories of persuasion aid designers in influencing user decision making and affecting behavior
change (Fogg 2009; Oinas-Kukkonen & Harjumaa 2009). However, ethical concerns have also emerged.
Berdichevsky and Neuenschwander (1999) articulated perhaps the earliest ethical concerns about persuasion
in interactive technologies. They argued that persuasion should only be used to promote ethical outcomes,
and should respect the autonomy of the individual. Since then, several discourses have touched upon the
ethics of persuasive technologies. Fogg (2002) raised concerns about persuasive strategies such as using
emotions to persuade, deception, coercion, operant conditioning and surveillance. Verbeek (2009) argued that
ambient intelligence creates radically new experiences of persuasive design, which can encroach greatly on
our everyday activities and our choice processes. Ahuja and Kumar (2020) outlined the possible degrees of
autonomy harms inherent within the design of surveillance based persuasive technologies.
However, even though autonomy has been identified as an important lens with which to evaluate
persuasive designs, it is not always clear how each persuasive design impacts (enhances or undermines) the
autonomy of users (Mathur et al. 2021). Such an assessment requires a careful normative consideration of the
context, the users, the methods, the intentions and the outcomes of persuasion (Fogg 2002). Therefore, there is
a need to sensitize designers towards ethical concerns as a part of their education, and guide them to be more
adept at making such normative evaluations. Fiesler et al. (2021) argued that integrating ethics into technical
courses not only supports in-situ learning but also emphasizes to students that ethical practice is inherently a
part of technical practice. In educational contexts, Fiesler et al. (2020) report some university courses which
introduce students to ethics in technology design and to topics such as ethical theories, value-sensitive design
and dark patterns. However, these courses do not include in-depth ethical perspectives on persuasive UX
design from the perspective of users' autonomy. The considerations of autonomy have not been sufficiently
developed in design literature and therefore have not yet been integrated within educational contexts.
In this paper, we develop a framework for the evaluation of persuasive UX design from the perspective of users' autonomy. We report on a study in which we integrated the concepts from the framework into a graduate user experience class, to sensitize students towards ethical considerations pertaining to users' autonomy. Through a pre/post activity, we investigated the impact of the framework in enhancing students' critical attitudes towards persuasive designs. We also conducted in-depth interviews to
assess students’ experience of ethics education integrated within an interaction design course.
Within our proposed framework, we have expanded upon the consideration of autonomy for the normative
evaluation of persuasive UX design. Individual autonomy is understood in philosophical literature as an
individual’s right to self-governance (Buss & Westlund 2018). It is constituted by the freedom to be the
person one wants to be and to pursue one’s goals without unjustifiable hindrances or interferences (Roskies
2021). Within literature, autonomy has been argued to be an important normative consideration for the
evaluation of persuasive technologies (Susser et al. 2019). However, Mathur et al. (2021) argued that the
autonomy perspective is underdeveloped, appearing to label all interventions into decision making as dark.
Even though the autonomy perspective has not matured in design literature, within the field of behavioral
economics, there is significant literature which has developed this perspective. In a recent review, Vugts et al.
(2020) identified that within nudge literature, at least three distinct conceptualizations of autonomy were at
play within the ethics debate. The authors argued that these conceptualizations were distinct not only in their
interpretations, but also their normative implications in different contexts.
Fig. 1 Ethical Considerations from the Perspective of Users’ Autonomy
Based on past work, we have identified five distinct conceptualizations of autonomy for the normative
evaluation of persuasive UX design. These are agency, freedom, control, independence and authenticity. The
conceptualization of agency concerns an individual’s psychological capacity or competence to choose and
decide (Friedrich et al. 2018; Vugts et al. 2020). It involves the ability to reason based on information, in
accordance with one's preferences and goals. Agency is threatened by deceptive and manipulative persuasive
designs. The conceptualization of freedom concerns the availability of choices or options (Vugts et al. 2020).
A user's freedom is violated when their choices are restricted; however, freedom also requires that the user has practical, and not merely theoretical, access to the relevant choices (Hansen & Jespersen 2013). Threats to freedom also emerge from tactics such as fear, pressure and coercion (Brehm 1966). The conceptualization of control signifies that a user has the opportunity to consent to an interactive system's actions or behaviors. Loss of control often manifests in the form of non-transparent automatic
behaviors of the system, such as automatic updates or background data processing (Gray et al. 2020).
Autonomy in the sense of independence is grounded in the notion of non-reliance. This conceptualization of
autonomy represents that a user is able to self-regulate. Independence implies absence of any compulsions
and dependencies, such as those created by games (Lewis 2014). The conceptualization of authenticity
requires a person to act for reasons grounded in her authentic self, reasons which can be considered her ‘own’
(Betzler 2009). Ethical concerns about authenticity have been discussed significantly in relation to
neurotechnology, however other non-invasive technologies such as marketing have also been argued to pose
threats to authenticity (Burwell et al. 2017).
This framework is, to our knowledge, the first to expand upon the normative consideration of users'
autonomy. Within design literature, the perspective of autonomy has not been developed and the theoretical
underpinnings of ethical concerns have not been articulated. Wherever autonomy concerns have been
discussed, they have been raised under the terminology of manipulation, deception and coercion
(Mathur et al. 2021; Susser et al. 2019). These terminologies often accurately describe ethical concerns with dark patterns; however, ethical concerns such as technology addiction, or continued reliance on technology
(the independence conceptualization), are not very well captured. Similarly, concerns about authenticity have
also not been articulated in design literature as part of the autonomy perspective. The aim of this framework
is to inculcate in students a holistic understanding of autonomy as an ethical consideration, and to broaden
the sphere of design responsibility. This framework can situate within a normative lens a wide range of
ethical concerns which have previously not received sufficient attention within the autonomy perspective.
This study was conducted within a user experience design course at our university. The student cohort
primarily consists of graduate students in Design; however, the course is open to other students as an
elective. The sessions were virtual due to the ongoing pandemic.
4.1 Course Overview
This course is geared towards developing the students' understanding of the UX design process. As part of the course curriculum, the students are exposed to the persuasive design process. Concepts of persuasion are introduced through Cialdini's (2007) six principles of persuasion. This is followed by an introduction to cognitive biases, and a detailed discussion of biases such as framing, priming and anchoring (Tversky & Kahneman 1974). The students are encouraged to explore the many biases documented in the literature. The
course is evaluated only through assignments, activities, and a reflective journal, without any formal exams.
The course overview is presented in Fig. 2.
Fig. 2 Course Overview of Relevant Contents
For this study, ethical considerations were explicitly introduced into this course after the students were
introduced to persuasive design and had already completed an assignment to design a persuasive e-commerce
website. The ethics discussions were introduced within two sessions of 1.5 hours each. We first introduced
the students to the term dark patterns and familiarized them with some dark pattern terminologies in e-
commerce, games and privacy management. We also introduced students to examples of persuasive designs
which could not be intuitively construed as ‘dark’ such as search order effects, one-click shopping and step
counting features in m-health apps. The purpose of these examples was to introduce a balanced perspective
on persuasive design, and indicate that several applications lie in a gray zone which could not be
immediately dismissed as dark. Then we introduced the concept of macrosuasion or macro-persuasion (Fogg
2002; Rogers et al. 2020). We wanted to help students understand how persuasive designs may work
cumulatively to serve an underlying intent geared towards a particular form of behavior change. More
recently, Westin and Chiasson (2021) have also described this phenomenon and labelled it as dark
infrastructure. After a broad introduction to the topic of dark patterns and dark infrastructure, we introduced
individual autonomy as an ethical consideration. At this stage, we only introduced students to the broad
understanding of autonomy (Buss & Westlund 2018; Roskies 2021).
4.2 Activity 1: Evaluation of Ethics
By this stage, the students only had a broad understanding of autonomy and had not yet been exposed to the
different ethical considerations within the autonomy framework. Before the introduction of our proposed
framework, we sought to capture students' evaluations of ethics. Therefore, we conducted an activity in
which we asked students to evaluate seven different examples of macro-persuasions. We chose macro-
persuasions for this study because they each capture a wide range of autonomy concerns. After a discussion
of the persuasive designs which facilitated these macro-persuasions (Table 1), the students were asked if the
given macro-persuasions were ethical. They were asked to rate these persuasions on a scale of 1 (Not at all)
to 5 (Very much) in a Google form.
This activity was followed by an introduction to the proposed framework. We introduced students to
autonomy as an ethical consideration. We then introduced them to the concept of conceptualizations: how
autonomy can have different meanings and be experienced in its different senses. We showed how different
persuasive strategies impacted autonomy in a different sense, such as the difference between having limited
choices and being manipulated into making certain choices. The purpose of this framework was to broaden
students' understanding of autonomy as an ethical value, and introduce relatively novel perspectives such as
control, independence and authenticity. After this introduction to the proposed framework, students were
again asked to rate the macro-persuasions in Table 1 and evaluate whether they felt they were ethical or not,
on a scale of 1 (Not at all) to 5 (Very much). We hypothesized that the students would become more critical
of the same macro-persuasions after being exposed to a broadened perspective on users' autonomy.
Table 1. Macro-Persuasions Evaluated in Activity 1 (with the persuasive designs which facilitate each macro-persuasion)

Digital Privacy Decision Making: features designed to maximize data collection, such as cookie banner design, privacy-unfriendly defaults, restricted access to websites, forced registrations, timing of consent and information complexity

E-Commerce Discounting: features designed to drive up the perception of value in e-commerce, such as deals, flash sales, offers, cashbacks, limited discounts, heavy discounts, urgency and scarcity

Digital Content Algorithms: presence of fake and exaggerated news, amplification of polarized content, likes/shares driven amplification of content, personalized targeting of vulnerable populations

User Engagement Through Interruptions: nagging, pestering or interruptive features such as advertisements, popups, notifications, spam messages, emails and robocalls

Asymmetric User Effort: digital choice architectures with imbalanced effort to access choices, such as flight booking vs. cancellation, payments vs. refunds, ordering vs. returning, subscription vs. cancellation, complicated IVRs, chatbot or customer complaint processes

Digital Addiction: features designed to drive or exploit obsessive/compulsive engagement in games or social media, such as grinding, streaks, likes, shares, rewards and punishments

Health Monitoring: step counting, heart rate monitoring, activity logging, sleep monitoring, weight and calorie management, linking of health monitoring to authoritative databases (school, employer, insurance, government)
4.3 Student Interviews
After the course, we interviewed five students on a voluntary basis to understand their experience of the class and their takeaways. Three of the interview participants identified as male and two identified as female. The mean age of the participants was 26. The interviews were audio recorded with the participants' consent. Four interviews were conducted in-person and one was conducted virtually. The interviews lasted an average of 1 hour each, with a total recorded audio of 4 h 54 m.
We began the interview by asking the participant if they remembered anything particular from the class.
We asked if they had previously been aware of the dark designs that were discussed. We asked about the
examples of persuasions which they found most unethical. We probed the participants on their perspective of
ethics, by asking which persuasive strategies they would not personally design. We occasionally brought up
examples of persuasive designs such as infinite scroll, cookie banner, games, and interruptive ads, to
understand how students came to their ethics judgments. We probed the participants on their sense of design
responsibility, gathering their perspective on how ethical responsibility should be shared between the
designer and the user. Then, we sought feedback on the class itself, and probed their understanding of and
perspective on the proposed framework that was introduced in the class.
5.1 Activity 1: Enhanced Critical Attitude
Through the analysis of Activity 1, we investigated the change in students' critical attitudes towards the seven macro-persuasions after being introduced to our proposed framework. The activity was completed by 22 students in
the pre-condition and 17 students in the post-condition, hence 5 students were excluded from the analysis.
The mean rating given by students to each macro-persuasion in the pre and post conditions is presented in
Table 2. We found a significant overall reduction in the mean rating (p<0.05, one-tailed paired t-test). Among
individual scores of seven macro-persuasions, we found significant reduction in the mean for two cases (*).
In the other five cases, a statistically non-significant reduction of the mean was observed.
Table 2. Activity 1: Pre/Post Results (pre and post mean ratings for Digital Privacy Decision Making, E-Commerce Discounting, Digital Content Algorithms, User Engagement Through Interruptions, Asymmetric User Effort, Digital Addiction, Health Monitoring, and the Overall Average; the numeric ratings are not preserved in this copy)
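As an illustration of the analysis described above, the one-tailed paired t-test can be computed directly from per-student mean ratings. The sketch below is our own illustration, not the authors' analysis script, and the ratings in it are hypothetical stand-ins since the study's raw data is not reproduced here.

```python
import math
from statistics import mean, stdev

def paired_t_one_tailed(pre, post):
    """One-tailed paired t-test for a reduction in ratings (pre > post).

    Returns the t statistic and degrees of freedom. Differences are taken
    as pre - post, so a positive t supports the hypothesis that ratings
    decreased; compare t against the one-tailed critical value for df.
    """
    assert len(pre) == len(post), "paired design requires equal-length samples"
    diffs = [a - b for a, b in zip(pre, post)]
    n = len(diffs)
    se = stdev(diffs) / math.sqrt(n)  # standard error of the mean difference
    return mean(diffs) / se, n - 1

# Hypothetical per-student overall mean ratings (1 = Not at all, 5 = Very much),
# before and after the ethics framework was introduced -- illustrative only.
pre  = [3.4, 2.9, 3.1, 3.6, 2.8, 3.3, 3.0, 3.5]
post = [2.9, 2.5, 3.0, 3.1, 2.4, 3.2, 2.6, 3.0]

t, df = paired_t_one_tailed(pre, post)
# For df = 7 and alpha = 0.05, the one-tailed critical value is about 1.895;
# a t statistic above it indicates a significant reduction in the mean rating.
print(f"t = {t:.2f}, df = {df}")
```

With SciPy installed, `scipy.stats.ttest_rel(pre, post, alternative='greater')` performs the same test and additionally returns a p-value.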
5.2 Student Experiences: Needs, Gaps and Outcomes
Fig. 3 summarizes the findings from the interviews. We open coded the interview transcripts and identified the emerging themes in the discussion. We observed that students' experiences and comments on the sessions fell into three broad themes: Need for Ethics Education, Expected Outcomes, and Gaps in Ethics Education.
5.2.1 Need for Ethics Education
Awareness: The introduction of ethical considerations served the purpose of creating awareness. Several participants remarked that they were not previously aware of these ethical issues. For instance, P1 said that 'many things were very new to me, I was aware about few things before, but I didn't know the science or the actual strategy'. P2 mentioned that he was aware of ethical issues in AI and neuroscience but not in interface design, and was not aware of dark patterns. P4 and P5 found the knowledge about dark patterns novel. P5 also remarked that earlier she used to ignore things like terms and conditions, but now she realizes that 'it's purposely designed like this'.
Fig. 3 Findings from Student Interviews
Articulation: The discussion on ethics created a space for students to articulate their concerns. P2 mentioned that, while persuasion was being taught within the curriculum, he had doubts such as 'why are we persuading people and why do we want to get more people if they don't want'. P5 also mentioned that questions of ethics emerged when the class was taught persuasive principles. The students were often passively aware of ethical issues but did not ponder over them. P3 said that 'we all know but never discuss, sometimes you don't even bother, because these are things that are so frequently happening'. P4 mentioned that the examples discussed in the class were 'very much in tandem with what I see in real life'. Students also discovered a frame to describe their observations and experiences. P2 discussed his past game addiction and P5 mentioned her sister's Snapchat streak addiction. P5 also described how in the past she felt addicted to Instagram, where her finger automatically kept going to the app's location on her phone screen, even when the app was no longer there. She described the realization as 'scary'.
Importance: We also observed that students perceived value in ethics education. P1 said that teaching ethics in the UX class made a lot of sense and he thought that 'ethics must be taught in every design course'. P4 said that the course can act as 'a bible for ethics'. P5 said, 'I feel as students we should be taught about the other aspects, like we are doing certain things, okay the industry is demanding certain things, but is it ethical, I mean that thought provoking at least should be there, and it should be taught ideally, and properly'. She felt that 'if collectively there are students learning this thing in education system, because these students go in industry, and they build careers or they start companies, and I'm sure they would kind of implement it, so the change can start from there where it is taught to us, and not just persuading people to perform in a certain way, that can be a start'.
5.2.2 Expected Outcomes
Ethics Sensitization: Students appeared to develop a sense of responsibility after the session. P1 said that 'now I have to think a lot and plan a lot to implement any persuasion technique, I need to be more holistic in a way, basically I really don't want to violate the fundamental rights of people'. P2 said that he wants his design to be inclusive. P3 believed that design responsibility is shared. She said, 'being a designer I would definitely want that my app is used, that means it has great experience and great UI, but then it's harming the user, so it is the designer's responsibility to understand that the user is being harmed and then you provide those limitations, those are the duties of the designer, and for the user it is to use those features which are there'. P5 said that after the sessions, she tried to incorporate concepts from the class into her final project as part of the course.
Ethical Reflection: The interviews indicated that the session introduced ethical reflection within the design process of students. P1 said that the session 'was not just knowledge, it was good reflection' and it 'took away that excitement about persuading people'. P3 said that 'every decision, every click that I would make in that application, there would be a constant reflection on what would it do, is it asking for data, is it asking for something which is not required, every decision every click would be conscious to see what is the future'. P5 said that when ethical concerns come into play, she needs to be able to take a step back. With regards to her final project, she said that after the session 'a lot of questioning started happening that what I am doing, is it okay if I'm doing like this, will it be overwhelming for the user or will the user not even realize that they are being taken somewhere without giving them that choice'.
Ethical Understanding: The session helped students develop their own perspectives and understanding of ethics. Several students gave examples of the kinds of designs they find unethical and would personally not use in their design activity, such as fake discounts (P3), non-consensual data extraction (P3, P5), dishonest information (P4), mindlessly addictive streak features (P5) and decoy effects (P5). Students also discussed their strategic approach towards ethical design. P1 said that he would respect people's boundaries, P3 said that she would try to make persuasion beneficial for users, and persuade with transparency. P4 said that he would use 'consensual roadblocks' in his design.
5.2.3 Gaps in Ethics Education
Addressing Roadblocks: We observed that several students imagined themselves in future corporate roles where they may not have complete control over their design outcomes. Therefore, they might not always find it practically possible to be as ethical as possible. P1 said, 'if I'll be working for a company then I believe things won't be in my hand completely'. P4 said that ethics is a 'guilty pleasure', and might be incompatible with working in a company. He also said that in some cases design responsibility is not the role of the designer 'because this decision would have come from somewhere else and the designers would have some brief'. P5 was worried: 'when I'll be in the industry, then these corporations, they'll be asking me to do stuff and how much of a say will I have and it is scary that ultimately it might cost me my job if I say no to it'. She also said that 'most of the times it's not in our hands because designers are usually not in the CXO levels'.
Tradeoffs and Dilemmas: The students expressed conflict in several situations about how to identify the
boundaries between ethical and unethical design, how to make tradeoffs of ethics against profit, how to
balance autonomy in relation to other values and how to prioritize different conceptualizations of autonomy
when they diverged. P3 expressed that she was conflicted about health monitoring systems, because they help
her but at the same time make her dependent. P4 expressed a form of empathy with designers or businesspersons trying to grow their companies and be profitable.
Need for Guidelines: Students expressed a desire for explicit guidelines, detailing how ethical design should
be approached. P1's suggestion for guidelines stemmed from the difficulty of resolving ethical tradeoffs.
However, the need for guidelines was also echoed by P4 who felt persuasion was essential to profit, and that
guidelines and/or regulation was essential for fair play in the market.
In Activity 1, we observed that introducing a detailed perspective on users' autonomy enhanced students' critical attitudes towards macro-persuasions. Since this study was conducted with a small sample size and with limited time to introduce the framework, the effects were significant only for certain examples, even though the overall effect was significant. However, these partial results are encouraging regarding the usefulness of
introducing a detailed perspective of autonomy for the normative evaluation of persuasive designs. The
framework introduces conceptualizations of autonomy which are not intuitively compatible with the current
understanding of design ethics and design responsibility, such as the conceptualizations of authenticity and
independence. We argue that due to the inevitable impact of persuasive design on users' autonomy in these
diverse senses, there is a need to expand the sphere of design responsibility. The framework integrates a
broader and more holistic range of autonomy concerns within design education.
In student interviews, we observed a perception that being ethical is too idealized, that it is not
compatible with profit making. Students appeared to perceive their future selves as not being in control of
their design activity. They referred to being treated as a tool to fulfil the ends of business stakeholders.
Even when not concerned about external stakeholders, P4 sympathized with designers' need to make profit in
a competitive business. Therefore, we observed a false dichotomy between ethics and profit, where students
come to believe that ethical design is incompatible with profit. We argue that there is a strong need to address
this sense of conflict, for ethics education to be effective. We also observed a call for guidelines. In several
cases, students were genuinely unable to make ethical tradeoffs, such as in the case of infinite scroll. In this
case, students were conflicted on whether such a feature is ethical, because it is enjoyable to a portion of
users, whereas a portion of users also tend to lose self-control, and hence it leads to compulsive or obsessive
engagement in some cases. The students also desired some guidelines so that all companies have to play by
the same rules, and this observation was related to the earlier observation of students perceiving some
incompatibility between ethics and profit.
Overall, the results of this study were encouraging. The introduction of ethics content within an
interaction design class fulfils a number of students' needs. It creates much needed pragmatic awareness
about dark designs, and creates a space to articulate ethical concerns rooted in existing ethical intuitions and
personal experiences. The students themselves acknowledge the need and find value in such education. The
paper suggests that with further improvement based on the feedback, the proposed framework can be used to
sensitize students towards ethical concerns in persuasive UX design in educational contexts.