The Dark (Patterns) Side of UX Design
Colin M. Gray, Yubo Kou, Bryan Battles, Joseph Hoggatt, and Austin L. Toombs
Purdue University
West Lafayette, IN
{gray42; kou2; bbattles; jhoggatt; toombsa}@purdue.edu
ABSTRACT
Interest in critical scholarship that engages with the complexity
of user experience (UX) practice is rapidly expanding,
yet the vocabulary for describing and assessing criticality in
practice is currently lacking. In this paper, we outline and
explore the limits of a specific ethical phenomenon known
as "dark patterns," where user value is supplanted in favor of
shareholder value. We assembled a corpus of examples of
practitioner-identified dark patterns and performed a content
analysis to determine the ethical concerns contained in these
examples. This analysis revealed a wide range of ethical issues
raised by practitioners that were frequently conflated under
the umbrella term of dark patterns, while also underscoring a
shared concern that UX designers could easily become complicit
in manipulative or unreasonably persuasive practices.
We conclude with implications for the education and practice
of UX designers, and a proposal for broadening research on
the ethics of user experience.
ACM Classification Keywords
H.5.m. Information Interfaces and Presentation (e.g. HCI):
Miscellaneous; K.7.4. Professional ethics: Codes of ethics.
Author Keywords
Dark patterns; ethics; design character; design responsibility;
UX practice; practice-led research.
INTRODUCTION
There is increasing interest in critical aspects of HCI and UX
practice in the CHI community, including engagement with
the impact of technology and design on society (e.g., [5, 22])
and the role of the designer in bringing about responsible
change, particularly for vulnerable populations (e.g., [20, 39,
74]). While the third paradigm of HCI has taken up critical-
ethical concerns as a key aspect of humanistically-inspired
praxis [8, 40], the everyday practice of designers in relation
to these concerns has not been sufficiently studied. We take
on a practice-led orientation in this paper to understand more
fully how practitioners are already engaging in and conceptualizing
social responsibility in their work on their own terms.
We focus specifically on the practitioner-created construct of
"dark patterns" and its emergence in the practitioner lexicon,
defining a co-opting of human-centered values in the service
of deceptive or malicious aims.
While ethics and values have been studied extensively in the
HCI and Science & Technology Studies (STS) literature (e.g.,
value levers [63], values at play [25], critical design [10, 7, 23],
reflective design [60, 61]), these conversations are frequently
bound to the academic community and related discourses,
making practitioner access to these conversations difficult and
activation of their implications problematic. While there are
clear uptakes from these conversations for UX practice, it is
unclear whether these methods or approaches are used in UX
practice to raise or foreground awareness of criticality, an issue
which is complicated by a tenuous relationship between the
academic and practitioner communities surrounding the use
of methods and knowledge [18, 37, 36, 52].
We use the term dark patterns to define instances where design-
ers use their knowledge of human behavior (e.g., psychology)
and the desires of end users to implement deceptive functionality
that is not in the user's best interest [17, 38]. While
understudied in the HCI literature (see [38] for one rare example),
the popular press (e.g., [58, 65, 71]) and practitioners
(e.g., [16, 41, 59]) have latched onto this term as a means of
discussing the danger of manipulative design practices, and
often as a proxy for broader discussions of ethical and value-
centered practice. These discussions appear to have broad
relevance for the HCI community, both in grounding critical
discussions in a practice-led way, and in activating concepts
relevant to critical views of practice through specific instances.
In this study, we ground our understanding of dark patterns
in the artifacts and conversations of UX practitioners. First,
we generated a corpus of exemplars that were shared by prac-
titioners as indicative of dark patterns in action. Second, we
analyzed this corpus to reveal the kinds of ethically-dubious behaviors
that could be linked to dark patterns, both on a surface
level, and in relation to which design features and interactions
were most ethically salient. We then cross-referenced these behaviors
with instances from the critical HCI and STS literature
to reveal opportunities for further research and investigation
in both the academic and practitioner communities.
The contributions of this work are three-fold: 1) we describe
and analyze instances of dark patterns, grounding and clarifying
this practitioner-defined phenomenon for an academic
audience; 2) we set an agenda for further study of the ethical
dimensions of UX practice, including instances where manipulation,
deception, and unwarranted levels of persuasion might
appear in emergent technologies and forms of interaction; and
3) we identify the implications of dark patterns in relation to
the ethical responsibilities of UX practitioners, including their
understanding and performance of values.
ETHICS IN THE DESIGN OF USER EXPERIENCE
Numerous framings of ethics and values have been explored
in the HCI community and beyond in the last two decades
(e.g., [7, 25, 27, 28, 30, 60, 62, 64]). Many methods and
research approaches to design (e.g., critical design, reflective
design, adversarial design) take on an ethical standpoint, but
these efforts have historically been largely focused on the
development of scholarship, with less clear implications for
"everyday" commercial UX design. While many of these
constructive approaches have been shown to be effective in a
research context as a generative tool to understand the ethical
implications of designs, none have been rigorously tested in
professional practice or educational settings for their efficacy
in increasing ethical awareness and decision-making. Even
fewer of these methods have been disseminated to practitioners
in ways that resonate with method use "on the ground" [34],
resulting in a lack of adoption and impact.
We will briefly summarize the dominant ethics or value-related
methods or approaches to further explore how HCI and STS
scholars have addressed this space in differing ways.
Value-Sensitive Methods
Several groups of scholars have proposed methods that foreground
values at varying points in the design process. Value-Sensitive
Design (VSD) has been one of the most comprehensive
frameworks developed to address the question of values in
design, described by its creators as "a theoretically grounded
approach to the design of technology that accounts for human
values in a principled and comprehensive manner throughout
the design process" [29]. VSD has been critiqued for its inadequate
guidance in informing value classifications, selecting
appropriate empirical instruments, and ordering of the design
process [48]; however, there has been sustained interest in finding
ways to build value-discovery and action into the design
process [13, 44]. In parallel with this work, Flanagan and
Nissenbaum [25] have proposed a design framework known
as Values at Play that has been focused on game development
practices. This framework is intended to encourage designers
of digital games to include ethical considerations throughout
the design lifecycle, including discovery of values relevant to
the project early in the process, translation of these values into
concrete design decisions, and later verification that the final
design appropriately embodies the values that were identified.
Shilton and colleagues [64, 62, 63] have more recently proposed
the concept of value levers as a means of connecting
value-related insights with design action. Through ethnographic
engagement, Shilton has suggested the role of certain
activities in facilitating conversations about values that
can then be addressed as design criteria [62], broadening the
conversation beyond the existence of VSD-related methods to
their use in practice [63]. Concurrently, Benford et al. [11]
have explored how VSD approaches might be constructively
extended as part of HCI's turn to the cultural through conceptual
applications such as transgression, boundaries, consent,
withdrawal, data, and integrity. Frauenberger et al. [28] have
also done recent work in this space, challenging static and formalized
notions of ethics in HCI; diverging from the practice
focus of Shilton and colleagues, Frauenberger et al. identify
mechanisms for raising ethical issues in more situated and
exploratory ways within the HCI research tradition.
Critical and Reflective Design
Alongside humanist and value-based approaches to inquiry in
HCI, critical design has emerged as a means of foregrounding
ethical and societal concerns through design and interpretive
practices [10, 9, 56]. Critical design builds upon traditional
design practices, but rather than resulting in artifacts that affirm
current societal norms, the designer creates artifacts or
experiences that allow key societal norms and values to be
openly interpreted and questioned [21]. Bardzell et al. [10,
9] have previously proposed an approach to analyzing critical
designs, building upon both a corpus of exemplars [23] and
patterns of humanistic interpretation [8] to foreground critical
dimensions of these artifacts. However, these approaches
require substantial skill of interpretation that must be honed
over time, and there is little evidence that this method of producing
or interpreting critical artifacts has been taken up by
the practitioner community.
Sengers et al. [60] have proposed the reflective design method,
which allows the design team to activate criticality within the
user research process, improving elicitation of tacit knowledge
and encouraging defamiliarized or otherwise novel user and
designer perspectives on existing interactions. Reflective design
has been used to increase awareness of underlying social
tensions or constraints that may have value for further design
activity; however, it is unclear how often this method is used
in practitioner settings, and its commitment to foregrounding
and acting upon critical perspectives is ambiguous.
Persuasive Design
Design is inherently a persuasive act [54, 57, 67], where the
designer creates intentional change in the world that either directly
or indirectly induces behavioral or social change. Studies
of persuasive technology over the past two decades have
indicated the potential benefits of designing for persuasion
[26, 27], while other scholars have identified ethical concerns
surrounding the responsibility of designers in this framing [19,
12]. Fogg [27] views persuasive technology as "[designing
for] behavior as something we cause to occur [... and/or] preventing
a target behavior from happening." This shaping of
behavior is proposed to be accomplished through seven persuasive
strategies: reduction, tunneling, tailoring, suggestion,
self-monitoring, surveillance, and conditioning [26].
While persuasive technology is often praised for the good it
is capable of producing in society and individual life, such as
encouraging socially responsible behavior or the bettering of
personal habits, there are also substantial ethical considerations
regarding designing explicitly to persuade. Berdichevsky
and Neuenschwander [12] identify some of the ethical patterns
designers engage in when creating persuasive technologies,
accounting for the role of motivation and intent as it relates to
future use of the technology (see also [50]). These scholars
also recognize the possibility that "dark" versions of these
design outcomes may exist, and propose a series of design
principles to guard against ethically dubious behaviors. These
principles include: dual-privacy, disclosure, accuracy, and the
"golden" principle [12]. In this study, we do not frame UX
design as being driven by a persuasive technological approach,
but do acknowledge the persuasive intent underlying all
design activity. Therefore, we would expect that dark patterns
that have emerged in UX practice may have resonance
with persuasive strategies previously proposed by Fogg, albeit
twisted from their original purpose and ethical moorings.
A TURN TO (ETHICALLY-CENTERED) PRACTICE
Resonant with the third paradigm of HCI scholarship and practice,
scholars have proposed a "turn to practice"—engaging
with the design complexity of practitioners rather than attempting
to frame designer behavior only or primarily through extant
academic models or theories [45, 66]. Historically, a lack of
communication between HCI practitioners and the academic
community has resulted in a gap in knowledge-building and
poor resonance between the pragmatic needs of practitioners
and available theory and method [37, 66]. Some scholars have
attempted to address this gap, such as Colusso et al.’s [18]
recent proposal for translational resources to bridge the theory/practice
gap, and Gray et al.'s [37] model of trickle-down
and bubble-up flows of knowledge. In the context of this paper,
the conversation regarding humanistic, value-centered and
ethically-centered design practice has not spanned research
and practice concerns, resulting in silos of practitioner knowledge
regarding dark patterns that is primarily pragmatic and
grounded in ultimate particulars, while academic knowledge
is focused on higher-level ethical theories and methods.
Thus, we see multiple opportunities to connect ethical concerns
felt in practice with theorization of ethics and values
in the academic discourse. The model of van Wynsberghe
et al. [70] provides one such model of ethics in design practice,
creating a space for ethical theory to be taken up in a
pragmatic and generative, rather than objective and static, way.
This joining of design action and ethically-mediated judgment
has already found its way into the education of engineers and
technologists (e.g., [51, 68]), and is also present in the everyday
professional discourse of UX designers (e.g., [15, 42]). By
taking a practice-led approach, we hope to join these disparate
discourses and find common ground.
DARK PATTERNS AS CRITICAL—ETHICAL
While ethical and value-centered approaches and methods
have emerged primarily from academia (e.g., [25, 62, 60]),
and have not had a strong history of adoption in practice, the
concept of "dark patterns" has been raised by UX practitioners,
often in conjunction with ethical concern more generally
(c.f., Nodder's book Evil by Design [54]). Thus we take an
academically-initiated "bubbling-up" approach [37] to understanding
this concept and its value for practitioners in situ.
To do this, we attempt to stitch together the interest of HCI
scholars in criticality, ethics, and values and the pragmatic
ethical points of interest for UX practitioners "on the ground."
To expand the context of dark patterns, we first go back to the
source, UX practitioner Harry Brignull, who has a doctoral
degree in cognitive science. He first proposed the idea of
ethically dubious design approaches through this neologism in
2010, defining dark patterns as: "a user interface that has been
carefully crafted to trick users into doing things...they are not
mistakes, they are carefully crafted with a solid understanding
of human psychology, and they do not have the user’s interests
in mind" [17]. However, this definition leaves many important
and unanswered questions for the HCI academic community.
Which user or stakeholder interests are or should be kept in
mind? What is the user being "tricked" into doing, and with
what motivation? Are there instances where being tricked
into doing something is desired by the user (e.g., persuasive
technologies [27])? Can interactions that were not designed to
trick the user later become dark patterns as use or technological
infrastructure changes?
To engage with these questions, we built upon an existing typology
of dark patterns proposed by Brignull (Table 1). These
categories represent subversion of user-centered design principles,
where designers use knowledge about users (and society
more broadly) against them. While some instances of these
behaviors are simply a result of poor design—a phenomenon
known as "anti-patterns" (c.f., [43])—many of these dark patterns
result from explicit, purposeful design intentions. This
typology has become a standard of reference for the UX design
community, but is currently disconnected from the critical-ethical
conversation in the academic HCI community. Further,
the typology mixes context, strategy, and outcome, making
comparison among patterns difficult. We extend Brignull’s
typology in this paper, attempting to connect it more strongly
to existing HCI literature while also making the categories
more tractable for use and interrogation by practitioners.
OUR APPROACH
In keeping with a practice-led research approach, we focus
our study on the identification of exemplars that were shared
by practitioners with the intention of illustrating dark patterns
in action. Since the identification of dark patterns as an ethical
phenomenon in 2010, Brignull has collected examples in a
"Hall of Shame" from which some of our corpus is drawn [17].
Numerous practitioners and journalists have also referenced
additional exemplars in articles, on personal websites, and
through other media outlets, leading us to believe that practitioners
are well-acquainted with the concept of dark patterns.
We collected and analyzed this corpus to identify potential
designer motivations these dark patterns might point towards.
Corpus Generation
Drawing on previous approaches to generating a comprehensive
corpus (e.g., [23]), two researchers employed an exploratory
and iterative process to collect artifacts related to
the idea of dark patterns for a two-month period. Both researchers
were experienced in using Internet sources to perform
everyday information-seeking tasks. We purposefully
selected one researcher from a user experience background,
and one researcher from a computer science background. Using
this approach, we were able to integrate perspectives from
both HCI/UX insiders and outsiders. The research team held
weekly meetings to discuss their latest findings about dark
patterns as well as to refine search strategies, thereby carefully
expanding the size and diversity of the corpus.

Bait and Switch: You set out to do one thing, but a different, undesirable thing happens instead.
Disguised Ad: Adverts that are disguised as other kinds of content or navigation, in order to get you to click on them.
Forced Continuity: When your free trial with a service comes to an end and your credit card silently starts getting charged without any warning. In some cases this is made even worse by making it difficult to cancel the membership.
Friend Spam: The product asks for your email or social media permissions under the pretence it will be used for a desirable outcome (e.g. finding friends), but then spams all your contacts in a message that claims to be from you.
Hidden Costs: You get to the last step of the checkout process, only to discover some unexpected charges have appeared, e.g. delivery charges, tax, etc.
Misdirection: The design purposefully focuses your attention on one thing in order to distract your attention from another.
Price Comparison Prevention: The retailer makes it hard for you to compare the price of an item with another item, so you cannot make an informed decision.
Privacy Zuckering: You are tricked into publicly sharing more information about yourself than you really intended to. Named after Facebook CEO Mark Zuckerberg.
Roach Motel: The design makes it very easy for you to get into a certain situation, but then makes it hard for you to get out of it (e.g. a subscription).
Sneak into Basket: You attempt to purchase something, but somewhere in the purchasing journey the site sneaks an additional item into your basket, often through the use of an opt-out radio button or checkbox on a prior page.
Trick Questions: You respond to a question, which, when glanced upon quickly appears to ask one thing, but if read carefully, asks another thing entirely.
Table 1. Types of Dark Patterns, quoted from [17].
Two researchers started collecting the corpus using popular
online platforms, including search engines such as Google
and Bing; social media sites such as Facebook, Twitter, Reddit,
and designers' blogs; commercial websites focused on
technology and design, such as Wired.com; and individual
UX practitioners' professional websites. Initially the two researchers
used the keyword dark patterns, its derivatives, and
its platform-specific forms such as the hashtag #darkpatterns
on Facebook and Twitter. In this way, we identified an initial
corpus containing examples, articles, or resources shared by
ordinary users, designers, and journalists. At this point, we
realized the necessity to iteratively broaden the keyword set
for Internet search. First, we identified key terms such as "unethical,"
"evil," and "manipulative" and searched these terms
interchangeably with terms such as "design," "UX," and "interface"
in order to find possible cases of unlabeled dark patterns.
Throughout the process, we used a bot to automatically collect
all the tweets with the hashtag #darkpatterns from Twitter.
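To illustrate the mechanics of this collection step, a minimal sketch of such a script appears below. This is not our actual bot: it assumes Twitter's v1.1 search endpoint and a bearer token supplied through the environment, and a production collector would add rate-limit handling and deduplication.

```typescript
// Hypothetical sketch of a #darkpatterns tweet collector; not the authors'
// actual bot. Assumes Twitter's v1.1 search API and a bearer token supplied
// via the environment.
const BEARER_TOKEN = process.env.TWITTER_BEARER_TOKEN;

interface Tweet {
  id_str: string;
  created_at: string;
  text: string;
  user: { screen_name: string };
}

async function collectHashtag(query: string, maxPages = 5): Promise<Tweet[]> {
  const tweets: Tweet[] = [];
  let maxId: string | undefined;

  for (let page = 0; page < maxPages; page++) {
    const params = new URLSearchParams({ q: query, count: "100" });
    if (maxId) params.set("max_id", maxId);

    const res = await fetch(
      `https://api.twitter.com/1.1/search/tweets.json?${params}`,
      { headers: { Authorization: `Bearer ${BEARER_TOKEN}` } }
    );
    const body = (await res.json()) as { statuses: Tweet[] };
    if (!body.statuses || body.statuses.length === 0) break;

    tweets.push(...body.statuses);
    // Page backwards from the oldest id seen; a real collector would
    // subtract 1 from this id to avoid re-fetching the boundary tweet.
    maxId = body.statuses[body.statuses.length - 1].id_str;
  }
  return tweets;
}

collectHashtag("#darkpatterns").then((t) =>
  console.log(`collected ${t.length} tweets`)
);
```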
In total, we collected a corpus of 118 artifacts. Of these, 45
were collected from social media outlets, 40 from practitioner
blogs, 19 from news outlets, 10 from our own interactions
with products in our daily lives, and 4 from darkpatterns.org.
We catalogued each exemplar, noting its source, the "culprit"
website or app, and a visual depiction of the pattern. In addition,
we wrote researcher memos to note any salient dark
patterns, either drawing on provided descriptions or our own
interpretations of the artifact.
Analysis
Once we generated a corpus of exemplars, we used the constant
comparative method [33] and a document analysis approach
[14] to describe the breadth of artifacts and nature of
the dark patterns these artifacts included. We first categorized
artifacts using the Brignull [17] taxonomy as an a priori set of
codes. While we were able to organize the majority of artifacts
within this taxonomy, we identified an overlap of strategies
and motivations that could not be articulated within the language
set out by Brignull. In addition, the existing categories
were not directly in parallel, with some types indicating more
specific kinds of content (e.g., advertising, e-commerce) and
other types indicating more general strategies (e.g., intentional
misdirection, bait and switch).
This led us to perform a second analysis of all exemplars in
the corpus using an open coding approach. We drew on our
experience in coding with the Brignull categories as a baseline,
but also foregrounded aspects of interaction quality, context,
and intended user group in the development of emergent and
axial codes. The resulting hierarchy of themes (Figure 1)
subsumes the original Brignull categories, but is consistently
structured around strategies and potential designer motivations
rather than the mixture of strategies and explicit contextual
or content-centric examples in the original taxonomy. This
coding strategy results in a significant movement away from a
description of the final pattern in situ to the motivation that may
have shaped the designer’s use of the pattern. This approach
resulted in five primary categories of designer strategies that
we label as dark patterns.
Nagging: Redirection of expected functionality that persists beyond one or more interactions.
Obstruction: Making a process more difficult than it needs to be, with the intent of dissuading certain action(s). Includes: Brignull "Roach Motel," "Price Comparison Prevention," and Intermediate Currency.
Sneaking: Attempting to hide, disguise, or delay the divulging of information that is relevant to the user. Includes: Brignull "Forced Continuity," "Hidden Costs," "Sneak into Basket," and "Bait and Switch."
Interface Interference: Manipulation of the user interface that privileges certain actions over others. Includes: Hidden Information, Preselection, Aesthetic Manipulation, Toying with Emotion, False Hierarchy, Brignull "Disguised Ad," and "Trick Questions."
Forced Action: Requiring the user to perform a certain action to access (or continue to access) certain functionality. Includes: Social Pyramid, Brignull "Privacy Zuckering," and Gamification.
Figure 1. Summary of dark pattern strategies derived from analysis of our corpus.
Figure 2. Example of nagging behavior on Instagram, where a modal
dialogue provides no opportunity to permanently dismiss the message.
RESULTS
Through a constant comparative analysis [33] of our corpus
of exemplars, we identified five primary dark patterns that appeared
to serve as strategic motivators for designers: nagging,
obstruction, sneaking, interface interference, and forced action.
This categorization builds upon Brignull’s taxonomy, but seeks
to articulate strategic motivators through a rigorous scholarly
process of integrating, refining, and theorizing—consistent
with the development of grounded theory [33]. We will describe
each pattern category below, using exemplars from our
corpus as they were shared in their original context to illustrate
how these strategies appeared in everyday interactions.
Brignull's categories have been nested under our new strategy
categories, and will be referenced in the relevant sections,
distinguishing them from our own new categories by using
Brignull’s categories in quotation marks.
Nagging
We define nagging as a minor redirection of expected functionality
that may persist over one or more interactions. Nagging
often manifests as a repeated intrusion during normal interaction,
where the user's desired task is interrupted one or more
times by other tasks not directly related to the one the user is
focusing on. Nagging behaviors may include pop-ups that obscure
the interface, audio notices that distract the user, or other
actions that obstruct or otherwise redirect the user’s focus.
We found many instances of nagging behavior in our corpus,
at varying levels of potential malice.
the social media app Instagram includes a modal selection
where the user is prompted with a message asking them to
turn on notifications for the app (Figure 2). Only two options
are present, "Not Now" and "OK," which gives the user no
ability to discontinue the prompts. Another, more sinister, interruption
relates to Google's location services. This instance
asks the user to allow Google to collect anonymous location
data at any time, even when apps are not running. There are
two options, "Agree" and "Disagree" to select, and a substantially
smaller "Don't show again" check-box. This example
provides similar options as the previous example, but does so
in a manner that entices users to agree if they want the nagging
to cease. A third example targets drivers for the ride-sharing
app Uber. When a driver attempts to stop for the day, the app
prompts them to continue in order to reach an arbitrary goal
for the day. This interruption takes advantage of a person’s
natural tendency to "income target," or set an income goal for
the day, and makes the driver more likely to continue driving
to reach the arbitrary income goal.
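For illustration, the sketch below shows how little code the Instagram-style prompt requires; the markup and names are our own hypothetical reconstruction, not the product's code. The key property is that neither button records a permanent opt-out.

```typescript
// Illustrative sketch of the nagging prompt described above; markup and
// names are hypothetical, not Instagram's code. Neither button records a
// permanent opt-out, so the prompt can simply reappear on a later visit.
function showNotificationNag(onResult: (enabled: boolean) => void): void {
  const modal = document.createElement("div");
  modal.innerHTML = `
    <p>Turn on notifications?</p>
    <button id="nag-not-now">Not Now</button>
    <button id="nag-ok">OK</button>`;
  document.body.appendChild(modal);

  modal.querySelector("#nag-ok")!.addEventListener("click", () => {
    onResult(true); // opt in
    modal.remove();
  });
  modal.querySelector("#nag-not-now")!.addEventListener("click", () => {
    onResult(false); // nothing is persisted: the nag will return
    modal.remove();
  });
  // A non-dark version would also offer (and persist) "Don't ask again".
}
```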
Obstruction
We define obstruction as impeding a task flow, making an
interaction more difficult than it inherently needs to be with
the intent to dissuade an action. Obstruction often manifests
as a major barrier to a particular task that the user may want
to accomplish. One example of obstruction from our corpus
is the job listing site theladders.com. The site requires an account
to browse jobs and a premium, paid account to apply for
a position, even though many jobs found on the site are also
advertised elsewhere. If the user does not pay for a premium
account and wishes to search for the job elsewhere, they will
discover that text highlighting is disabled through Javascript
on the site. This design choice manifests as a barrier to the
user, encouraging them to pay for a premium membership
by disabling expected functionality.
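The mechanism behind this barrier is trivial to implement; the sketch below is our own reconstruction of the general selection-blocking technique, not the site's actual code.

```typescript
// Our reconstruction of the general selection-blocking technique described
// above; this is not theladders.com's actual code.
// CSS route: suppress text selection for the whole document.
document.body.style.userSelect = "none";
// JavaScript route: cancel selection before it starts, and block copying
// as a fallback where the CSS property is overridden.
document.addEventListener("selectstart", (e) => e.preventDefault());
document.addEventListener("copy", (e) => e.preventDefault());
```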
An additional example originates from the Apple iOS 6 operating
system (Figure 3). In this version of the mobile operating
system, the option for ad tracking is enabled by default and is
hidden in an obscure location in the settings menu. To further
obstruct user behavior, the option to disable ad tracking uses
confusing wording to disorient the user. In recent updates, this
option has been moved to a more relevant location with
straightforward wording.
Figure 3. Example of obstructive behavior limiting access to ad tracking settings on Apple iOS 6.
Brignull’s "Roach Motel" pattern describes a situation that
is easy to get into, but difficult to get out of. This usually
occurs when a user signs up for a service easily, but closing
an account or canceling the service is difficult or impossible.
A typical pattern of interactions requires the user to call a
phone number in order to cancel their account, where they
will be further pressured to maintain their account. In our
corpus, Stamps.com follows this pattern by making it difficult
to close an account on their site, requiring the user to call
during business hours to do so. They also hide this information
on their site in the frequently asked questions section to make
it difficult to cancel the service. This example indicates how
"roach motel" can invoke other types of dark patterns, in this
instance hidden information, to serve its purpose.
Brignull’s "Price Comparison Prevention" pattern seeks to
dampen the effect of market competition by making direct
price comparisons between products and services difficult.
Tactics could include making important product information
(product ID, price) un-copyable (as in theladders.com, above),
so as to prevent users from pasting such information into a
search bar or another site.
Intermediate Currency is another subtype of obstruction where
users spend real money to purchase a virtual currency which
is then spent on a good or service. The goal of this pattern is
to disconnect users from the real dollar value spent in order to
cause the user to interact differently with the virtual currency.
This may result in users spending the currency differently than
they would with fiat currency. This pattern typically manifests
as an in-app purchase for mobile games. One example from
our corpus, the video game Heroes and Generals, has a purchasable
in-game "credit" that can be used to buy upgrades for
the player, but the benefits are obscured by presenting the user
with confusing percentage gains in relation to specific game
mechanics.
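The indirection can be summarized arithmetically, as in the sketch below; the rates and names are invented for illustration, not drawn from any particular game.

```typescript
// Sketch of the intermediate-currency indirection with invented rates.
// The interface surfaces only the first two numbers; the effective dollar
// price of the upgrade is never shown to the user.
const DOLLARS_PER_GEM_PACK = 4.99; // step 1: real money buys a "gem pack"
const GEMS_PER_PACK = 500;
const GEM_PRICE_OF_UPGRADE = 120; // step 2: upgrades are priced only in gems

// step 3: the hidden effective cost, which the interface never surfaces
const effectiveDollarCost =
  (GEM_PRICE_OF_UPGRADE / GEMS_PER_PACK) * DOLLARS_PER_GEM_PACK;
console.log(`Upgrade really costs $${effectiveDollarCost.toFixed(2)}`); // $1.20
```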
Sneaking
We define sneaking as an attempt to hide, disguise, or delay the
divulging of information that has relevance to the user. Sneaking
often occurs in order to make the user perform an action
they may object to if they had knowledge of it. Sneaking
behaviors may include additional undisclosed costs or undesired
effects from a particular action. Many of Brignull's original
dark patterns are of this type, and they were the most commonly
referenced when practitioners discussed dark patterns. One example
of sneaking from Salesforce.com requires the user to consent
to a privacy statement before they can unsubscribe from an
email newsletter (Figure 4). This privacy statement allows
Salesforce to sell the user's information to other countries, and
if the user fails to read the fine print, they would unknowingly
fall victim to this scheme.
Figure 4. Example of sneaking behavior, asking users to authorize transfer of their information in order to unsubscribe from a newsletter.
Brignull’s "Forced Continuity" pattern continues to charge
the user after the service they have purchased expires. This
pattern takes advantage of users’ failure to check up on service
expiration dates, either for a free trial or for a limited-time use
of a paid service, by assuming upon service expiration that the
user either wants to continue the paid service or upgrade to
the paid version of the free trial and charges the user.
Brignull’s "Hidden Costs" pattern provides users with a late
disclosure of certain costs. In this pattern, a certain price is
advertised for a good or service, only to later be changed due
to the addition of taxes and fees, limited time conditions, or
unusually high shipping costs. An example from our corpus
describes subscription prices on the website for the Boston
Globe; while the banner ad suggests a price of 99¢, when the
user clicks through to the sign up page, they are notified that
this is a trial offer for only four weeks.
Brignull’s "Sneak into Basket" pattern adds items not chosen
by the user to their online shopping cart, often claiming to be
a suggestion based on other purchased items. This may cause
the user to unintentionally buy these items if they do not notice
them prior to checkout. One example of this behavior from
SportsDirect.com adds an unwanted item into the cart, which
displays on checkout and can only be removed by exiting the
checkout process and returning to a previous screen.
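A minimal sketch of this mechanism follows; the item and function names are our own hypothetical illustration of the behavior described above.

```typescript
// Sketch of the sneak-into-basket mechanism; the item and names are our own
// hypothetical illustration. `includeExtra` mirrors a pre-checked opt-out
// box on an earlier page, so doing nothing silently adds the extra item.
interface CartItem {
  name: string;
  price: number;
}

function buildCart(chosen: CartItem[], includeExtra = true): CartItem[] {
  const cart = [...chosen];
  if (includeExtra) {
    // Never chosen by the user; they must notice the pre-checked box and
    // uncheck it (or back out of checkout) to keep this out of the cart.
    cart.push({ name: "Magazine subscription", price: 9.99 });
  }
  return cart;
}

console.log(buildCart([{ name: "Running shoes", price: 59.99 }]));
```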
Brignull’s "Bait and Switch" pattern makes it apparent that a
certain action will cause a certain result, only to have it cause a
different, likely undesired result. One example of this behavior
from our corpus includes having a red "X" button perform an
action other than closing a popup window. Another example
relies on manipulation of muscle memory in the mobile game
"Two Dots," where a button to buy more moves is positioned
in the same location where a button to start a new game is
normally positioned, thus increasing the likelihood of being
accidentally triggered by the user.
Interface Interference
We define interface interference as any manipulation of the
user interface that privileges specific actions over others,
thereby confusing the user or limiting discoverability of im-
portant action possibilities (c.f., false or hidden affordances
[32]). Interface interference manifests as numerous individ-
ual visual and interactive deceptions, and is thus our most
involved strategy with three subtypes: hidden information,
preselection, and aesthetic manipulation. Across these three
subtypes, we have identified exemplars in our corpus where
critical information requires additional discovery effort (hid-
den information), instances where atypical user choices are
preselected or otherwise obscured (preselection), or instances
where manipulation of aesthetic characteristics leads to mis-
understanding of hierarchy, content type, or unrealistic sense
of urgency (aesthetic manipulation). Each subtype will be
elaborated in the following sections.
Hidden Information
We define hidden information as options or actions relevant to
the user but not made immediately or readily accessible. Hidden
information may manifest as options or content hidden in
fine print, discolored text, or a product’s terms and conditions
statement. The primary motivator behind hidden information
is the disguising of relevant information as irrelevant. One
example we collected through our corpus generation is from
santander.com. When a user registers for an account, they
are given an option to accept the terms and conditions above
a long list of small text. Hidden within this text is a small
checkbox to opt out of the bank selling the user’s information.
Preselection
We define preselection as any situation where an option is
selected by default prior to user interaction. Preselection usually
manifests as a default choice that the shareholder of the
product wishes the user to choose; however, this choice is
often against the user's interests or may provide unintended
consequences. The user is more likely to agree to the default
option if they believe the product has their best interests in
mind. An example in Figure 5 preselects the option for information
sharing and email subscription, and additionally
hides these options in a drop-down menu. This exemplifies the
hidden information subtype in addition to preselection.
Figure 5. Example of preselection as a type of interface interference, hiding and preselecting a choice that may not be in the user's best interest.
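A minimal sketch of this combined mechanism follows; the markup is our own hypothetical reconstruction of the kind of form shown in Figure 5, not the actual interface code.

```typescript
// Sketch of preselection combined with hidden information; the markup is a
// hypothetical reconstruction, not the actual form. Both consents default
// to checked and sit inside a collapsed <details> element.
const form = document.createElement("form");
form.innerHTML = `
  <label><input type="checkbox" required> I accept the terms and conditions</label>
  <details>
    <summary>Communication preferences</summary>
    <label><input type="checkbox" name="share" checked>
      Share my details with selected partners</label>
    <label><input type="checkbox" name="marketing" checked>
      Email me offers and newsletters</label>
  </details>`;
document.body.appendChild(form);
// Unless the user expands the collapsed section, both preselected
// checkboxes travel with the submitted form untouched.
```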
Aesthetic Manipulation
Aesthetic manipulation is any manipulation of the user interface
that deals more directly with form than function. This
includes design choices that focus the user's attention on one
thing to distract them from or convince them of something
else (e.g., Brignull's "Misdirection"). One example from our
corpus is from the website expertsexchange.com, a medium in
which questions asked can be answered by experts in the field.
When viewing a question, the site makes it appear that the
answer to the question is behind a paywall, but it is actually
accessible at the bottom of the page without paying. We have
also identified four more specific instantiations of aesthetic
manipulation: toying with emotion, false hierarchy, disguised
ad, and trick questions.
Toying with emotion includes any use of language, style, color,
or other similar elements to evoke an emotion in order to
persuade the user into a particular action. This can manifest
as cute or scary images, or as enticing or frightening language.
One example from our corpus is from the Snapchat-owned
company Spectacles. On their site upon checkout, the user is
shown a countdown timer that supposedly ensures delivery in
2-4 weeks if the user buys the product within the time window;
however, if the user lets the timer reach 0, the timer resets.
Another example from the website delish.com prompts the
user to sign up for a newsletter. The option for declining
is the demeaning phrase, "No thanks, I’ll have a microwave
dinner tonight," using copy that evokes emotion and perhaps
encourages the user to change their intended action.
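The Spectacles-style countdown described above reduces to a few lines of code; the sketch below is our own reconstruction of the behavior, not the product's code, and it shows how trivially false urgency can be manufactured.

```typescript
// Our reconstruction of a countdown that manufactures false urgency; not
// Spectacles' actual code. When the timer reaches zero it silently resets,
// so the advertised "deadline" never actually passes.
function startUrgencyTimer(el: HTMLElement, seconds: number): void {
  let remaining = seconds;
  setInterval(() => {
    remaining = remaining > 0 ? remaining - 1 : seconds; // quiet reset at zero
    const m = Math.floor(remaining / 60);
    const s = String(remaining % 60).padStart(2, "0");
    el.textContent = `Order within ${m}:${s} to ensure 2-4 week delivery`;
  }, 1000);
}

startUrgencyTimer(document.getElementById("timer")!, 10 * 60);
```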
False hierarchy gives one or more options visual or interactive
precedence over others, particularly where items should be in
parallel rather than hierarchical. This convinces the user to
make a selection that they feel is either the only option, or the
best option. One example of false hierarchy from our corpus
comes from the TuneUp app (Figure 6). Upon installing utilities
for the app, the user is given a decision between "express"
installation, labeled as "recommended," or "custom" installation,
labeled as "advanced." The "custom" option is in gray
text, but still clickable, giving the user the false impression
that the option is disabled. This functionality is similar to
previous work on false feedforward [46, 73].
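A sketch of this styling trick follows; the labels and styling are our own illustration of the TuneUp example rather than its actual code.

```typescript
// Sketch of false hierarchy; labels and styling are our own illustration of
// the TuneUp example. The "custom" option is styled to look disabled while
// remaining fully functional.
const express = document.createElement("button");
express.textContent = "Express installation (recommended)";

const custom = document.createElement("button");
custom.textContent = "Custom installation (advanced)";
custom.style.color = "#9e9e9e"; // gray text implies "disabled"...
custom.disabled = false; // ...but the control still accepts clicks
custom.addEventListener("click", () => console.log("custom install chosen"));

document.body.append(express, custom);
```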
Brignull’s "Disguised Ad" pattern includes ads disguised as
interactive games, or ads disguised as a download button or
other salient interaction the user is looking for. Some sites use
a form of this pattern whereby a click on any part of the site
loads another page, effectively rendering the entire site an ad.
Figure 6. Example of false hierarchy as a type of interface interference,
encouraging users to think that one of their options is disabled.
Figure 7. Example of forced action on Windows 10.
Brignull’s "Trick Questions" pattern includes a question that
appears to be one thing but is actually another, or uses confusing
wording, double negatives, or otherwise leading language
to manipulate user interactions. One common example of this
tactic is the use of checkboxes to opt out rather than opt in,
often paired with confusing double negatives.
Forced Action
We define forced action as any situation in which users are
required to perform a specific action to access (or continue to
access) specific functionality. This action may manifest as a
required step to complete a process, or may appear disguised
as an option that the user will greatly benefit from. One example
of forced action in our collection is from the Windows 10
operating system (Figure 7). When a system update is available,
the user is unable to shut down or restart their operating
system without updating. This essentially forces an update
upon a user who needs to restart or shut down their computer,
who may not otherwise want to proceed with the update. An
additional exemplar of forced action can be drawn from
salesforce.com, in which the user must agree to allow the site to sell
their information to other countries in order to unsubscribe
from their mailing list. In this case, the user is required to
perform a specific action (i.e., sell their personal information)
in order to access specific functionality (i.e., unsubscribe).
Social Pyramid requires users to recruit other users to use the
service. This is a method commonly used in social media
applications and online games. Users can invite their friends
to use the service and are incentivized with benefits from the
platform. This pattern subsumes Brignull’s "Friend Spam" and
expands the definition to include any social recruiting. One
example from our corpus is the social media game FarmVille.
The game incentivizes users to invite their friends to the game
by making some features or goals inaccessible without online
connections that also play the game.
Brignull’s "Privacy Zuckering" pattern tricks users into sharing
more information about themselves than they intend to or
would agree to. This includes the selling of users' information
to third parties, as provided for in the Terms and Conditions or
Privacy Policies of websites.
Gamification describes situations in which certain aspects of a
service can only be "earned" through repeated (and perhaps
undesired) use of aspects of the service. One common instance
of gamification is "grinding," a term used in many video games
to describe the repeated process of killing monsters to gain
experience points in order to level up the user’s character. One
example from our corpus is from the mobile game Candy
Crush Saga. The game includes levels that are impossible to
complete in order to urge users to buy powerups or extra lives.
If the player doesn’t purchase anything from the game, they
will have to play the game for a longer period of time in order
to achieve the same result they would have from paying.
DISCUSSION
Qualities and Breadth of Potential "Dark"ness
Through our analysis of the corpus of exemplars, we have
outlined a set of strategies that can be used by designers to
undermine end user value in favor of shareholder value. This
subversion can be productively viewed as impacting the user’s
felt experience of interaction across two dimensions: 1) the
user’s interpretation of the system and its communicative po-
tential, including any salient affordances (c.f., Norman’s [55]
"gulf of execution"); and 2) the expected outcome that the user
anticipates as a result of their previous or present interaction
with the system, mediated by their continuing interpretation of
the system (c.f., Norman’s [55] "gulf of evaluation"). Across
all five strategies identified in the corpus, dark patterns are
complicit in undermining both action possibilities and commu-
nicative expectations based on salient interactions. In the cases
of interface interference or obstruction, the user is not able to
understand or fully interrogate action possibilities, essentially
"flying blind." The strategies of nagging and forced action
foreground action possibilities, but only those desired by the
shareholder, resulting in coercive interactions with the system.
In contrast to these strategies, the strategy of sneaking disallows
the identification of action possibilities entirely, removing
user choice and forcing the user to react to the system’s actions
rather than direct their own action.
While not all interactions that take on these strategies are necessarily
equally "dark" in terms of design intent and motivation,
they do have the potential to produce poor user outcomes, or
force users to interact in ways that are out of alignment with
their goals. Paradoxically, some instances of dark patterns test
well from a usability perspective (e.g., forced action, nagging),
but do so at the expense of user choice. Other interactions
may also test well if conducted using shareholder-focused
evaluation criteria. For instance, manipulation of the interface
via interface interference or sneaking strategies may result in
the conversion of users towards shareholder-defined outcomes,
but unless evaluation methods are varied, alternative task flows
that may be obstructed could be difficult to identify. These
perversions of the typical user-centered approach, then, can
cause a number of issues as a user relates to a technological
system: How does a user react when they receive a different
outcome than they expect? Do they blame themselves for not
understanding the system, or do they understand that they are
being explicitly deceived or manipulated? Are there certain
boundaries to manipulative behaviors where such behavior
is excused by users, such as when one is getting an app or
service for free (e.g., ad supported or freemium)?
All of these concerns foreground ethical dilemmas that are difficult
to reconcile or describe given the existing academically-focused
vocabulary of persuasive design or the method-focused
language of value-sensitive design. It is interesting
to note that many of the strategies we identified in our corpus
bear striking resemblance to the persuasive strategies proposed
by Fogg [26]. Persuasive strategies such as tunneling or reduction
have strong similarity to the forced action and obstruction
dark patterns strategies. Similarly, tailoring and suggestion
may result in strategies like interface interference. Finally,
conditioning could include strategies such as sneaking or nagging.
What this indicates is that many persuasive strategies are
already being used for nefarious purposes, which draws our
attention back to the designer’s role in selecting and applying
ethically-grounded strategies in their practice. Our intention is
to provide a vocabulary that allows for additional exploration
into the balancing of value, in the control of the designer, that
is at the core of pragmatist ethics.
Design Responsibility and a Designer’s Character
Given that many persuasive strategies—or even dark patterns—
can be used for good or ill, we must attend to how the selection
of such strategies relates to UX designers' ethical responsibility
as practitioners, and how this exemplifies their design
character [67]. The emergence and use of dark patterns as an
ethical concern in HCI and UX design reveals that design is
rarely a solitary endeavor; in contrast, the complex entanglement
among designer responsibility, organizational pressures,
and neoliberal values often politicizes and prioritizes the profitability
of design above other social motivations (e.g., [6]).
Within this context, we foreground the role of design responsibility
in navigating the ethical challenges of dark patterns.
First, it is important to explore what Nelson and Stolterman
[67] refer to as the guarantor of design—in what ways is the
designer responsible for the success of design interventions,
and to what degree? Gray and Boling [35] extended this line
of argumentation regarding the guarantor of design further,
identifying differences in a near-term guarantee for a known
audience and context versus the societal adaptation and appropriation
of that same artifact or experience over time and
differing contexts. This allows us to question: Where along
this trajectory does a pattern become dark, and with what level
of intentionality? A design decision may have been made with
good intentions for a specific audience, but resulted in manipulative
outcomes when exposed to a broader audience. How
might we assess the "dark" characteristics of the pattern or the
designer employing the pattern? The aphorism Hanlon's Razor
has been commonly shared in the UX design community
as a means of separating malicious intent from lack of design
skill; put simply, this means to "never attribute to malice that
which is adequately explained by stupidity" [2]. This is also
reflected in the practitioner differentiation between dark patterns
(where malice is assumed) and anti-patterns (where lack
of skill is at issue). We do not propose the use of our dark
patterns strategies as an ethical checklist; rather, the strategies
we have identified are intended to deepen awareness of the
tradeoffs that designers always already engage in.
Second, we consider which dimensions of design responsibility
the designer has an obligation to assess, and at what level of
granularity. This has relevance for what user groups might be
susceptible to dark patterns (e.g., vulnerable populations), or
whether a pattern must be intentionally applied in a malicious
manner to be "dark." Are patterns that are consistently misunderstood
by certain user groups in a harmful manner also considered
dark? And is the designer culpable for this "dark"ness, particularly
after they are made aware of any harmful effects?
While traditional, static notions of ethical codes would seek
to draw specific boundaries, the number of variables in play
makes this approach untenable. Rather, we propose a redirection
back to the designer's character, which Nelson and
Stolterman [67] link to core, deeply embedded philosophical
commitments that resonate with a designer's values and guide
their design activities. This approach values a designer's humanness
and her character as central to ethically responsible
design activity. A designer’s character is a reflection of who
they are as a person—not just a performance of design actions
merely because they are "appropriate" or "accepted practice."
Existing codes of ethics do not generally anticipate deception,
and instead tend to conflate designer intent and eventual use
(Albrechtslund’s [5] "positivist problem"); a use of our find-
ings may be generative in probing further into the complexity
of inscription and use of artifacts through the lens of designer
intent—not resolving the question of manipulation, legality, or
other issues, but rather introducing additional vocabulary with
which to describe the ethical complexity of design action.
Third, we contend that a focus on design character and responsibility
represents an opportunity for practitioners to engage
in design leadership—raising and addressing ethical issues in
everyday practice. Designers have a history of raising consciousness
around ethics and refusing to bow to the pressure of
shareholders. This is often linked to a tradition of manifestos
in design, such as the classic "First Things First" manifesto by
Ken Garland in 1964 [31] and the more recent "Copenhagen
Letter" [4]. These design manifestos have repeatedly raised
issues of criticality, calling on designers to be aware of the
value-laden nature of their work and the responsibility they
hold in relation to other stakeholders. We believe the findings
of this study also indicate the power of design action, and the
critical dimensions of that practice that must be continuously
recognized and reconciled through design judgments.
IMPLICATIONS AND FUTURE WORK
UX Pedagogy and Practice
Comprehensive ethics education in HCI and UX education is
critical to ensure that future generations of practitioners take
their role as creators of futures seriously. Other fields such as
engineering have required ethics education as part of the core
curriculum for more than a decade [68], yet a formalization of
ethics education in traditional design contexts is still nascent,
despite repeated calls for such a focus (e.g., [24, 49]). The lack
of formal ethics education in UX and HCI education is complicated
by the lack of disciplinary consensus regarding standard
curricula, responsibilities of professionals, and the dimensions
of the related disciplinary fields in which students will practice
(e.g., [47]). While some professional organizations have
articulated codes of ethics by which professionals choosing
to affiliate with these organizations are loosely bound (e.g.,
Interaction Design Association (IxDA) [3]; User Experience
Professionals Association (UXPA) [1]), there appear to be no
coordinated efforts to build ethical awareness and education
into the fabric of HCI/UX education and practice, particularly
in comparison to the highly-coordinated efforts in other fields
such as engineering.
While there have been recent efforts within the ACM to make
changes to the code of ethics for the organization governing
SIGCHI [76], it is unclear how many UX practitioners are currently
ACM members or recognize the ACM as a governing
body that should prescribe their ethical accountability. Many
other fields that have core ethical commitments use licensure
as a means of controlling who can call themselves a practitioner
(e.g., medicine, architecture); however, this is not currently
the case for UX practitioners. This indicates the importance
of future work that addresses the place of ethical awareness
and responsibility in the CHI community, including both how
ethics is taught and related to practice, and how ethics should
inform, constrain, and generatively stimulate the future of HCI
and UX practice. This requires both educationally-centric
efforts to define and teach future practitioners to be ethically
aware and responsible for their role in the design process, and
practice-led efforts to document the ethical dilemmas that are
already being encountered by practitioners in their everyday
work. Both strands are critical to stimulate a reflexive conversation
regarding what kinds of ethical standards are necessary
and appropriate for the nature of HCI and UX outputs, and
the responsibility of the designer in motivating these outputs
as technologies and systems are appropriated and modified
by users and stakeholders alike as they define the future of
social systems [5, 72]. Additionally, our analysis has the potential
to impact the larger computing profession. While our
current corpus is limited by artifacts shared by UX practitioners,
future work should expand to address instances of
deception in service design and physical computing contexts;
new technologies and output types (e.g., services and non-screen
interactions) also have the potential to include dark
patterns.
Criticality and HCI
There has been substantial conversation in the academic HCI
community about the ethical standpoint of HCI practitioners
in a disciplinary sense, but little of this work has found its
way into everyday practice. This indicates two potential paths
forward: 1) further integration and expansion of existing value-
centered methods in UX practice, as a means of generating
awareness of ethical issues present in everyday design deci-
sions; and 2) additional focus on applied, pragmatist notions
of ethics in the academic HCI community.
Any expansion of value-centered methods must take into ac-
count the needs, constraints, and desires of practitioners. This
may require more sensitivity to the contexts and locations
where ethical concerns might arise, the amount of power prac-
titioners may have to effect change on a strategic level, and
other mediating factors. This foregrounds the question: at
what junctures should practitioners be engaged, and to what
degree are they responsible for the outcomes of their work, par-
ticularly when multiple domain experts are required to make a
specification of work into reality?
While there has been substantial interest in critical dimen-
sions of HCI, often through the lenses of social justice, fem-
inism, and care, these discussions have frequently remained
at a highly abstracted level. What is lacking, perhaps, is a
more explicit discussion of applied ethics in the pragmatist
tradition. This is the mode that van Wynsberghe [69] sug-
gests as a path forward, where ethics become embedded in
the everyday activities of practitioners through the actions of
practitioners-as-everyday-ethicists and professional ethicists
with domain expertise. This is consistent with efforts within
CHI to reframe ethics as inherently situational and dynamic
(e.g., [53]). Additional work is needed to stake out this area
of applied ethics as a persistent issue of concern and interest
for the HCI community, building upon efforts already under-
way (e.g., [75]), and activating critical perspectives offered
by third paradigm thinking in the everyday work of HCI and
UX designers. We suggest that there may be two areas for
additional exploration in this regard. First, a reprioritization of
the activities of practitioners, taking into account not only the
near-term positive economic and social impacts of their work, but
also the long-term social impacts that might be productively
viewed through the lens of care ethics and critical theory. Sec-
ond, a further investment in scholarship of multiple types that
explicitly connects forms of knowledge production that are
valuable (e.g., speculative fiction, critical design) but often
inaccessible for practitioner use in a ready-at-hand manner.
CONCLUSION
In this paper, we have provided an overview of the landscape
of dark patterns from the perspective of UX practitioners,
describing the breadth of these patterns as they currently exist
and the potential uptakes for HCI research and UX practice
in further defining this ethical and value-laden phenomenon.
We have recharacterized existing practitioner-led notions of
dark patterns to reflect the strategies that designers activate
when manipulating the balance of user and shareholder value,
supporting both ethically-focused UX practice and future HCI
scholarship regarding ethics and values.
ACKNOWLEDGMENTS
This research was funded in part by the National Science
Foundation Grant No. #1657310. We would also like to thank
the AC and reviewers for their generous suggestions.
REFERENCES
1. 2013. UXPA Code of Professional Conduct.
https://uxpa.org/resources/uxpa-code-professional-conduct
2. 2017a. Hanlon’s Razor.
https://en.wikipedia.org/wiki/Hanlons_razor
3. 2017b. IxDA Code of Conduct.
http://ixda.org/code-of-conduct/
4. 2017c. The Copenhagen Letter.
https://copenhagenletter.org
5. Anders Albrechtslund. 2007. Ethics and technology design. Ethics and Information Technology 9, 1 (2007), 63–72. DOI:https://doi.org/10.1007/s10676-006-9129-8
6. Ellen Balka. 2006. Inside the Belly of the Beast: The Challenges and Successes of a Reformist Participatory Agenda. In PDC ’06: Proceedings of the ninth conference on Participatory design, Vol. 1. 134–143. DOI:https://doi.org/10.1145/1147261.1147281
7. Jeffrey Bardzell and Shaowen Bardzell. 2013. What is "Critical" about Critical Design? In CHI 2013. 3297–3306.
8. Jeffrey Bardzell and Shaowen Bardzell. 2015. Humanistic HCI. Vol. 8. Morgan & Claypool Publishers. 1–185 pages. DOI:https://doi.org/10.2200/S00664ED1V01Y201508HCI031
9. Jeffrey Bardzell, Shaowen Bardzell, and Erik Stolterman. 2014. Reading Critical Designs: Supporting Reasoned Interpretations of Critical Design. In Proceedings of the 32nd Annual ACM Conference on Human Factors in Computing Systems. ACM, 1951–1960. DOI:https://doi.org/10.1145/2556288.2557137
10. Shaowen Bardzell, Jeffrey Bardzell, Jodi Forlizzi, John
Zimmerman, and John Antanitis. 2012. Critical design
and critical theory. In Proceedings of the Designing
Interactive Systems Conference on - DIS ’12. 288. DOI:
https://doi.org/10.1145/2317956.2318001
11. Steve Benford, Matt Adams, Ju Row Farr, Nick
Tandavanitj, Kirsty Jennings, Chris Greenhalgh, Bob
Anderson, Rachel Jacobs, Mike Golembewski, Marina
Jirotka, Bernd Carsten Stahl, Job Timmermans, and
Gabriella Giannachi. 2015. The Ethical Implications of
HCI’s Turn to the Cultural. ACM Transactions on
Computer-Human Interaction 22, 5 (2015), 1–37. DOI:
https://doi.org/10.1145/2775107
12. Daniel Berdichevsky and Erik Neuenschwander. 1999.
Toward an ethics of persuasive technology. Commun.
ACM 42, 5 (1999), 51–58. DOI:
https://doi.org/10.1145/301353.301410
13. Alan Borning and Michael Muller. 2012. Next steps for
value sensitive design. In Proceedings of the 2012 ACM
annual conference on Human Factors in Computing
Systems - CHI ’12. 1125. DOI:
https://doi.org/10.1145/2207676.2208560
14. Glenn A. Bowen. 2009. Document Analysis as a
Qualitative Research Method. Qualitative Research
Journal 9, 2 (2009), 27–40. DOI:
https://doi.org/10.3316/QRJ0902027
15. N. Bowman. 2014. The ethics of UX research.
http://www.uxbooth.com/articles/ethics-ux-research/
16. Harry Brignull. 2013. Dark Patterns: inside the interfaces designed to trick you.
http://www.theverge.com/2013/8/29/4640308/dark-patterns-inside-the-interfaces-designed-to-trick-you
17. Harry Brignull, Marc Miquel, Jeremy Rosenberg, and
James Offer. 2015. Dark Patterns - User Interfaces
Designed to Trick People. http://darkpatterns.org/
18. Lucas Colusso, Cynthia L. Bennett, Gary Hsieh, and Sean A. Munson. 2017. Translational Resources: Reducing the Gap Between Academic Research and HCI Practice. In DIS ’17: Proceedings of the 2017 Conference on Designing Interactive Systems. ACM Press, New York, NY. DOI:https://doi.org/10.1145/3064663.3064667
19. Janet Davis. 2009. Design methods for ethical persuasive computing. In Proceedings of the 4th International Conference on Persuasive Technology - Persuasive ’09. ACM Press, New York, New York, USA, 1. DOI:https://doi.org/10.1145/1541948.1541957
20. Lynn Dombrowski, Ellie Harmon, and Sarah Fox. 2016.
Social Justice-Oriented Interaction Design. Proceedings
of the 2016 ACM Conference on Designing Interactive
Systems - DIS ’16 (2016), 656–671. DOI:
https://doi.org/10.1145/2901790.2901861
21. Anthony Dunne and Fiona Raby. 2013. Speculative Everything: Design, Fiction, and Social Dreaming. MIT Press, Cambridge, MA.
22. Daniel Fallman. 2011. The new good: Exploring the
potential of philosophy of technology to contribute to
Human-Computer interaction. In Proceedings of the 29th
SIGCHI Conference on Human Factors in Computing
Systems. ACM, 1051–1060. DOI:
https://doi.org/10.1145/1978942.1979099
23. Gabriele Ferri, Jeffrey Bardzell, Shaowen Bardzell, and Stephanie Louraine. 2014. Analyzing critical designs: categories, distinctions, and canons of exemplars. In Proceedings of the 2014 conference on Designing interactive systems. ACM Press, New York, NY, 355–364. DOI:https://doi.org/10.1145/2598510.2598588
24. Alain Findeli. 2001. Rethinking Design Education for the 21st Century: Theoretical, Methodological, and Ethical Discussion. Design Issues 17, 1 (2001), 5–17. DOI:https://doi.org/10.1162/07479360152103796
25. Mary Flanagan and Helen Nissenbaum. 2014. Values at Play in Digital Games. MIT Press, Cambridge, MA.
26. BJ Fogg. 2003. Persuasive Technology: Using Computers to Change What We Think and Do. 1–282 pages. DOI:https://doi.org/10.1016/B978-1-55860-643-2.X5000-8
27. BJ Fogg. 2009. A behavior model for persuasive design.
In Proceedings of the 4th international Conference on
Persuasive Technology. ACM. DOI:
https://doi.org/10.1145/1541948.1542005
28. Christopher Frauenberger, Marjo Rauhala, and Geraldine Fitzpatrick. 2016. In-Action Ethics. Interacting with Computers 29, 2 (2016), 220–236. DOI:https://doi.org/10.1093/iwc/iww024
29. Batya Friedman, Peter Kahn, and Alan Borning. 2002. Value sensitive design: Theory and methods. University of Washington technical report December (2002), 2–12.
30. Batya Friedman and Peter H. Kahn, Jr. 2003. Human Values, Ethics, and Design. In The Human-Computer Interaction Handbook, Julie A. Jacko and Andrew Sears (Eds.). Lawrence Erlbaum Associates, Mahwah, NJ, 1177–1201.
31. Ken Garland. 1964. First Things First.
http://www.designishistory.com/1960/first-things-first/
32. William W. Gaver. 1991. Technology affordances.
Proceedings of the SIGCHI conference on Human factors
in computing systems Reaching through technology - CHI
’91 (1991), 79–84. DOI:
https://doi.org/10.1145/108844.108856
33. Barney G. Glaser and Anselm L. Strauss. 1967. The Discovery of Grounded Theory: Strategies for Qualitative Research. 271 pages.
34. Colin M. Gray. 2016. "It’s More of a Mindset Than a
Method": UX Practitioners’ Conception of Design
Methods. Proceedings of the 2016 CHI Conference on
Human Factors in Computing Systems (2016),
4044–4055. DOI:
https://doi.org/10.1145/2858036.2858410
35. Colin M. Gray and Elizabeth Boling. 2016. Inscribing ethics and values in designs for learning: a problematic. Educational Technology Research and Development 64, 5 (2016), 969–1001. DOI:https://doi.org/10.1007/s11423-016-9478-x
36. Colin M. Gray and Yubo Kou. 2017. UX Practitioners’
Engagement with Intermediate-Level Knowledge. In DIS
’17 Proceedings of the 2017 Conference on Designing
Interactive Systems. 13–17. DOI:
https://doi.org/10.1145/3064857.3079110
37. Colin M. Gray, Erik Stolterman, and Martin A. Siegel.
2014. Reprioritizing the relationship between HCI
research and practice. In Proceedings of the 2014
conference on Designing interactive systems - DIS ’14.
725–734.
DOI:https://doi.org/10.1145/2598510.2598595
38. Saul Greenberg, Sebastian Boring, Jo Vermeulen, and
Jakub Dostal. 2014. Dark patterns in proxemic
interactions: a critical perspective. In Proceedings of the
2014 conference on Designing interactive systems. ACM,
523–532.
39. David Hankerson, Andrea R Marshall, Jennifer Booker,
Houda El Mimouni, Imani Walker, and Jennifer A Rode.
2016. Does Technology Have Race? Proceedings of the
2016 CHI Conference Extended Abstracts on Human
Factors in Computing Systems (2016), 473–486. DOI:
https://doi.org/10.1145/2851581.2892578
40. Steve Harrison, Phoebe Sengers, and Deborah Tatar.
2011. Making epistemological trouble: Third-paradigm
HCI as successor science. Interacting with Computers 23,
5 (2011), 385–392. DOI:
https://doi.org/10.1016/j.intcom.2011.03.005
41. Danny Jeremiah. 2017. Southern Rail has a UX problem.
https://medium.com/@dannyjeremiah/southern-rail-has-a-ux-problem-461c8915f8a9
42. N. Kellingley. Ethics and the user experience—ethics and the individual.
https://www.interaction-design.org/literature/article/ethics-and-the-user-experience-ethics-and-the-individual
43. Andrew Koenig. 1995. Patterns and antipatterns. Journal of Object-Oriented Programming 8, 1 (1995), 46–48.
44. Jes A. Koepfler, Luke Stark, Paul Dourish, Phoebe
Sengers, and Katie Shilton. 2014. Values & design in HCI
education (workshop). In CHI’14 Extended Abstracts on
Human Factors in Computing Systems. ACM, 127–130.
DOI:https://doi.org/10.1145/2559206.2559231
45. Kari Kuutti and Liam J. Bannon. 2014. The turn to
practice in HCI. In Proceedings of the 32nd annual ACM
conference on Human factors in computing systems - CHI
’14. 3543–3552. DOI:
https://doi.org/10.1145/2556288.2557111
46. Benjamin Lafreniere, Parmit K. Chilana, Adam Fourney,
and Michael A. Terry. 2015. These Aren’t the Commands
You’re Looking For: Addressing False Feedforward in
Feature-Rich Software. In UIST ’15: Proceedings of the
28th Annual ACM Symposium on User Interface Software
& Technology. 619–628. DOI:
https://doi.org/10.1145/2807442.2807482
47. Carine Lallemand, Guillaume Gronier, and Vincent
Koenig. 2015. User experience: A concept without
consensus? Exploring practitioners’ perspectives through
an international survey. Computers in Human Behavior
43 (2015), 35–48. DOI:
https://doi.org/10.1016/j.chb.2014.10.048
48. Christopher A. Le Dantec, Erika S. Poole, and Susan P. Wyche. 2009. Values as lived experience: Evolving value sensitive design in support of value discovery. In Proceedings of the 27th international conference on Human factors in computing systems (CHI ’09). ACM Press, New York, NY, 1141–1150. DOI:https://doi.org/10.1145/1518701.1518875
49. Peter Lloyd. 2009. Ethical imagination and design. In
Design Studies, Janet McDonnell and Peter Lloyd (Eds.).
Vol. 30. CRC Press, Boca Raton, FL, 154–168. DOI:
https://doi.org/10.1016/j.destud.2008.12.004
50. Dan Lockton, David Harrison, and Neville Stanton. 2008. Design with Intent: Persuasive Technology in a Wider Context. In Persuasive Technology. Vol. 5033 LNCS. Springer Berlin Heidelberg, Berlin, Heidelberg, 274–278. DOI:https://doi.org/10.1007/978-3-540-68504-3_30
51. Ken McPhail. 2001. The other objective of ethics
education: Re-humanising the accounting profession—A
study of ethics education in law, engineering, medicine
and accountancy. Journal of Business Ethics 34, 3-4
(2001), 279–298.
52. Tara Mullaney and Erik Stolterman. 2014. Why ’design
research practice’ is not design as we know it. Design
Research Society (2014).
http://www.drs2014.org/media/654248/0266-file1.pdf
53. Cosmin Munteanu, Heather Molyneaux, Wendy Moncur, Mario Romero, Susan O’Donnell, and John Vines. 2015. Situational Ethics: Re-thinking Approaches to Formal Ethics Requirements for Human-Computer Interaction. Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems (2015), 105–114. DOI:https://doi.org/10.1145/2702123.2702481
54. Chris Nodder. 2013. Evil by Design: Interaction design
to lead us into temptation. John Wiley & Sons, Inc.,
Indianapolis, IN. 303 pages.
55. Donald A. Norman. 1986. User Centered System Design: New Perspectives on Human-computer Interaction. Erlbaum, Hillsdale, NJ.
56. James Pierce, Phoebe Sengers, Tad Hirsch, Tom Jenkins,
William Gaver, and Carl DiSalvo. 2015. Expanding and
Refining Design and Criticality in HCI. In Proceedings of
the 33rd Annual ACM Conference on Human Factors in
Computing Systems - CHI ’15. ACM, 2083–2092. DOI:
https://doi.org/10.1145/2702123.2702438
57. Johan Redström. 2006. Persuasive Design: Fringes and Foundations. In Persuasive Technology. 112–122. DOI:https://doi.org/10.1007/11755494_17
58. Noam Scheiber. 2017. How Uber Uses Psychological Tricks to Push Its Drivers’ Buttons.
https://www.nytimes.com/interactive/2017/04/02/technology/uber-drivers-psychological-tricks.html?_r=1
59. Dan Schlosser. 2015. LinkedIn Dark Patterns.
https://medium.com/@danrschlosser/linkedin-dark-patterns-3ae726fe1462
60. Phoebe Sengers, Kirsten Boehner, Shay David, and Joseph ’Jofish’ Kaye. 2005. Reflective design. In Proceedings of the 4th decennial conference on Critical computing between sense and sensibility - CC ’05. ACM Press, New York, NY, 49–58. DOI:https://doi.org/10.1145/1094562.1094569
61. Phoebe Sengers, John McCarthy, and Paul Dourish. 2006. Reflective HCI: Articulating an Agenda for Critical Practice. In Extended Abstracts CHI 2006. ACM, 1683–1686. DOI:https://doi.org/10.1145/1125451.1125762
62. Katie Shilton. 2012. Values Levers: Building Ethics Into
Design. Science, Technology & Human Values 38, 3
(2012), 374–397. DOI:
https://doi.org/10.1177/0162243912436985
63. Katie Shilton and Sara Anderson. 2016. Blended, not
bossy: Ethics roles, responsibilities and expertise in
design. Interacting with Computers 29, 1 (2016), 71–79.
DOI:https://doi.org/10.1093/iwc/iww002
64. Katie Shilton, Jes A. Koepfler, and Kenneth R.
Fleischmann. 2014. How to see values in social
computing: methods for studying values dimensions. In
Proceedings of the 17th ACM conference on Computer
supported cooperative work & social computing. ACM,
426–435. DOI:https://doi.org/10.1145/2531602.2531625
65. Natasha Singer. 2016. When Websites Won’t Take No for an Answer.
https://www.nytimes.com/2016/05/15/technology/personaltech/when-websites-wont-take-no-for-an-answer.html?mcubz=0&_r=0
66. Erik Stolterman. 2008. The nature of design practice and implications for interaction design research. International Journal of Design 2, 1 (2008), 55–65.
67. Erik Stolterman and Harold G. Nelson. 2012. The Design Way. MIT Press, Cambridge, MA.
68. Mary E. Sunderland, J. Ahn, C. Carson, and W. Kastenberg. 2013. Making Ethics Explicit: Relocating Ethics to the Core of Engineering Education. In Proceedings of the American Society for Engineering Education (ASEE) Annual Conference. Atlanta, GA, 23.881.1–23.881.11.
69. Aimee van Wynsberghe. 2013. Designing Robots for
Care: Care Centered Value-Sensitive Design. Science and
Engineering Ethics 19, 2 (2013), 407–433. DOI:
https://doi.org/10.1007/s11948-011-9343-6
70. Aimee van Wynsberghe and Scott Robbins. 2014. Ethicist as designer: a pragmatic approach to ethics in the lab. Sci Eng Ethics 20, 4 (2014), 947–961. DOI:https://doi.org/10.1007/s11948-013-9498-4
71. Alyssa Vance. 2016. Dark Patterns by the Boston Globe.
https://rationalconspiracy.com/2016/04/24/dark-patterns-by-the-boston-globe/
72. Peter-Paul Verbeek. 2006. Materializing Morality: Design Ethics and Technological Mediation. Science, Technology & Human Values 31, 3 (2006), 361–380. DOI:https://doi.org/10.1177/0162243905285847
73. Jo Vermeulen and Kris Luyten. 2013. Crossing the bridge over Norman’s gulf of execution: revealing feedforward’s true identity. In CHI’13: ACM Conference on Human Factors in Computing Systems. 1931–1940. DOI:https://doi.org/10.1145/2470654.2466255
74. Vasillis Vlachokyriakos, Clara Crivellaro, Christopher A
Le Dantec, Eric Gordon, Pete Wright, and Patrick Olivier.
2016. Digital Civics: Citizen Empowerment With and
Through Technology. Proceedings of the 2016 CHI
Conference Extended Abstracts on Human Factors in
Computing Systems (2016), 1096–1099. DOI:
https://doi.org/10.1145/2851581.2886436
75. Jenny Waycott, Cosmin Munteanu, Hilary Davis, Anja
Thieme, Wendy Moncur, Roisin McNaney, John Vines,
and Stacy Branham. 2016. Ethical Encounters in
Human-Computer Interaction. In Proceedings of the 2016
CHI Conference Extended Abstracts on Human Factors
in Computing Systems - CHI EA ’16. ACM Press, New
York, New York, USA, 3387–3394. DOI:
https://doi.org/10.1145/2851581.2856498
76. Marty J. Wolf. 2016. The ACM Code of Ethics: A Call to Action. Communications of the ACM 59, 12 (2016), 6. DOI:https://doi.org/10.1145/3012934
This paper started as a response to the "Black Lives Matter" campaign in the USA, and emerged as a critique of race more generally in technology design. This paper provides case studies of how technologies are often less usable by persons of color, and contextualizes this in light of intersectionalist theory. Finally, it discusses how the HCI community can ameliorate the situation, and our obligation to do so in light of the ACM code of ethics.