Business & Information Systems Engineering
The International Journal of WIRTSCHAFTSINFORMATIK
ISSN 2363-7005
Bus Inf Syst Eng, DOI 10.1007/s12599-016-0453-1
CATCHWORD

Digital Nudging

Markus Weinmann · Christoph Schneider · Jan vom Brocke

Received: 21 October 2015 / Accepted: 5 August 2016
© The Author(s) 2016. This article is published with open access at Springerlink.com
Keywords: Nudging · Information systems design · Human–computer interaction · Online choice architecture
1 Digital Nudging – Guiding Judgment and Decision-Making in Digital Choice Environments
Digital nudging is the use of user-interface design elements to guide people's behavior in digital choice environments. Digital choice environments are user interfaces – such as web-based forms and ERP screens – that require people to make judgments or decisions. Humans face choices every day, but the outcome of any choice is influenced not only by rational deliberation about the available options but also by the design of the choice environment in which information is presented, which can exert a subconscious influence on the outcome. In other words, "what is chosen often depends upon how the choice is presented" (Johnson et al. 2012, p. 488), such that the "choice architecture alters people's behavior in a predictable way" (Thaler and Sunstein 2008, p. 6). Even simple modifications of the choice environment in which options are presented can influence people's choices and "nudge" them into behaving in particular ways. In fact, there is no neutral way to present choices. For example, Johnson and Goldstein (2003) showed that simply changing the default option (from opt-in to opt-out) in the context of organ donation nearly doubled the percentage of people who consent to being organ donors.
Many choices are made in online environments. As the design of digital choice environments always (either deliberately or accidentally) influences people's choices, understanding the effects of digital nudges in these environments can help designers lead users to the most desirable choice. For example, the mobile payment app Square nudges people into giving tips by setting the default to "tipping," so that customers must actively select a "no tipping" option if they choose not to give a tip. This simple nudge has raised tip amounts, especially where little or no tipping had been common (Carr 2013). Both examples show that simply changing the default option affects the outcome.
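The mechanics of such a tipping default can be sketched in a few lines of code. This is a hypothetical model, not Square's actual implementation; the function names, the 15 % default, and the integer-cent arithmetic are our assumptions:

```python
def checkout_total(amount_cents, chosen_tip=None, default_tip_percent=15):
    """Return (tip, total) in cents for a payment screen with a tipping default.

    The preselected default applies to every user who does not actively
    change the selection; opting out requires an explicit choice of 0.
    """
    if chosen_tip is None:
        # Passive user: the default option determines the outcome.
        tip = amount_cents * default_tip_percent // 100
    else:
        # Active user: e.g., chosen_tip=0 corresponds to "no tipping".
        tip = chosen_tip
    return tip, amount_cents + tip

# A passive user is nudged into tipping; opting out is still possible.
print(checkout_total(1000))                # (150, 1150)
print(checkout_total(1000, chosen_tip=0))  # (0, 1000)
```

The design choice the sketch illustrates: the outcome for inactive users is fully determined by the default, which is why default-setting is such a powerful nudge.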
2 Relevance
The increasing use of digital technologies in large areas of our private and professional lives means that people frequently make important decisions within digital choice environments. Most, if not all, online interactions – ranging from e-government to e-commerce interactions – require people to make choices.
User interfaces such as Web sites and mobile apps frequently include digital choice environments; likewise, interfaces of organizational information systems such as ERP and CRM systems are digital choice environments
that predefine or influence decisions by how the system organizes and presents workflows.

Accepted after one revision by Prof. Dr. Sinz.

Dr. M. Weinmann (corresponding author) · Prof. Dr. J. vom Brocke, Institute of Information Systems, University of Liechtenstein, Fürst-Franz-Josef-Strasse, 9490 Vaduz, Liechtenstein; e-mail: markus.weinmann@uni.li, jan.vom.brocke@uni.li

Dr. C. Schneider, Department of Information Systems, City University of Hong Kong, 83 Tat Chee Avenue, Kowloon, Hong Kong; e-mail: christoph.schneider@cityu.edu.hk

Since there is no neutral
way to present choices, all decisions related to user-interface design influence users' behavior (Mandel and Johnson 2002; Sunstein 2015), often regardless of the designers' intent. A design that accidentally influences people's choices may lead to unintended consequences; therefore, designers must understand the effects of their designs on users' choices so they can choose whether to implement a design that nudges users deliberately or one that reduces the design's influence on users' choices in order to preserve freedom of choice.
A key consideration when making such design decisions concerns the ethical implications of using nudges. While nudges should be used to help people make better choices (Thaler and Sunstein 2008), this is not always the case in practice. For example, some European low-cost air carriers present choices of non-essential options in a way that nudges customers toward purchasing these options. While these unethical nudges may lead to short-term gains for the company, they may have long-term repercussions in terms of loss of goodwill, negative publicity, or even legal action. Therefore, designers must be aware of the ethical implications of nudges (see Sunstein 2015 for a discussion of nudging ethics).
3 Current Status
Research on nudging has been conducted primarily in offline contexts. Whereas traditional economic theory suggests that human behavior is rational, nudging works because people do not always behave rationally. In particular, research in psychology has demonstrated that, because of their cognitive limitations, people act in boundedly rational ways (Simon 1955), and various heuristics and biases influence their decision-making (Tversky and Kahneman 1974). Heuristics, commonly defined as simple "rules of thumb" (Hutchinson and Gigerenzer 2005, p. 98) that people use to ease their cognitive load in making judgments or decisions, can influence decision-making positively or negatively: They can be helpful in making simple, recurrent decisions by reducing the amount of information to be processed, so that people can focus on differentiating factors (Evans 2006) with less mental effort (Evans 2008). On the other hand, heuristic thinking can result in cognitive biases and introduce systematic errors when making complex judgments or decisions (Tversky and Kahneman 1974) that require effortful thinking (Evans 2006). In such situations, common heuristics – such as the anchoring-and-adjustment heuristic (e.g., using the default values), the availability heuristic (e.g., being influenced by the vividness of events), and the representativeness heuristic (i.e., relying on stereotypes) (Tversky and Kahneman 1974) – affect the evaluation of alternatives, often leading to suboptimal decisions.
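To make the anchoring-and-adjustment heuristic concrete, here is a stylized numerical sketch. The linear form and the 0.5 adjustment factor are illustrative assumptions on our part, not a model from the cited literature:

```python
def anchored_estimate(anchor, true_value, adjustment=0.5):
    """Stylized anchoring-and-adjustment: the judgment starts at the anchor
    (e.g., a default value shown in a form field) and is adjusted only
    partially toward the decision-maker's underlying true value."""
    return anchor + adjustment * (true_value - anchor)

# The same underlying preference (20) yields different judgments
# depending on the anchor the interface happens to display.
print(anchored_estimate(100, 20))  # 60.0 - a high anchor pulls the estimate up
print(anchored_estimate(0, 20))    # 10.0 - a low anchor pulls it down
```

The point of the sketch: because adjustment is incomplete, whatever value the interface displays first systematically biases the final judgment.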
Nudges attempt either to counter or to encourage the use of heuristics by altering the choice environment to change people's behavior. Commonly used nudges include giving incentives, providing feedback or anchors, and setting defaults (Dolan et al. 2012; Johnson et al. 2012; Michie et al. 2013; Thaler and Sunstein 2008; see Table 1).

In various situations, the designers of choice environments (sometimes referred to as "choice architects"; Thaler et al. 2010, p. 1) attempt to influence people's choices. For example, many organizations encourage people to engage in socially responsible behaviors, such as leading a healthy life (e-health; e.g., the Fitbit provides feedback on physical activity), reducing waste or energy consumption (Green IS; e.g., Nest thermostats provide feedback on energy consumption), and planning for retirement (e-finance; e.g., governments set defaults on retirement options). Likewise, many non-governmental organizations attempt to encourage people to donate funds, participate in charitable activities, or vote for particular outcomes. In an e-commerce context, Web sites often use opt-in or opt-out mechanisms to nudge users into signing up for newsletters.
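The aggregate effect of such an opt-in versus opt-out checkbox can be sketched with a toy model. All probabilities below are invented for illustration; the only substantive assumption, consistent with the defaults research cited above, is that a large share of users never touches the checkbox:

```python
def signup_rate(default_checked, p_toggle=0.1, p_would_choose_yes=0.4):
    """Expected newsletter subscription rate for a signup form.

    A fraction p_toggle of users actively interacts with the checkbox and
    then follows their own preference (subscribing with probability
    p_would_choose_yes); everyone else simply keeps the default.
    """
    if default_checked:
        # Opt-out design: non-togglers stay subscribed.
        return (1 - p_toggle) * 1.0 + p_toggle * p_would_choose_yes
    # Opt-in design: non-togglers stay unsubscribed.
    return p_toggle * p_would_choose_yes

# Identical preferences, very different outcomes, driven by the default alone.
print(round(signup_rate(default_checked=True), 2))   # 0.94
print(round(signup_rate(default_checked=False), 2))  # 0.04
```

Under these (assumed) parameters the opt-out form subscribes more than twenty times as many users, mirroring the magnitude of the organ-donation default effect described in Sect. 1.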
4 Applications of Digital Nudging and Future Trends
4.1 Applications
By definition, digital nudging focuses on guiding the behavior of individuals, but the effects of digitally nudging individuals can extend to the organizational or societal level (see Table 2).

While digital nudging, as described in this article, focuses on people's choices in digital choice environments, the concept can be applied beyond this context, as nudges in digital environments are increasingly used to influence real-world behavior. One example is the Fitbit activity monitor, whose digital nudges (e.g., reminding the user to exercise, giving feedback on activity, presenting friends' statistics) nudge people into increasing their activity levels.
4.2 Future Trends
Digital nudging will have a significant impact on future
information systems research and practice, particularly for
design-oriented information systems research. As user
interfaces will always steer people in certain directions
(depending on how information is presented), information
systems designers must understand the behavioral effects
of interface design elements so that digital nudging does
not happen at random and unintended effects do not occur.
Research on digital nudging is likely to evolve into an important area of design science research, as knowledge about the behavioral effects of interface-design decisions on users' behavior will provide valuable guidance for improved interface design. New design theories may evolve that extend knowledge from psychology and behavioral economics to digital choice environments. As research on digital nudging is still in its early stages, clarification of the theoretical mechanisms that underlie digital nudging is needed, as is the development of theoretically grounded design recommendations to inform research on persuasive technology (Fogg 2003), particularly the design of persuasive systems (Oinas-Kukkonen and Harjumaa 2009) such as behavior-change support systems (Oinas-Kukkonen 2010).
Because of the ubiquitous digitalization of our private and professional lives, digital nudging will soon extend to other application areas, as people will use digital devices to make decisions in more situations and sectors, and the devices themselves will diversify in form and function. New devices will emerge with new interaction and interface design elements, such as kinetics, virtual reality, and holograms, and designers will need to understand the potential behavioral effects of these new technologies on people's judgment and decision-making.

We encourage our fellow scholars to engage in research on digital nudging, a fascinating area of information systems research that bears considerable potential for both research and society.
5 Further Reading
We suggest the following books for further reading on offline nudging and the underlying mechanisms: Kahneman (2011) and Thaler and Sunstein (2008).
Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.
Table 1 Selection of nudge principles, descriptions, and examples (based on Thaler et al. 2010)

- Incentive: Making incentives more salient to increase their effectiveness. Example: telephones that are programmed to display the running cost of phone calls.
- Understanding mapping: Mapping information that is difficult to evaluate to familiar evaluation schemes. Example: mapping megapixels to maximum printable size instead of pointing to megapixels when advertising a digital camera.
- Defaults: Preselecting options by setting default options. Example: changing defaults (from opt-in to opt-out) to increase the percentage of people who consent to being organ donors.
- Giving feedback: Providing users with feedback when they are doing well or making mistakes. Example: electronic road signs with smiling or sad faces depending on the vehicle's speed.
- Expecting error: Expecting users to make errors and being as forgiving as possible. Example: requiring people at an ATM to retrieve the card before they receive their money, to help them avoid forgetting the card.
- Structure complex choices: Listing all the attributes of all the alternatives and letting people make trade-offs when necessary. Example: online product configuration systems that make choices simpler by guiding users through the purchase process.
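As one illustration, the "giving feedback" principle from Table 1 reduces to a simple conditional in a digital interface. This is a sketch only; the 50 km/h limit and the face strings are our assumptions:

```python
def sign_face(speed_kmh, limit_kmh=50):
    """Electronic road-sign feedback: a smiling face at or below the
    speed limit, a sad face above it."""
    return ":-)" if speed_kmh <= limit_kmh else ":-("

print(sign_face(45))  # :-)
print(sign_face(62))  # :-(
```

The nudge works through immediate evaluative feedback rather than through sanctions: the interface changes what the driver sees, not what the driver is allowed to do.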
Table 2 Example applications of digital nudging and their effects

- Business process management: structuring complex input screens (organizational)
- E-business and e-commerce: displaying limited room inventory during a hotel-booking process (organizational)
- E-finance and insurance: setting defaults for frequently selected insurance plan options (societal)
- E-government: setting defaults to opt in for organ donation (societal)
- E-health: step counter app that provides feedback on activity levels (societal)
- E-learning: reminders to learners to engage with course content (organizational and/or societal)
- Green IS: smart meters to encourage energy savings (societal)
- Security and privacy: displaying the strength of selected passwords (organizational and/or societal)
- Social media: giving incentives, such as badges, for sharing or other activities (societal)
References
Carr A (2013) How Square Register's UI guilts you into leaving tips. http://www.fastcodesign.com/3022182/innovation-by-design/how-square-registers-ui-guilts-you-into-leaving-tips. Accessed 08 Aug 2016
Dolan P, Hallsworth M, Halpern D, King D, Metcalfe R, Vlaev I (2012) Influencing behaviour: the mindspace way. J Econ Psychol 33(1):264–277
Evans JSBT (2006) The heuristic-analytic theory of reasoning: extension and evaluation. Psychon Bull Rev 13(3):378–395
Evans JSBT (2008) Dual-processing accounts of reasoning, judgment, and social cognition. Annu Rev Psychol 59(1):255–278
Fogg BJ (2003) Persuasive technology: using computers to change what we think and do. Elsevier, Oxford
Hutchinson JMC, Gigerenzer G (2005) Simple heuristics and rules of thumb: where psychologists and behavioural biologists might meet. Behav Process 69(2):97–124
Johnson EJ, Goldstein D (2003) Do defaults save lives? Science 302(5649):1338–1339
Johnson EJ, Shu SB, Dellaert BGC, Fox C, Goldstein DG, Häubl G, Larrick RP, Payne JW, Peters E, Schkade D, Wansink B, Weber EU (2012) Beyond nudges: tools of a choice architecture. Marketing Lett 23(2):487–504
Kahneman D (2011) Thinking, fast and slow. Farrar, Straus and Giroux, New York
Mandel N, Johnson EJ (2002) When web pages influence choice: effects of visual primes on experts and novices. J Consum Res 29(2):235–245
Michie S, Richardson M, Johnston M, Abraham C, Francis J, Hardeman W, Eccles MP, Cane J, Wood CE (2013) The behavior change technique taxonomy (v1) of 93 hierarchically clustered techniques: building an international consensus for the reporting of behavior change interventions. Ann Behav Med 46(1):81–95
Oinas-Kukkonen H (2010) Behavior change support systems: a research model and agenda. In: Ploug T, Hasle P, Oinas-Kukkonen H (eds) Persuasive technology: lecture notes in computer science, vol 6137. Springer, Heidelberg, pp 4–14
Oinas-Kukkonen H, Harjumaa M (2009) Persuasive systems design: key issues, process model, and system features. Commun Assoc Inf Syst 24(28):485–500
Simon HA (1955) A behavioral model of rational choice. Q J Econ 69(1):99–118
Sunstein CR (2015) Nudging and choice architecture: ethical considerations. SSRN Electron J. http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2551264. Accessed 15 Mar 2016
Thaler RH, Sunstein CR (2008) Nudge: improving decisions about health, wealth, and happiness. Yale University Press, New Haven
Thaler RH, Sunstein CR, Balz JP (2010) Choice architecture. SSRN Electron J. http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1583509. Accessed 08 Aug 2016
Tversky A, Kahneman D (1974) Judgment under uncertainty: heuristics and biases. Science 185(4157):1124–1131