Preprint (PDF available)

Adapting the Transtheoretical Model for the Design of Security Interventions


Abstract and Figures

The continued susceptibility of end users to cybersecurity attacks suggests an incomplete understanding of why some people ignore security advice and neglect to use best practices and tools to prevent threats. A more detailed and nuanced approach can help more accurately target security interventions for end users according to their stage of intentional security behavior change. In this paper, we adapt the Transtheoretical Model of Behavior Change for use in a cybersecurity design context. We provide a visual diagram of our model as adapted from public health and cybersecurity literature. We then contribute advice for designers' use of our model in the context of human-computer interaction and the specific domain of usable privacy and security, such as for encouraging timely software updates, voluntary use of two-factor authentication and attention to password hygiene.
In this work, we adapt the Transtheoretical Model of Behavior Change for use in a cybersecurity design context. We provide a visual diagram of each stage and the associated causative concept (right, top) as adapted from public health and security literature. We then contribute advice for designers' use of our model (right, bottom) in the context of human-computer interaction and the specific domain of usable privacy and security, such as for encouraging timely software updates, voluntary use of two-factor authentication and attention to password hygiene.
Adapting the Transtheoretical Model
for the Design of Security Interventions
Cori Faklaris, Laura Dabbish and Jason Hong
Carnegie Mellon University, Pittsburgh, PA 15213, USA
The Unified Theory of Acceptance and Use of Technology (UTAUT) synthesizes the ideas of Davis et al. and Venkatesh et al. into one model of how users take action inside a computer system. In this view, situational and social factors, moderated by individual factors, precede the individual's intention to use and actual use of the system.
Security Sensitivity is defined by Das as "the awareness of, motivation to use, and knowledge of how to use security tools". This is based on prior findings that many people believe themselves in no danger of falling victim to a security breach and are unaware of the existence of tools to protect them against those threats; they perceive the inconvenience and cost to their time and attention as outweighing the harm of experiencing a security breach, and they think the tools are too difficult to use or lack the knowledge to use them effectively.

Stage | Evidence | Goal/Task | Effective Interventions | Examples | Successful …

Stage 1: Precontemplation
- Evidence: "I don't need to use / I don't have time to use …"
- Goal/Task: … interest in …
- Effective Interventions: Education, Reading, Storytelling, Media campaigns, Empathy
- Examples: Password strength indicator; CyberQuiz-type materials; social media articles
- Successful: "It may be a good idea to use security …"

Stage 2: Contemplation
- Evidence: "I worry I don't use / I may want to use security …"
- Goal/Task: … users and …
- Effective Interventions: Role playing; Imagery, Value …
- Examples: IT workshops; "Choose your own adventure" game
- Successful: "I will regret it if I do not start using security …"

Stage 3: Preparation
- Evidence: "I want / I need to change my …"
- Goal/Task: … users to put … into action
- Effective Interventions: … procedures and policies, Advocacy for marginalized …; Resolutions + Public Testimonies; Providing choices among 2-3 …
- Examples: "Magic link" alternative to password (Slack); signing security change contracts
- Successful: "I feel better for committing to my chosen …"

Stage 4: Action
- Evidence: "I intend to use / I know why to use …"
- Effective Interventions: Creating action …; … of acts, Punishments and Group recognitions; Rapport building, Coaching and Buddy …; Self-help groups, substitute behaviors; Environmental Re-…; Controlling stimuli to avoid harmful or inadvisable actions
- Examples: Thumprint (Das et al.); Facebook Trusted Contacts; chatbot to praise secure behaviors and offer tips; inputting a prank phrase on a peer's unlocked laptop; interface re-design to direct or nudge behaviors
- Successful: "I ask for help with using / I get help with using / I am successful with / I keep improving my security …"

Stage 5: Maintenance
- Evidence: "I am already … / I value security"
- Goal/Task: … and solidifying …
In our Cyclical Model of Security Behavior Change, the factors of Awareness, Motivation, Knowledge, Resistance, Reinforcement and Denial cause users to move through Stages of Change as they weigh pros and cons comprised of Situational and Social Factors (Performance Expectancy, Effort Expectancy, Social Influence and Facilitating Conditions), along with Self-Efficacy and Temptation. Other Individual (Gender, Age, Experience and Voluntariness of Use) and External Factors (such as Regulations) impact the model.
Prochaska and DiClemente's Transtheoretical Model of Behavior Change has been identified in the literature as a useful framework for privacy and security research. The TTM marks a shift from thinking of behavior change as occurring in a single, decisive moment to that of a longer-term, cyclical process in which people balance pros and cons along with Self-Efficacy and Temptation to make decisions and move through identifiable stages: Precontemplation, Contemplation, Preparation (also called Determination), Action, Relapse, and longer-term Maintenance of desired behaviors.
... This avoids a "one size fits all" approach and produces a classification scheme that can be used to design and direct an intervention to those who are most likely to benefit from it. Researchers have created many models of behavior change, such as the Theory of Reasoned Action/Theory of Planned Behavior [2,74,132], the Technology Acceptance Model [46,47,197], the Transtheoretical Model [52,69,157,196], and Diffusion of Innovations [19,167,168]. However, no one has yet established or validated such a model for end-user cybersecurity, nor one that accounts for social influences by stage. ...
... The Transtheoretical Model [69,157] (Figure 7) incorporates insights from a variety of other models and theories, starting with Decisional Balance Theory. It proposes a cyclical process of precontemplation, contemplation, determination (sometimes called preparation), action, and either maintenance or relapse (called termination if it is final). ...
... In cybersecurity and in privacy, Sano et al. [175,176], Faklaris et al. [69], and Ting et al. [190] have explored applying the Stages of Change and Processes of Change to end user studies. These researchers identified a theoretical and/or empirical basis for classifying computer users by whether they are in either precontemplation (Stage 1), contemplation/preparation (Stages 2-3), or action/maintenance (Stages 4-5) of adopting practices such as updating their operating systems, checking for https in URLs, and using antivirus software. ...
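The coarse three-way grouping described above can be sketched as a simple decision rule. The function name and the two yes/no survey items below are illustrative assumptions of ours, not the instruments used in the cited studies:

```python
from enum import Enum

class StageGroup(Enum):
    PRECONTEMPLATION = "Stage 1"
    CONTEMPLATION_PREPARATION = "Stages 2-3"
    ACTION_MAINTENANCE = "Stages 4-5"

def classify_stage_group(performs_practice: bool,
                         intends_to_start: bool) -> StageGroup:
    """Map two self-report answers to a coarse Stage of Change group.

    Users already performing the practice (e.g., running antivirus
    software) fall into action/maintenance; those who merely intend
    to start are in contemplation/preparation; everyone else is a
    precontemplator.
    """
    if performs_practice:
        return StageGroup.ACTION_MAINTENANCE
    if intends_to_start:
        return StageGroup.CONTEMPLATION_PREPARATION
    return StageGroup.PRECONTEMPLATION
```

A real assessment would derive these two booleans from validated multi-item scales rather than single questions.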
Full-text available
My research looks at how to apply insights from social psychology, marketing, and public health to reduce the costs of cybercrime and improve adoption of security practices. The central problem that I am addressing is the widespread lack of understanding of cyber-risks. While many solutions exist (such as using password managers), people often are not fully aware of what these tools do, or do not use them regularly. To address the problem, we should look to insights from social psychology, marketing, and public health: behavior change unfolds as a process in time and is influenced at each stage by relevant contacts, and interventions are more successful when grounded in appropriate theory. Other researchers have developed models to describe behaviors such as reasoned action, technology acceptance, health/wellness adoption, and innovation diffusion. But we lack a model that is specifically developed for end-user cybersecurity and that accounts for social influences and for non-adoption. In my thesis, I used an exploratory sequential mixed-methods approach to specify such a preliminary model, comprised of six steps of adoption, their step-associated social influences, and each step's obstacles to moving forward. To this end, I conducted two phases of research. In Phase 1, a remote interview study (N=17), I gathered data to synthesize a common narrative of how people adopt security practices. In Phase 2, an online survey study (N=859), I validated the Phase 1 insights with a U.S. Census-matched panel of adults aged 18 and older. I documented the distribution of the steps of adoption for password managers (either built-in or separately installed), and which factors were significantly associated with each step. I then integrated these findings and triangulated them with prior research on the influences of threat awareness, social proof, advice-seeking, and caretaking roles in people's security behaviors.
The results are a data-driven diagram and description of the six steps of cybersecurity adoption and a survey-item algorithm for classifying people by adoption step. These steps are 0: No Learning or Threat Awareness, 1: Threat Awareness, 2: Security Learning, 3: Security Practice Implementation, 4: Security Practice Maintenance, and “X”: Security Practice Rejection. My Step Classifications exhibit reliability and convergent validity, showing an expected significant variance by steps on mean scores for adapted Transtheoretical Model scales (p<.001). The trialability of password managers and the availability of troubleshooting help were significantly positively associated with adoption of password managers (Step 3 and Step 4, p<.001), and the lack of troubleshooting help was significantly positively associated with rejection of password managers (Step X, p<.001). Other authority influences (mandates, adoption leadership, caretaking) and peer/media influences (advice on password managers, exposure to news of others’ security breach experiences) also were significantly associated with adoption decisions. My thesis helps move the field of usable security away from “one size fits all” strategies by providing a theoretical basis and a method for segmenting the target audience for security interventions and directing resources to those segments most likely to benefit. It establishes an agenda for future experiments to validate whether specific step-matched interventions influence adoption and are more likely to lead to long-term change. It contributes to the literature on Diffusion of Innovations and extends other established theoretical models, such as Protection Motivation Theory, the Technology Acceptance Model, and the Transtheoretical Model. Finally, it suggests specific design interventions for boosting security adoption.
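A minimal sketch of how such a survey-item classification algorithm might look follows. The yes/no response keys and the precedence order are our own illustrative assumptions, not the validated instrument from the thesis; only the step names follow the abstract:

```python
def classify_adoption_step(responses: dict) -> str:
    """Assign one of the six adoption steps from yes/no survey answers.

    Later steps take precedence, and rejection (Step X) overrides all;
    missing keys default to "no". The keys here are hypothetical.
    """
    if responses.get("rejected_practice"):
        return "Step X: Security Practice Rejection"
    if responses.get("maintains_practice"):
        return "Step 4: Security Practice Maintenance"
    if responses.get("implements_practice"):
        return "Step 3: Security Practice Implementation"
    if responses.get("learned_about_practice"):
        return "Step 2: Security Learning"
    if responses.get("aware_of_threat"):
        return "Step 1: Threat Awareness"
    return "Step 0: No Learning or Threat Awareness"
```

For example, a respondent who is aware of password-theft threats but has not yet learned about password managers would land in Step 1.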
... To this end, we apply Prochaska and DiClemente's Transtheoretical Model of Behavior Change [44,63]. This model, and its associated Stages of Change, has already been identified as a useful framework for privacy and security research [8,10,25,30,55]. ...
... In security and privacy, Sano et al. [49,50], Faklaris et al. [30], and Ting et al. [55] have explored applying the Stages of Change and Processes of Change to end user studies. These researchers identified a theoretical and/or empirical basis for classifying computer users by whether they are in either precontemplation (Stage 1), contemplation/preparation (Stages 2-3), or action/maintenance (Stages 4-5) of adopting practices such as updating their operating systems, checking for https in URLs, and using antivirus software. ...
... Model [29,30,55] to be a desirable framework for creating and assessing usability interventions. ...
Full-text available
Behavior change ideas from health psychology can also help boost end user compliance with security recommendations, such as adopting two-factor authentication (2FA). Our research adapts the Transtheoretical Model Stages of Change from health and wellness research to a cybersecurity context. We first create and validate an assessment to identify workers on Amazon Mechanical Turk who have not enabled 2FA for their accounts as being in Stage 1 (no intention to adopt 2FA) or Stages 2-3 (some intention to adopt 2FA). We randomly assigned participants to receive an informational intervention with varied content (highlighting process, norms, or both) or not. After three days, we again surveyed workers for Stage of Amazon 2FA adoption. We found that those in the intervention group showed more progress toward action/maintenance (Stages 4-5) than those in the control group, and those who received content highlighting the process of enabling 2FA were significantly more likely to progress toward 2FA adoption. Our work contributes support for applying a Stages of Change Model in usable security.
... The Transtheoretical Model [12,25] proposes a cyclical process of precontemplation, contemplation/preparation, ...
... Further, work to create the SA-13 security-attitude inventory [3] has identified four factors (Resistance, Concernedness, Attentiveness, and Engagement) that weigh in a person's cybersecurity decisional balance [19]. These findings echo stage models in public health [15] such as the Transtheoretical Model [9,12,25,34] and the Precaution Adoption Process Model [37,38], each successful in promoting measures such as smoking cessation and home radon tests. Now, we seek to integrate these findings with other proven models of behavior adoption, such as Diffusion of Innovations [27] concepts of successful tech characteristics, and with empirical data from end users to provide insights specific to cybersecurity. ...
Conference Paper
Full-text available
Our research focuses on understanding how attitudes and social influences act on end users in the process of cybersecurity behavior adoption (or non-adoption). This work draws on five expectancy-value models and on four stage models that have been applied successfully in social psychology, marketing, and public health. In this talk, we will first give an overview of these models. We then will present the progress of our empirical mixed-methods research to craft a model specific to cybersecurity adoption that identifies the relevant (1) attitudes and (2) social influences acting at each step, along with (3) tech characteristics that are associated with sustained adoption. We will conclude with remarks on how our work can be of use to cybersecurity teams tasked with boosting awareness and/or adoption.
... Cybersecurity psychology is akin to health psychology in that, in the short-term, people often act against their long-term or collective interest by neglecting to take precautions or avoid risky behaviors [29,70]. From the user's point of view, this is not necessarily irrational, as they may have needs that conflict with security recommendations and policies [1]; also, people's decisionmaking styles vary [65], which impacts their intentions and behaviors in both health care [31,50] and cybersecurity [27,54]. ...
Full-text available
We present SA-13, the 13-item Security Attitude inventory. We develop and validate this assessment of cybersecurity attitudes by conducting an exploratory factor analysis, confirmatory factor analysis, and other tests with data from a U.S. Census-weighted Qualtrics panel (N=209). Beyond a core six indicators of Engagement with Security Measures (SA-Engagement, three items) and Attentiveness to Security Measures (SA-Attentiveness, three items), our SA-13 inventory adds indicators of Resistance to Security Measures (SA-Resistance, four items) and Concernedness with Improving Compliance (SA-Concernedness, three items). SA-13 and the subscales exhibit desirable psychometric qualities; and higher scores on SA-13 and on the SA-Engagement and SA-Attentiveness subscales are associated with higher scores for security behavior intention and for self-reported recent security behaviors. SA-13 and the subscales are useful for researchers and security awareness teams who need a lightweight survey measure of user security attitudes. The composite score of the 13 indicators provides a compact measurement of cybersecurity decisional balance.
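The subscale structure above can be sketched as a small scoring routine. The item counts (3 + 3 + 4 + 3 = 13) follow the abstract, but the 1-5 rating range and the straight summing with no reverse-coding are simplifying assumptions for illustration, not the published scoring procedure:

```python
def score_sa13(engagement, attentiveness, resistance, concernedness):
    """Compute SA-13 subscale scores and the 13-item composite.

    Each argument is a list of per-item Likert ratings (assumed 1-5).
    Subscale scores are item sums; the composite sums all 13 items.
    """
    expected = {"engagement": 3, "attentiveness": 3,
                "resistance": 4, "concernedness": 3}
    subscales = {"engagement": engagement,
                 "attentiveness": attentiveness,
                 "resistance": resistance,
                 "concernedness": concernedness}
    for name, items in subscales.items():
        if len(items) != expected[name]:
            raise ValueError(f"{name} expects {expected[name]} items")
        if not all(1 <= r <= 5 for r in items):
            raise ValueError("ratings assumed on a 1-5 scale")
    scores = {name: sum(items) for name, items in subscales.items()}
    scores["sa13_composite"] = sum(scores.values())
    return scores
```

In practice, whether the Resistance items are reverse-coded before entering the composite should be checked against the published inventory.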
Full-text available
The Community Reinforcement Approach (CRA), originally developed for individuals with alcohol use disorders, has been successfully employed to treat a variety of substance use disorders for more than 35 years. Based on operant conditioning, CRA helps people rearrange their lifestyles so that healthy, drug-free living becomes rewarding and thereby competes with alcohol and drug use. Consequently, practitioners encourage clients to become progressively involved in alternative non-substance-related pleasant social activities, and to work on enhancing the enjoyment they receive within the "community" of their family and job. Additionally, in the past 10-15 years, researchers have obtained scientific evidence for two off-shoots of CRA that are based on the same operant mechanism. The first variant is Adolescent Community Reinforcement Approach (A-CRA), which targets adolescents with substance use problems and their caregivers. The second approach, Community Reinforcement and Family Training (CRAFT), works through family members to engage treatment-refusing individuals into treatment. An overview of these treatments and their scientific backing is presented.
Full-text available
The present research develops and tests a theoretical extension of the Technology Acceptance Model (TAM) that explains perceived usefulness and usage intentions in terms of social influence and cognitive instrumental processes. The extended model, referred to as TAM2, was tested using longitudinal data collected regarding four different systems at four organizations (N = 156), two involving voluntary usage and two involving mandatory usage. Model constructs were measured at three points in time at each organization: preimplementation, one month postimplementation, and three months postimplementation. The extended model was strongly supported for all four organizations at all three points of measurement, accounting for 40%--60% of the variance in usefulness perceptions and 34%--52% of the variance in usage intentions. Both social influence processes (subjective norm, voluntariness, and image) and cognitive instrumental processes (job relevance, output quality, result demonstrability, and perceived ease of use) significantly influenced user acceptance. These findings advance theory and contribute to the foundation for future research aimed at improving our understanding of user adoption behavior.
Full-text available
Computer systems cannot improve organizational performance if they aren't used. Unfortunately, resistance to end-user systems by managers and professionals is a widespread problem. To better predict, explain, and increase user acceptance, we need to better understand why people accept or reject computers. This research addresses the ability to predict people's computer acceptance from a measure of their intentions, and the ability to explain their intentions in terms of their attitudes, subjective norms, perceived usefulness, perceived ease of use, and related variables. In a longitudinal study of 107 users, intentions to use a specific system, measured after a one-hour introduction to the system, were correlated 0.35 with system use 14 weeks later. The intention-usage correlation was 0.63 at the end of this time period. Perceived usefulness strongly influenced people's intentions, explaining more than half of the variance in intentions at the end of 14 weeks. Perceived ease of use had a small but significant effect on intentions as well, although this effect subsided over time. Attitudes only partially mediated the effects of these beliefs on intentions. Subjective norms had no effect on intentions. These results suggest the possibility of simple but powerful models of the determinants of user acceptance, with practical value for evaluating systems and guiding managerial interventions aimed at reducing the problem of underutilized computer technology.
Conference Paper
End users’ behaviors that compromise security and privacy put both individuals and organizations at risk. In this study, we focus on online self-disclosure and propose a staged model that describes how individuals can move towards reduced online self-disclosure. We recognize that information privacy and security behavior is a dynamic process that unfolds over time, making stage theories applicable. Specifically, we apply the transtheoretical model (TTM) of behavior change, which has been widely used in the public health domain to change high risk behaviors, such as smoking as well as behaviors related to physical security, such as seat belt use. We also consider the mechanisms of changing behavior in the context of online self-disclosure and propose various hypotheses about the dynamic impacts of the mechanisms based on the stage an individual is in. Our proposed model can be used to develop stage-based interventions to promote safer behaviors regarding online self-disclosure.
While behavior change methods have become relatively commonplace in the health domain, they have only recently been applied to the cybersecurity field. In this chapter we review two fundamentally different approaches to behavior change in cybersecurity. First we explore “nudging” and behavioral interventions arising from the MINDSPACE framework. Second we explore the more theoretically based Protection Motivation Theory (PMT) as a framework for introducing behavior change. Finally we consider the relationship between these two approaches.
Conference Paper
Social influence is key in technology adoption, but its role in security-feature adoption is unique and remains unclear. Here, we analyzed how three Facebook security features (Login Approvals, Login Notifications, and Trusted Contacts) diffused through the social networks of 1.5 million people. Our results suggest that social influence affects one's likelihood to adopt a security feature, but its effect varies based on the observability of the feature, the current feature adoption rate among a potential adopter's friends, and the number of distinct social circles from which those feature-adopting friends originate. Curiously, there may be a threshold higher than which having more security-feature-adopting friends predicts higher adoption likelihood, but below which having more feature-adopting friends predicts lower adoption likelihood. Furthermore, the magnitude of this threshold is modulated by the attributes of a feature: features that are more noticeable (Login Approvals, Trusted Contacts) have lower thresholds.
One of the largest outstanding problems in computer security is the need for higher awareness and use of available security tools. One promising but largely unexplored approach is to use social proof: by showing people that their friends use security features, they may be more inclined to explore those features, too. To explore the efficacy of this approach, we showed 50,000 people who use Facebook one of 8 security announcements - 7 variations of social proof and 1 non-social control - to increase the exploration and adoption of three security features: Login Notifications, Login Approvals, and Trusted Contacts. Our results indicated that simply showing people the number of their friends that used security features was most effective, and drove 37% more viewers to explore the promoted security features compared to the non-social announcement (thus, raising awareness). In turn, as social announcements drove more people to explore security features, more people who saw social announcements adopted those features, too. However, among those who explored the promoted features, there was no difference in the adoption rate of those who viewed a social versus a non-social announcement. In a follow-up survey, we confirmed that the social announcements raised viewers' awareness of available security features.
Getting an innovation adopted is difficult; a common problem is increasing the rate of its diffusion. Diffusion is the communication of an innovation through certain channels over time among members of a social system. It is a communication whose messages are concerned with new ideas; it is a process where participants create and share information to achieve a mutual understanding. Initial chapters of the book discuss the history of diffusion research, some major criticisms of diffusion research, and the meta-research procedures used in the book. This text is the third edition of this well-respected work. The first edition was published in 1962, and the fifth edition in 2003. The book's theoretical framework relies on the concepts of information and uncertainty. Uncertainty is the degree to which alternatives are perceived with respect to an event and the relative probabilities of these alternatives; uncertainty implies a lack of predictability and motivates an individual to seek information. A technological innovation embodies information, thus reducing uncertainty. Information affects uncertainty in a situation where a choice exists among alternatives; information about a technological innovation can be software information or innovation-evaluation information. An innovation is an idea, practice, or object that is perceived as new by an individual or an other unit of adoption; innovation presents an individual or organization with a new alternative(s) or new means of solving problems. Whether new alternatives are superior is not precisely known by problem solvers. Thus people seek new information. Information about new ideas is exchanged through a process of convergence involving interpersonal networks. Thus, diffusion of innovations is a social process that communicates perceived information about a new idea; it produces an alteration in the structure and function of a social system, producing social consequences. 
Diffusion has four elements: (1) an innovation that is perceived as new, (2) communication channels, (3) time, and (4) a social system (members jointly solving to accomplish a common goal). Diffusion systems can be centralized or decentralized. The innovation-development process has five steps passing from recognition of a need, through R&D, commercialization, diffusion and adoption, to consequences. Time enters the diffusion process in three ways: (1) innovation-decision process, (2) innovativeness, and (3) rate of the innovation's adoption. The innovation-decision process is an information-seeking and information-processing activity that motivates an individual to reduce uncertainty about the (dis)advantages of the innovation. There are five steps in the process: (1) knowledge for an adoption/rejection/implementation decision; (2) persuasion to form an attitude, (3) decision, (4) implementation, and (5) confirmation (reinforcement or rejection). Innovations can also be re-invented (changed or modified) by the user. The innovation-decision period is the time required to pass through the innovation-decision process. Rates of adoption of an innovation depend on (and can be predicted by) how its characteristics are perceived in terms of relative advantage, compatibility, complexity, trialability, and observability. The diffusion effect is the increasing, cumulative pressure from interpersonal networks to adopt (or reject) an innovation. Overadoption is an innovation's adoption when experts suggest its rejection. Diffusion networks convey innovation-evaluation information to decrease uncertainty about an idea's use. The heart of the diffusion process is the modeling and imitation by potential adopters of their network partners who have adopted already. Change agents influence innovation decisions in a direction deemed desirable. Opinion leadership is the degree individuals influence others' attitudes
Efficacy expectations are postulated to mediate all behavior change. This study examined the construct of self-efficacy in the self-change of smoking behavior. A 31-item measure of self-efficacy was used that included ratings of both temptation (cue strength) and confidence (efficacy). The subjects were 957 volunteers representing five stages of self change: (1) immotives, (2) contemplators, (3) recent quitters, (4) long-term quitters, and (5) relapsers. Subjects were assessed initially and at a 3- to 5-month follow-up. The self-efficacy scale proved to be an extremely reliable and coherent instrument with identifiable but not clearly interpretable subcomponents. Groups of subjects demonstrated significant differences in total self-efficacy scores. Efficacy expectations demonstrated small but significant relationships with smoking history variables and the pros and cons of smoking, but not with demographic, life stress, or persistence measures. Subject's efficacy evaluations at the initial assessment were related to changes in status for recent quitters and contemplators at the follow-up. The relationship between temptation and efficacy ratings is complex and varies for subjects in the various stages of change. Correlations between total self-efficacy and temptation scores were largest for contemplators (r = –.65) and relapsers (r = –.67) and smallest for the recent quitters (r = –.18). Finally, the magnitude of the difference between temptation and efficacy increased with length of abstinence for subjects in maintenance.
Governments, enterprises and consumers face a myriad of computer threats that are technically advanced and persistent. Commonly available cyber defenses such as firewalls, antivirus software, and automatic updates for security patches help reduce the risk from threats but they are not enough, especially since many consumers do not always follow the guidance provided and/or engage in other unsafe actions (e.g., downloading executable programs from unknown sources). Those with infected computers are not simply risking their own valuable information and data; they are putting others at risk too. This paper will look at addressing online security issues using a model similar to the one society uses to address human illness.