Conference Paper

Designing Personalized OS Update Message based on Security Behavior Stage Model

... In security and privacy, Sano et al. [49,50], Faklaris et al. [30], and Ting et al. [55] have explored applying the Stages of Change and Processes of Change to end-user studies. These researchers identified a theoretical and/or empirical basis for classifying computer users by whether they are in precontemplation (Stage 1), contemplation/preparation (Stages 2-3), or action/maintenance (Stages 4-5) of adopting practices such as updating their operating systems, checking for https in URLs, and using antivirus software. Sano et al. [49,50] tested messaging strategies by stage, for example, finding that a message emphasizing the ease of the OS update was significantly associated with users in the preparation stage answering "I update OS now" to a survey item. ...
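As a rough illustration of this three-way grouping, the sketch below maps two self-report answers about a practice such as OS updating onto the coarse stage groups; the question logic and names are illustrative assumptions, not taken from the cited studies.

```python
# Hypothetical sketch (not from the cited studies): map two self-report
# answers about a practice such as OS updating onto the three coarse
# stage groups mentioned above.
from enum import Enum

class StageGroup(Enum):
    PRECONTEMPLATION = "Stage 1"               # no intention to adopt
    CONTEMPLATION_PREPARATION = "Stages 2-3"   # intends to adopt, not yet acting
    ACTION_MAINTENANCE = "Stages 4-5"          # already performs the practice

def classify_stage(performs_behavior: bool, intends_to_adopt: bool) -> StageGroup:
    """Classify a user from two yes/no self-report answers (illustrative)."""
    if performs_behavior:
        return StageGroup.ACTION_MAINTENANCE
    if intends_to_adopt:
        return StageGroup.CONTEMPLATION_PREPARATION
    return StageGroup.PRECONTEMPLATION

# A user who does not yet update their OS but intends to start:
print(classify_stage(performs_behavior=False, intends_to_adopt=True))
# StageGroup.CONTEMPLATION_PREPARATION
```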
Preprint
Full-text available
Behavior change ideas from health psychology can also help boost end user compliance with security recommendations, such as adopting two-factor authentication (2FA). Our research adapts the Transtheoretical Model Stages of Change from health and wellness research to a cybersecurity context. We first create and validate an assessment to identify workers on Amazon Mechanical Turk who have not enabled 2FA for their accounts as being in Stage 1 (no intention to adopt 2FA) or Stages 2-3 (some intention to adopt 2FA). We randomly assigned participants to receive an informational intervention with varied content (highlighting process, norms, or both) or not. After three days, we again surveyed workers for Stage of Amazon 2FA adoption. We found that those in the intervention group showed more progress toward action/maintenance (Stages 4-5) than those in the control group, and those who received content highlighting the process of enabling 2FA were significantly more likely to progress toward 2FA adoption. Our work contributes support for applying a Stages of Change Model in usable security.
Conference Paper
To improve the rate at which users take security action, it is important to promote personalized approaches for each user. Related work indicates that the phrases and UIs of dialog messages, as well as incentives, influence a user's action. In our previous work, we focused on smartphone users updating software and proposed appropriate phrases for dialog messages according to the user's understanding of the updating procedure, as well as the type of software. We also analyzed appropriate incentives. However, the effectiveness of the UI of dialog messages and the appropriate volume of incentives at different levels of literacy remained unclear. In this paper, we conducted a user survey to analyze appropriate UIs according to the user's understanding of the updating procedure, as well as the appropriate volume of incentives. As a result, we confirmed that different UIs are effective depending on the user's understanding of the updating procedure. In addition, we identified appropriate volumes of reward points, mobile data, and coupons for promoting software updates.
Chapter
It is important to provide personalized interventions that account for the different security awareness of each user. We focus on updating the operating system (OS) and seven major types of applications (Communication, Finance, Lifestyle, Games, Utility, Health, Entertainment) for smartphone users, and we aim to provide approaches that lead to updating of the OS and these seven types of applications (hereinafter, "software"). We consider that user awareness of updating software may differ and that this relates to user understanding of the updating procedure. In this paper, we propose intervention methods according to users' knowledge of update procedures. We conduct an online survey to evaluate effective approaches, such as dialog messages and types of incentive, that increase the intention of smartphone users to update. We found that effective phrases for dialog messages differ according to the users' understanding of the update procedures and that reward points are the best incentive for many users.
Conference Paper
Full-text available
Despite recent advances in increasing computer security by eliminating human involvement and error, there are still situations in which humans must manually perform computer security tasks, such as enabling automatic updates, rebooting machines to apply some of those updates, or enrolling in two-factor authentication. We argue that present bias, the tendency to discount future risks and gains in favor of immediate gratification, could be the root cause explaining why many users fail to take such actions. Thus, we systematically explore the application of commitment devices, a technique from behavioral economics, to mitigate the effects of present bias on the adoption of end-user security measures. Offering users the option to be reminded or to schedule such tasks in the future could be effective in increasing their proclivity to act. While some current systems have begun incorporating such commitment nudges into software update messaging, we are unaware of rigorous scientific research that demonstrates how effective these techniques are, how they may be improved, and how they may be applied to other security behaviors. Using two online experiments, with over 1,000 participants total, we find that both reminders and commitment nudges can be effective at reducing the intentions to ignore the request to enable automatic updates (Study 1), and to install security updates and enable two-factor authentication, but not to configure automatic backups (Study 2). We also find that intentions of Mac OS users are generally more affected by these nudges.
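A minimal sketch of a commitment-style update prompt in the spirit described above: besides acting immediately, the user can schedule the update or ask to be reminded rather than simply dismissing it. The option labels, the one-day reminder delay, and the same-night scheduling are illustrative assumptions, not the dialogs used in the studies.

```python
# Illustrative commitment-style update prompt: reminder and scheduling
# options sit alongside immediate action, so dismissal is not the only
# alternative to installing now. Labels and delays are assumptions.
from datetime import datetime, timedelta
from typing import Optional

def update_prompt(choice: str, now: Optional[datetime] = None) -> str:
    now = now or datetime.now()
    if choice == "install_now":
        return "Installing update now."
    if choice == "remind_me":                    # reminder nudge
        remind_at = now + timedelta(days=1)
        return f"Reminder scheduled for {remind_at:%Y-%m-%d %H:%M}."
    if choice == "schedule_tonight":             # commitment nudge
        tonight = now.replace(hour=23, minute=0, second=0, microsecond=0)
        return f"Update scheduled for {tonight:%Y-%m-%d %H:%M}."
    return "Update dismissed."                   # the outcome the nudges aim to reduce

print(update_prompt("schedule_tonight"))
```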
Conference Paper
Full-text available
A major inhibitor of the effectiveness of security warnings is habituation: decreased response to a repeated warning. Although habituation develops over time, previous studies have examined habituation and possible solutions to its effects only within a single experimental session, providing an incomplete view of the problem. To address this gap, we conducted a longitudinal experiment that examines how habituation develops over the course of a five-day workweek and how polymorphic warnings decrease habituation. We measured habituation using two complementary methods simultaneously: functional magnetic resonance imaging (fMRI) and eye tracking. Our results show a dramatic drop in attention throughout the workweek despite partial recovery between workdays. We also found that the polymorphic warning design was substantially more resistant to habituation compared to conventional warnings, and it sustained this advantage throughout the five-day experiment. Our findings add credibility to prior studies by showing that the pattern of habituation holds across a workweek, and indicate that cross-sectional habituation studies are valid proxies for longitudinal studies. Our findings also show that eye tracking is a valid measure of the mental process of habituation to warnings.
Conference Paper
Full-text available
Updates alter the way software functions by fixing bugs, changing features, and modifying the user interface. Sometimes changes are welcome, even anticipated, and sometimes they are unwanted, leading users to avoid potentially unwanted updates. If users delay or do not install updates, it can have serious security implications for their computers. Updates are one of the primary mechanisms for correcting discovered vulnerabilities; when a user does not update, they remain vulnerable to an increasing number of attacks. In this work we detail the process users go through when updating their software, including both the positive and negative issues they experience. We asked 307 survey respondents to provide two contrasting software update stories. Using content analysis, we analysed the stories and found that users go through six stages while updating: awareness, deciding to update, preparation, installation, troubleshooting, and post state. We further detail the issues respondents experienced during each stage and the impact on their willingness to update.
Conference Paper
Full-text available
Despite the plethora of security advice and online education materials offered to end-users, there exists no standard measurement tool for end-user security behaviors. We present the creation of such a tool. We surveyed the most common computer security advice that experts offer to end-users in order to construct a set of Likert scale questions to probe the extent to which respondents claim to follow this advice. Using these questions, we iteratively surveyed a pool of 3,619 computer users to refine our question set such that each question was applicable to a large percentage of the population, exhibited adequate variance between respondents, and had high reliability (i.e., desirable psychometric properties). After performing both exploratory and confirmatory factor analysis, we identified a 16-item scale consisting of four sub-scales that measures attitudes towards choosing passwords, device securement, staying up-to-date, and proactive awareness.
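For illustration, a sketch of how a 16-item, four-sub-scale Likert instrument of this kind could be scored as sub-scale means. The item-to-sub-scale assignment and the 5-point response range are assumptions for illustration, not the published SeBIS item mapping.

```python
# Illustrative scoring sketch for a 16-item Likert instrument with four
# sub-scales. The item assignment and 5-point range are assumptions,
# not the published SeBIS mapping.
from statistics import fmean

SUBSCALE_ITEMS = {                 # hypothetical: four items per sub-scale
    "passwords":  [0, 1, 2, 3],
    "securement": [4, 5, 6, 7],
    "updating":   [8, 9, 10, 11],
    "awareness":  [12, 13, 14, 15],
}

def score_subscales(responses):
    """responses: 16 Likert answers, each 1 ("never") .. 5 ("always")."""
    assert len(responses) == 16 and all(1 <= r <= 5 for r in responses)
    return {name: fmean(responses[i] for i in items)
            for name, items in SUBSCALE_ITEMS.items()}

print(score_subscales([3] * 8 + [5] * 4 + [2] * 4))
# {'passwords': 3.0, 'securement': 3.0, 'updating': 5.0, 'awareness': 2.0}
```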
Article
Full-text available
In order to ensure that employees abide by their organizations' Information Security Policies (ISP), a number of information security policy compliance measures have been proposed in the past. If different factors explain/predict the information security behavior of those employees who do know the ISP and of those who do not know the ISP, as suggested by stage theories, and the existing studies do not control for this issue, then the practical relevance of the existing models is decreased. In order to test whether different factors explain/predict the information security behavior of those employees who do know the ISP and of those who do not, we designed a study using Protection Motivation Theory (PMT) as the baseline theory. Employees' ISP knowledge was tested by asking a few questions related to their organization's ISP. We divided the data (N=513) into a low knowledge group (regarding the organizations' ISP) and a high knowledge group. The results show that the findings for the low knowledge group and the high knowledge group differ substantially. Our results provide an explanation for the inconsistent results in previous IS security research.
Conference Paper
Full-text available
Notifying users about a system's data practices is supposed to enable users to make informed privacy decisions. Yet, current notice and choice mechanisms, such as privacy policies, are often ineffective because they are neither usable nor useful, and are therefore ignored by users. Constrained interfaces on mobile devices, wearables, and smart home devices connected in an Internet of Things exacerbate the issue. Much research has studied usability issues of privacy notices, and many proposals for more usable privacy notices exist. Yet, there is little guidance for designers and developers on the design aspects that can impact the effectiveness of privacy notices. In this paper, we make multiple contributions to remedy this issue. We survey the existing literature on privacy notices and identify challenges, requirements, and best practices for privacy notice design. Further, we map out the design space for privacy notices by identifying relevant dimensions. This provides a taxonomy and consistent terminology of notice approaches to foster understanding and reasoning about notice options available in the context of specific systems. Our systemization of knowledge and the developed design space can help designers, developers, and researchers identify notice and choice requirements and develop a comprehensive notice concept for their system that addresses the needs of different audiences and considers the system's limitations and opportunities for providing notice.
Article
Full-text available
It is widely acknowledged that employees of an organization are often a weak link in the protection of its information assets. Information security has not been given enough attention in the literature in terms of the human factor effect; researchers have called for more examination in this area. Human factors play a significant role in computer security. In this paper, we focus on the effect of the human factor on information security, presenting the human weaknesses that may lead to unintentional harm to the organization, and we discuss how information security awareness can be a major tool in overcoming these weaknesses. A framework for field research is also presented in order to identify the human factors and the major attacks that threaten computer security.
Conference Paper
Full-text available
Smartphone users visit application marketplaces (or app stores) to search for and install applications. However, these app stores are not free from privacy-invasive apps, which collect personal information without sufficient disclosure or people's consent. To nudge people away from privacy-invasive apps, we created a visual representation of the mobile app's privacy rating. Inspired by "Framing Effects," we designed semantically equivalent visuals that are framed in either a positive or negative way. We investigated the effect of the visual privacy rating, framing, and user rating on people's perception of an app (e.g., trustworthiness) through two experiments. In Study 1, participants were able to understand the intended meaning of the visual privacy ratings. In Study 2, we found a strong main effect for visual privacy rating on participants' perception of an app, and framing effects in a low privacy rating app. We discuss implications for designing visual privacy ratings, including the use of positive visual framing to nudge people away from privacy-invasive apps.
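A toy sketch of the positive/negative framing manipulation described above: the same underlying privacy rating is rendered in two semantically equivalent but differently framed ways. The wording and the notion of "data uses" are invented for illustration.

```python
# Toy sketch of positive vs. negative framing of the same privacy rating.
# The wording and the notion of "data uses" are illustrative assumptions.
def frame_rating(safe_fraction: float, positive: bool = True) -> str:
    pct_safe = round(safe_fraction * 100)
    if positive:
        return f"This app protects {pct_safe}% of your personal data uses."
    return f"This app exposes {100 - pct_safe}% of your personal data uses."

rating = 0.4  # a low privacy rating
print(frame_rating(rating, positive=True))   # positive frame
print(frame_rating(rating, positive=False))  # negative frame, same information
```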
Conference Paper
Full-text available
We designed and tested attractors for computer security dialogs: user-interface modifications used to draw users' attention to the most important information for making decisions. Some of these modifications were purely visual, while others temporarily inhibited potentially-dangerous behaviors to redirect users' attention to salient information. We conducted three between-subjects experiments to test the effectiveness of the attractors. In the first two experiments, we sent participants to perform a task on what appeared to be a third-party site that required installation of a browser plugin. We presented them with what appeared to be an installation dialog from their operating system. Participants who saw dialogs that employed inhibitive attractors were significantly less likely than those in the control group to ignore clues that installing this software might be harmful. In the third experiment, we attempted to habituate participants to dialogs that they knew were part of the experiment. We used attractors to highlight a field that was of no value during habituation trials and contained critical information after the habituation period. Participants exposed to inhibitive attractors were two to three times more likely to make an informed decision than those in the control condition.
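A minimal sketch of an inhibitive attractor in the spirit described above, assuming a simple time-based inhibition: the confirm action stays disabled until a short delay has elapsed, pushing attention toward the warning text first. The 3-second delay and class name are illustrative, not taken from the paper.

```python
# Minimal inhibitive-attractor sketch: confirmation is blocked until a
# short delay passes, so the salient warning gets attention first.
# The 3-second delay is an illustrative assumption.
import time

class InhibitiveDialog:
    def __init__(self, warning: str, delay_s: float = 3.0):
        self.warning = warning
        self.delay_s = delay_s
        self.shown_at = time.monotonic()

    def confirm_enabled(self) -> bool:
        return time.monotonic() - self.shown_at >= self.delay_s

    def confirm(self) -> str:
        if not self.confirm_enabled():
            return f"Please read first: {self.warning}"
        return "Action confirmed."

dialog = InhibitiveDialog("This plugin comes from an unidentified publisher.")
print(dialog.confirm())   # too early: the action is still inhibited
time.sleep(3)
print(dialog.confirm())   # after the delay the action is allowed
```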
Conference Paper
Full-text available
HCI research published 10 years ago pointed out that many users cannot cope with the number and complexity of passwords, and resort to insecure workarounds as a consequence. We present a study which re-examined password policies and password practice in the workplace today. 32 staff members in two organisations kept a password diary for 1 week, which produced a sample of 196 passwords. The diary was followed by an interview which covered details of each password, in its context of use. We find that users are in general concerned to maintain security, but that existing security policies are too inflexible to match their capabilities, and the tasks and contexts in which they operate. As a result, these password policies can place demands on users which impact negatively on their productivity and, ultimately, that of the organisation. We conclude that, rather than focussing password policies on maximizing password strength and enforcing frequency alone, policies should be designed using HCI principles to help the user to set an appropriately strong password in a specific context of use.
Article
Full-text available
The transtheoretical model posits that health behavior change involves progress through six stages of change: precontemplation, contemplation, preparation, action, maintenance, and termination. Ten processes of change have been identified for producing progress, along with decisional balance, self-efficacy, and temptations. Basic research has generated a rule of thumb for at-risk populations: 40% in precontemplation, 40% in contemplation, and 20% in preparation. Across 12 health behaviors, consistent patterns have been found between the pros and cons of changing and the stages of change. Applied research has demonstrated dramatic improvements in recruitment, retention, and progress using stage-matched interventions and proactive recruitment procedures. The most promising outcomes to date have been found with computer-based, individualized, and interactive interventions. The most promising enhancement to the computer-based programs is personalized counselors. One of the most striking results to date for stage-matched programs is the similarity between participants reactively recruited, who reached us for help, and those proactively recruited, whom we reached out to help. If results with stage-matched interventions continue to be replicated, health promotion programs will be able to produce unprecedented impacts on entire at-risk populations.
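A small sketch encoding the six stages and the 40/40/20 rule of thumb quoted above; the proportions are the article's heuristic for at-risk populations, not measured data.

```python
# Sketch of the six Transtheoretical Model stages and the 40/40/20
# rule of thumb for at-risk populations (a heuristic, not measured data).
from enum import Enum, auto

class TTMStage(Enum):
    PRECONTEMPLATION = auto()
    CONTEMPLATION = auto()
    PREPARATION = auto()
    ACTION = auto()
    MAINTENANCE = auto()
    TERMINATION = auto()

AT_RISK_RULE_OF_THUMB = {
    TTMStage.PRECONTEMPLATION: 0.40,
    TTMStage.CONTEMPLATION: 0.40,
    TTMStage.PREPARATION: 0.20,
}

def expected_counts(population: int) -> dict:
    """Rough expected stage counts for an at-risk population of a given size."""
    return {stage.name: round(population * share)
            for stage, share in AT_RISK_RULE_OF_THUMB.items()}

print(expected_counts(1000))
# {'PRECONTEMPLATION': 400, 'CONTEMPLATION': 400, 'PREPARATION': 200}
```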
Chapter
To protect computers from various types of cyberattack, users are required to learn appropriate security behaviors. Different persuasion techniques are needed to encourage users to take security action, depending on their attitude toward security. In this paper, we first propose a Security Behavior Stage Model (SeBeST) which classifies users into five stages in terms of their attitude toward security measures, namely having security awareness and taking security behaviors. In addition, we focus on OS updating behavior as an example of security behavior and evaluate effective OS update messages for users in each stage. We create message dialogs that can promote users' OS updating behavior. We conduct two online surveys; we analyze the validity of SeBeST in Survey 1 and then evaluate effective messages for each stage in Survey 2. We find that SeBeST has high validity and that the appropriate messages for users in each stage differ from one another.
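As an illustration of stage-matched messaging in the spirit of SeBeST, the sketch below selects a dialog message by stage; the message texts are invented placeholders, not the dialogs evaluated in the paper.

```python
# Illustrative stage-matched message selection. Stage labels follow the
# Stages of Change; the message texts are invented placeholders, not the
# dialogs evaluated in the paper.
STAGE_MESSAGES = {
    1: "Devices running outdated software are a common target of attacks.",  # raise awareness
    2: "Updating your OS closes known security holes on your device.",       # build motivation
    3: "The update takes only a few taps. Start it now?",                    # emphasize ease
    4: "Your system is up to date. Keep automatic updates turned on.",
    5: "You're protected. We'll keep checking for new updates for you.",
}

def os_update_message(stage: int) -> str:
    if stage not in STAGE_MESSAGES:
        raise ValueError("stage must be an integer from 1 to 5")
    return STAGE_MESSAGES[stage]

print(os_update_message(3))
```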
Conference Paper
The security field relies on user studies, often including survey questions, to query end users' general security behavior and experiences, or hypothetical responses to new messages or tools. Self-report data has many benefits (ease of collection, control, and depth of understanding) but also many well-known biases stemming from people's difficulty remembering prior events or predicting how they might behave, as well as their tendency to shape their answers to a perceived audience. Prior work in fields like public health has focused on measuring these biases and developing effective mitigations; however, there is limited evidence as to whether and how these biases and mitigations apply specifically in a computer-security context. In this work, we systematically compare real-world measurement data to survey results, focusing on an exemplar, well-studied security behavior: software updating. We align field measurements about specific software updates (n=517,932) with survey results in which participants respond to the update messages that were used when those versions were released (n=2,092). This allows us to examine differences in self-reported and observed update speeds, as well as examining self-reported responses to particular message features that may correlate with these results. The results indicate that for the most part, self-reported data varies consistently and systematically with measured data. However, this systematic relationship breaks down when survey respondents are required to notice and act on minor details of experimental manipulations. Our results suggest that many insights from self-report security data can, when used with care, translate to real-world environments; however, insights about specific variations in message texts or other details may be more difficult to assess with surveys.
Conference Paper
Information security programs are instituted by organizations to provide guidance to their users who handle their data and systems. The main goal of these programs is to foster a positive information security culture within the organization. In this study, we present a literature review on information security culture by outlining the factors that contribute to the security culture of an organization and developing a framework from the synthesized research. The findings in this review can be used to further research in information security culture and can help organizations develop and improve their information security programs.
Conference Paper
Computer security tools usually provide universal solutions without taking user characteristics (origin, income level, ...) into account. In this paper, we test the validity of using such universal security defenses, with a particular focus on culture. We apply the previously proposed Security Behavior Intentions Scale (SeBIS) to 3,500 participants from seven countries. We first translate the scale into seven languages while preserving its reliability and structure validity. We then build a regression model to study which factors affect participants' security behavior. We find that participants from different countries exhibit different behavior. For instance, participants from Asian countries, and especially Japan, tend to exhibit less secure behavior. Surprisingly to us, we also find that actual knowledge influences user behavior much less than user self-confidence in their computer security knowledge. Stated differently, what people think they know affects their security behavior more than what they do know.
Article
Previous research has established continuance models that explain and predict an individual's behaviors when engaged with hedonic or functional systems or environments that provide productivity-enhancing outcomes. However, within the context of information security, these models are not applicable and fail to accurately assess the circumstances in which an individual engages in protective security behaviors beyond an initial adoption. This research addresses this gap and establishes a model for explaining an individual's continued engagement in protective security behaviors, which is a significant problem in securing enterprise information resources. Within this model, protection motivation theory (PMT) is considered an underlying theoretical motivation for continuance intention using constructs such as perceived threat severity, perceived threat susceptibility, self-efficacy, and response efficacy as direct antecedents of behavioral intents and indirect predictors of continuance behavior. Furthermore, the introduction of perceived extraneous circumstances is used to reconcile the "acceptance-discontinuance anomaly." A novel research methodology for measuring actual security behavior continuance was developed for this investigation. Experimental results indicate support for all of the proposed relationships, with the exception of the relationship between response efficacy and continuance intent. Nearly half of the variance in the dependent variable, continuance behavior, was explained by the model. This is the first comprehensive empirical investigation of protective security behavior continuance intention. The findings have practical implications for security administrators and security technology solution providers, and they have theoretical ramifications in the area of behavioral information security and protection motivation theory.
Conference Paper
The Security Behavior Intentions Scale (SeBIS) measures the computer security attitudes of end-users. Because intentions are a prerequisite for planned behavior, the scale could therefore be useful for predicting users' computer security behaviors. We performed three experiments to identify correlations between each of SeBIS's four sub-scales and relevant computer security behaviors. We found that testing high on the awareness sub-scale correlated with correctly identifying a phishing website; testing high on the passwords sub-scale correlated with creating passwords that could not be quickly cracked; testing high on the updating sub-scale correlated with applying software updates; and testing high on the securement sub-scale correlated with smartphone lock screen usage (e.g., PINs). Our results indicate that SeBIS predicts certain computer security behaviors and that it is a reliable and valid tool that should be used in future research.
Book
Early user interface (UI) practitioners were trained in cognitive psychology, from which UI design rules were derived. But as the field has evolved since the publication of the first edition of this book, designers enter the field from many disciplines. Practitioners today have enough experience in UI design that they have been exposed to design rules, but it is essential that they understand the psychology behind the rules in order to effectively apply them. In this completely updated and revised edition, Jeff Johnson provides you with just enough background in perceptual and cognitive psychology that UI design guidelines make intuitive sense rather than being just a list of rules to follow. You'll also find new chapters on human choice & decision making and hand-eye coordination & attention, as well as new examples, figures, and explanations throughout.
Article
Proposes a protection motivation theory that postulates the 3 crucial components of a fear appeal to be (a) the magnitude of noxiousness of a depicted event, (b) the probability of that event's occurrence, and (c) the efficacy of a protective response. Each of these communication variables initiates corresponding cognitive appraisal processes that mediate attitude change. The proposed conceptualization is a special case of a more comprehensive theoretical schema: expectancy-value theories. Several suggestions are offered for reinterpreting existing data, designing new types of empirical research, and making future studies more comparable. The principal advantages of protection motivation theory over the rival formulations of I. L. Janis and of H. Leventhal are discussed. (81 ref) (PsycINFO Database Record (c) 2012 APA, all rights reserved)
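A compact expectancy-value sketch of the relation described above (notation mine, not a formula quoted from the article): protection motivation $M$ is taken to be some function of the three appraisals,

$$ M = f(S,\; P,\; E), $$

where $S$ is the appraised noxiousness (severity) of the depicted event, $P$ its appraised probability of occurrence, and $E$ the appraised efficacy of the recommended protective response.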
Article
The psychological principles that govern the perception of decision problems and the evaluation of probabilities and outcomes produce predictable shifts of preference when the same problem is framed in different ways. Reversals of preference are demonstrated in choices regarding monetary outcomes, both hypothetical and real, and in questions pertaining to the loss of human lives. The effects of frames on preferences are compared to the effects of perspectives on perceptual appearance. The dependence of preferences on the formulation of decision problems is a significant concern for the theory of rational choice.
A Self-Report Measure of End-User Security Attitudes (SA-6)
  • C Faklaris
  • L Dabbish
  • J I Hong
The Effect of Social Influence on Security Sensitivity
  • S Das
  • T H Kim
  • L A Dabbish
  • J I Hong
Behavior Ever Follows Intention?: A Validation of the Security Behavior Intentions Scale (SeBIS)
  • S Egelman
  • M Harbach
  • E Peer
A Typology of Perceived Triggers for End-User Security and Privacy Behaviors
  • S Das
  • L A Dabbish
  • J I Hong
Impact of User Characteristics on Attitudes Towards Automatic Mobile Application Updates
  • A Mathur
  • M Chetty
Scaling the Security Wall: Developing a Security Behavior Intentions Scale (SeBIS)
  • S Egelman
  • E Peer
Continuance of Protective Security Behavior: A Longitudinal Study
  • M Warkentin
  • A C Johnston
  • J Shropshire
  • W Barnett
They Keep Coming Back Like Zombies: Improving Software Updating Interfaces
  • A Mathur
  • J Engel
  • S Sobti
  • V Chang
  • M Chetty