
Review Of Behavioural Theories In Security Compliance And Research Challenges


July 31 - August 5 2017, Ho Chi Minh City (Saigon), Vietnam
Accepting Editor: Eli Cohen │ Received: (date) │ Revised: (date) │ Accepted: (date).
Cite as: Pham, H-C., Brennan, L., & Richardson, J. (2017). Review of behavioural theories in security compliance and research challenges. Proceedings of the Informing Science and Information Technology Education Conference, Vietnam, pp. 65-76. Santa Rosa, CA: Informing Science Institute. Retrieved from
(CC BY-NC 4.0) This article is licensed to you under a Creative Commons Attribution-NonCommercial 4.0 International
License. When you copy and redistribute this paper in full or in part, you need to provide proper attribution to it to ensure
that others can later locate this work (and to ensure that others do not accuse you of plagiarism). You may (and we encour-
age you to) adapt, remix, transform, and build upon the material for any non-commercial purposes. This license does not
permit you to use this material for commercial purposes.
Hiep-Cong Pham*
RMIT University Vietnam,
Ho Chi Minh City, Vietnam
Linda Brennan
RMIT University, Melbourne, Australia
Joan Richardson
RMIT University, Melbourne, Australia
* Corresponding author
Abstract
Inconsistent findings on the effect of various determinants of cyber security behaviour emphasise the need for further understanding of the applicability of compliance theories. The paper provides a critical review of determinants of users’ cyber security behaviour and establishes directions for future research.

Background
Cyber security behaviour has been studied using a range of behavioural theories. Factors from these theories help organisations to develop suitable initiatives to encourage positive compliance from employees.

Contribution
The paper integrates factors that can impact cyber security behaviour from the Theory of Planned Behaviour, Protection Motivation Theory, Rational Choice Theory, and General Deterrence Theory into an overarching framework for better connection of the theories. Previous studies’ findings were analysed to establish research challenges in the field.

Future Research
Future research should investigate the complex interaction between organisational and personal characteristics so that a security program can be developed that effectively engages employees with security tasks even in demanding work environments.

Keywords
security compliance, theory of planned behaviour, protection motivation theory, rational choice theory, general deterrence theory
Behavioural Theories in Security Compliance
The main objective of information security is to protect the confidentiality, integrity, and availability of an organisation’s data, information, and computer services (Dhillon & Backhouse, 2001).
Information security is the practice of defending the safety of data and information in a computer
system against unauthorised disclosure, modification, or destruction. In addition, information securi-
ty also protects the computer system itself and resources against unauthorised use, modification, or
denial of service (von Solms & von Solms, 2004). Traditionally, information security measures were
designed to address security risks in four phases: deterrence, prevention, detection, and recovery
(Warkentin & Willison, 2009). The deterrence and prevention phases aim to discourage and minimise
breaches of individuals located within or outside the organisation from intentionally or accidentally
violating security policies or procedures, which may lead to compromises of confidentiality, integrity,
or availability of information and computing resources. The detection and recovery phases aim to
detect unauthorised security activities and recover damaged information or systems and restore them
to their original conditions prior to the security violation.
Users’ failure to follow security procedures is the most common cause of security problems rather
than deliberate harmful external attack events (Crossler et al., 2013). Various organisational and per-
sonal factors can influence how employees respond to security requirements (Furnell & Rajendran,
2012). With the advancement of security technologies, certain measures can be automated and there-
fore little user involvement is required, thus reducing the potential for human errors while ensuring
information security objectives. For some security measures or practices that cannot be fully auto-
mated, however, user compliance is vital to ensure effective security management. Security compli-
ance describes the behaviour of users, who, for whatever reason may or may not follow an organisa-
tion’s security policies when accessing corporate IT networks and services (Warkentin & Willison,
Security measures are less effective if employees do not use them and choose to act unsafely. For example, automatically scheduled password changes together with password complexity checks can minimise reliance on users to voluntarily update their passwords and choose difficult-to-guess ones. Under such a scheme, however, users must change passwords repeatedly and memorise increasingly complex ones, so some may resort to writing passwords on a sticky note attached to their computer for easy access. These types of unsafe practices can defeat even the most sophisticated security systems.
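The kind of automated complexity check mentioned above can be sketched as follows. This is an illustrative example only, not drawn from the paper; the minimum length and the character-class rules are assumptions, not a recommended policy.

```python
import re

def password_meets_policy(password: str, min_length: int = 12) -> bool:
    """Illustrative automated complexity check: a length rule plus
    character-class rules. The specific thresholds are assumptions."""
    if len(password) < min_length:
        return False
    # Require at least one lower-case letter, upper-case letter, digit, and symbol.
    required = [r"[a-z]", r"[A-Z]", r"[0-9]", r"[^a-zA-Z0-9]"]
    return all(re.search(pattern, password) for pattern in required)

print(password_meets_policy("password"))         # False: too short, no variety
print(password_meets_policy("C0rrect-horse-9"))  # True: passes all rules
```

Because the check runs automatically at password-change time, it removes the compliance decision from the user for this one task, which is exactly why the residual risk shifts to behaviours the system cannot check, such as writing the password down.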
Security compliance as an individual behavioural choice can be affected by organisational and per-
sonal factors. Several behavioural theories have been employed as the underpinning framework in
compliance studies. For example, Theory of Planned Behaviour (TPB) (Ajzen, 1991), Protection Mo-
tivation Theory (PMT) (Rogers, 1983), General Deterrence Theory (GDT) (Gibbs, 1975), and Ra-
tional Choice Theory (RCT) (Becker, 1968) have been examined in terms of their effect on security
compliance intention and behaviour. Given the existence of a wide range of compliance determi-
nants from several theories, this paper aims to organise those determinants into an overarching
framework based on the TPB and highlight remaining challenges in motivating employees’ security compliance.
The paper is structured as follows. The next section presents the TPB as an overarching framework
to incorporate compliance factors from other behavioural theories. Future research directions to ad-
dress remaining challenges are discussed in the third section. The final section is a brief conclusion
with suggestions for future research.
Security compliance refers to the behaviour of users in accordance with security polices when access-
ing and using the IT network and services. Thus, behavioural theories have been used widely in secu-
rity compliance literature to understand factors that motivate user security compliance (Sommestad,
Hallberg, Lundholm, & Bengtsson, 2014). The TPB is one of the most influential frameworks for
studying human behaviour, as it explains behavioural antecedents (Ajzen, 2001). The TPB states that perceived behavioural control, attitude towards the behaviour, and subjective norms predict intention, which in turn accounts for a considerable amount of actual behaviour. For example, the TPB predicts that a customer may form an intention to buy a car if he/she knows how to drive it, holds a positive impression of some aspects of the car, and has received favourable feedback from acquaintances that have purchased the same or a similar vehicle. A strong purchase intention towards the car is a strong indication that the customer will buy it.
Perceived behavioural control refers to evaluation of factors, whether internal or external, that facili-
tate or impede the performance of the behaviour (Ajzen, 2002). User attitudes towards behaviour
can include positive or negative personal evaluation of performing (or not performing) a behaviour.
Subjective norms are beliefs about other people’s expectations about the behaviour, which result in perceived social pressure to perform (or not to perform) it.
In other words, the TPB clearly distinguishes three different stages leading to behaviour. In stage one
various factors can influence the attitude towards a behaviour. In stage two, behavioural controls,
attitudes, and subjective norms influence an intention towards performing the behaviour, and lastly
the intention significantly predicts the actual behaviour (Ajzen 2001). The relationship between be-
havioural factors influencing potential security compliance is described in Figure 1.
Figure 1. Theory of Planned Behaviour (Adapted from Ajzen, 2001)
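The three-stage structure in Figure 1 can be illustrated with a toy linear scoring model in which attitude, subjective norms, and perceived behavioural control combine to predict intention. This is a sketch for intuition only: the 0-1 scales, the weights, and the additive form are invented here, and the TPB itself does not prescribe any particular formula.

```python
def compliance_intention(attitude, subjective_norm, behavioural_control,
                         weights=(0.4, 0.3, 0.3)):
    """Toy TPB-style predictor: each antecedent is scored on a 0-1 scale
    and combined with illustrative (assumed) weights."""
    w_att, w_norm, w_pbc = weights
    return (w_att * attitude
            + w_norm * subjective_norm
            + w_pbc * behavioural_control)

# An employee with a positive attitude, strong peer expectations,
# and moderate confidence in performing the security task:
intention = compliance_intention(attitude=0.8, subjective_norm=0.9,
                                 behavioural_control=0.5)
print(round(intention, 2))  # 0.74
```

Under the TPB, a high intention score like this would then be the strongest available predictor of the actual behaviour, which is why so many compliance studies stop at measuring intention.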
In the context of security compliance, the TPB posits that if an employee (1) perceives sufficient
capacity to complete the security task, (2) enjoys a favourable attitude towards performing it, and (3)
observes other people in the organisation are also actively performing the practice, he/she will likely
comply, which can result in actual security compliance. Studies in behavioural security compliance
have explored antecedents of compliance attitudes, intention, and behaviour (Sommestad et al., 2014).
In accordance with the TPB’s main argument on the relationship between intention and behaviour, most
studies on security compliance measured security intention as the dependent construct and argued
that intention would lead to actual behaviour (Sommestad et al., 2014). A main reason that most
studies stopped short of recording actual security behaviour is that monitoring the behaviour in an
organisation is difficult (Crossler et al., 2013). For instance, security behaviour can be recorded indi-
rectly through electronic means, such as server logs, cameras, or through managerial monitoring of
user behaviour. However, access to accurate security information sources detailing user security ac-
tions in organisational contexts can be difficult to obtain for research purposes due to cost and con-
fidentiality concerns (Warkentin, Straubb, & Malimagea, 2012).
The following sections present factors from GDT, RCT and PMT that may predict security intention
to comply using the TPB as the underpinning framework.
Perceived behavioural control refers to the perceived ease or difficulty of performing the security
compliance requirements, which can depend on whether a person may, or may not, have the ability to
perform the intended tasks. Perceived behavioural control can impact a person’s beliefs about their
intentions and actions (Ajzen, 2002). The concept of self-efficacy is described using Bandura’s (1977)
Social Cognitive Theory, and Icek Ajzen’s (1991) TPB as a component of perceived behavioural con-
trol (Maddux & Volkmann, 2010). The control construct describes one’s self-confidence in one’s abil-
ity to mobilise motivation, cognitive resources, and actions needed to successfully complete a specific
task within a given context. Self-efficacy influences the amount of effort, initiation, and maintenance
of coping efforts in adverse situations (Bandura, 1997).
Security self-efficacy describes individuals’ security knowledge and expertise that enable them to perform their security tasks, as well as cope with changing security requirements. Self-efficacy is also included in PMT, which theorises that knowledgeable and skilful employees are more likely to undertake protective security tasks (Vance, Siponen, & Pahnila, 2012). Self-efficacy has been recognised as
a key factor that positively influences security compliance (Rhee, Kim, & Ryu, 2009; Johnston &
Warkentin, 2010). For example, self-efficacy was reported to have a positive impact on protection
motivation and related compliance with security policies (Vance et al., 2012), to strengthen security
effort (Rhee et al., 2009), and directly influence individual security practice (Rhee et al., 2009; Vance
& Siponen, 2012).
Another component of perceived behavioural control is locus of control, which is the perception of
whether a person can control the outcome of their behaviour due to either internal or external fac-
tors (Ajzen, 2002). Self-efficacy is often considered as an internal locus of control, while the external
control in security compliance refers to organisational factors that may affect an employee’s capacity
to perform security tasks (Cox, 2012). For example, employees need organisational resources, such as
time to get acquainted with security policies, or easy access to the policies, or training required in or-
der to comply with security policies (Pahnila, Siponen, & Mahmood, 2007). External locus of control
was not consistently found to positively affect compliance intentions. Cox (2012) noticed that organi-
sational supports did not contribute to perceived behavioural control or compliance intention.
Pahnila et al. (2007) reported compliance facilitating conditions negatively affected compliance inten-
tion and argued that users viewed external security processes as the responsibilities of the organisa-
tion. Consequently, the more effective the security resources the more reliant employees would be on
the organisation. Users often leave security responsibility to security managers and the technology.
Several TPB-based studies have found that attitude towards a behaviour can be the strongest predic-
tor of behavioural intent (Westaby, 2005). Likewise, the majority of security compliance literature has
focused on investigating the compliance attitude and its antecedents to predict actual compliance
(Bulgurcu, Cavusoglu, & Benbasat, 2010; Sommestad et al., 2014). Other behavioural theories have
also been applied to explain how compliance attitudes can be formulated; thus appropriate measures
can be used to alter user attitudes towards security compliance. Factors from three behavioural theo-
ries including PMT, GDT and RCT are now reviewed to explain how security compliance attitudes
can be affected by various factors.
Severity and Vulnerability of Security Threats
PMT has been widely used to explain protective behaviour due to fear (Rogers, 1983). Individuals are
motivated to protect themselves from physical, social, and psychological threats by invoking coping
mechanisms, which are conducted by assessing threat and appraising relevant actions (Rogers, 1983).
PMT states that fear influences cognition, attitudes, intentions, and protective actions. Threat ap-
praisal comprises assessment of perceived severity, vulnerability, and rewards (benefits of taking a
risk). Coping appraisal assessment comprises response efficacy, cost and self-efficacy, which determine how well people perceive themselves as being able to respond to a threat. Protection motivation
(i.e., protective intention) is a mediating variable whose function is to evoke, sustain, and direct pro-
tective behaviour, which facilitates the adoption of adaptive behaviours (taking the advised behav-
iours) if the execution of the advised behaviour leads to a reduction of fear (Sutton, 1982). In a situa-
tion where the performance of the advised behaviour does not lead to a reduction of fear, maladap-
tive coping actions, such as denial of the threat or avoidance of the fear-evoking message, may be
used as a way of avoiding fear. PMT has been applied to health-related behaviours, such as reducing
alcohol use (Stainback & Rogers, 1983), enhancing healthy lifestyles (Stanley & Maddux, 1986), en-
hancing diagnostic health behaviours (Rippetoe & Rogers, 1987), and prevention of disease (Tanner,
Hunt, & Eppright, 1991).
Unsafe security behaviour can be compared to making unhealthy behavioural choices. People comply
with security measures to reduce the fear of breach consequences (Crossler et al., 2013). PMT-based
compliance approaches argue that when facing a security threat, an employee conducts threat and
coping assessments to determine an adaptive (compliance) or maladaptive response (non-
compliance) (Vance et al., 2012).
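That appraisal process can be sketched as a simple comparison of threat and coping scores. This is a hypothetical illustration only: PMT does not specify a numeric decision rule, and the 0-1 scales and additive rule below are assumptions made for the sketch.

```python
def pmt_response(severity, vulnerability, rewards,
                 response_efficacy, self_efficacy, response_cost):
    """Toy PMT appraisal on 0-1 scales: threat appraisal weighs perceived
    severity and vulnerability against the rewards of risky behaviour;
    coping appraisal weighs efficacy beliefs against response cost.
    The additive rule is an assumption for illustration."""
    threat_appraisal = severity + vulnerability - rewards
    coping_appraisal = response_efficacy + self_efficacy - response_cost
    if threat_appraisal > 0 and coping_appraisal > 0:
        return "adaptive (comply)"
    return "maladaptive (avoid or deny the threat)"

# High perceived threat, confident user, low cost of complying:
print(pmt_response(0.9, 0.7, 0.2, 0.8, 0.7, 0.3))  # adaptive (comply)
```

The same inputs with a low threat score or a high response cost would tip the result to the maladaptive branch, mirroring the denial and avoidance responses described above.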
The severity of a security threat is measured by the characteristics evidencing its negative impacts on
the organisation including confidentiality, integrity, and availability of access to information and re-
sources. Perceived severity of a threat, such as the negative impact of opening an infected email at-
tachment, will influence a user to behave more cautiously by limiting or eliminating such practice.
Users are more likely to respond to security risks that are more certain than those less likely to hap-
pen (Rogers, 1983). Vulnerability or likelihood of security threats represent how likely an employee
perceives that an unwanted incident will happen, if they do not complete a required security task
(Vance et al., 2012). However, individuals can have different perceptions of vulnerability to the same
security threat as one may perceive a security threat as very likely, while another feels quite the oppo-
site (Ng, Kankanhalli, & Xu, 2009). Consequently, for the same security risk an employee can take a
preventive measure against the risk while another may ignore it.
Response efficacy assesses the perceived effectiveness of taking security measures to minimise the
risk of a security threat. The resources and security measures that the organisations provide and im-
plement to facilitate employees’ security compliance should demonstrate their effectiveness against
the threats. Security measures that are perceived as more effective would influence an employee to
take other recommended measures given alignment between their competence and the security sys-
tem’s requirements (Vance et al., 2012). Similar to the TPB the PMT also speculates that self-efficacy
is a determinant of protection motivation. Self-efficacy can positively influence protective behaviour
such as performing security tasks (Herath & Rao, 2009a; Ifinedo, 2011; Vance et al., 2012).
PMT-based studies found evidence for mixed impacts of threat assessments on compliance attitudes
and intentions. The security threat severity and the perceived effectiveness of the measures (i.e., re-
sponse efficacy) have a strong influence on the intention of taking the advised security behaviour
(Vance et al., 2012). Nevertheless, Cox (2012) did not find that risk severity had a significant role in
users’ intention. Likewise, the impact of vulnerability on compliance intention was not clear: an insignificant impact of vulnerability on compliance intentions was observed in one study (Vance et al., 2012), whereas another identified vulnerability as positively affecting compliance intention (Ifinedo, 2011).
Fear-based communications help promote security compliance by ensuring users are aware of the
severity and vulnerability of security risks, and the effectiveness of preventative measures provided
by the organisation (Brennan & Binney, 2010). When facing potential security risks, people may as-
sess the severity of the risks and act in a way to avoid the consequences, especially if non-compliance
evokes a punishment. A clearly described and understood risk that is likely to occur would be more
likely to have an impact on compliance choices. Given a similar level of a security threat, a less likely
threat would have less influence on the user’s motivation to act safely and avoid risk.
There are some issues related to the effectiveness of using a fear-based compliance approach. Poor
security communication makes it difficult for users to respond to a real security threat since they may
underestimate the likelihood of the threat. Often users are motivated to respond to a security threat
when the risk is evident and personal (Pfleeger & Caputo, 2011). Furthermore, little is known about
the circumstances in which individuals feel fearful and the characteristics of the individuals that may
serve to accentuate or diminish the emotion of fear in security compliance situations (Crossler et al.,
2013). Finally, Brennan and Binney (2010) stated that externally motivated fears have a short term
motivating influence and are not self-sustaining, hence they are not effective to motivate security
compliance in the long term.
Response Cost for Compliance
Attitudes towards security compliance can be drawn from RCT, which puts forward two premises for
the consideration of an offence (non-compliance): (1) balancing the costs and benefits of offending,
and (2) the decision maker’s perceived or subjective expectation of reward and cost (Becker, 1968).
For example, the habit of changing passwords frequently and making them harder to guess can be positively influenced by training, enforcement of an acceptable use policy (AUP), monitoring, and reward systems. However, the more frequently passwords must be changed and the harder they are to guess, the more difficult they are to remember and the more likely an individual will write them down (Stanton, Stam, Mastrangelo, & Jolton, 2005). Therefore, increased training, AUP enforcement, monitoring, and reward systems may slightly raise the risk of losing information. Correspondingly, the more a user perceives favourable rewards for non-compliance, the
higher the chance he/she does not comply with security policies. An example of a reward for non-
compliance could be saving time (Woon, Tan, & Low, 2005). When the perceived direct costs to the
users incurred from the security threat are lower than the indirect cost or effort required by the user
to circumvent the threat, users can ignore security compliance requirements (Schneier, 2008).
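The RCT trade-off described above can be sketched as a simple expected-cost comparison. This is illustrative only; the quantities, the common cost unit, and the decision rule are assumptions, not a model taken from the literature.

```python
def chooses_to_comply(compliance_cost, non_compliance_reward,
                      breach_probability, breach_cost):
    """Toy rational-choice rule: the user complies when the total cost of
    complying (direct effort plus the reward forgone by not cutting corners)
    is lower than the expected loss from a security breach. All inputs are
    hypothetical values expressed in one common unit."""
    cost_of_complying = compliance_cost + non_compliance_reward
    expected_breach_loss = breach_probability * breach_cost
    return cost_of_complying < expected_breach_loss

print(chooses_to_comply(5, 2, 0.1, 200))   # True:  7 < 0.1 * 200
print(chooses_to_comply(5, 30, 0.1, 200))  # False: 35 > 0.1 * 200
```

The second call shows the pattern the surrounding studies report: when the reward for non-compliance (for example, time saved) grows, the rational choice tips toward ignoring the security requirement even though the threat is unchanged.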
Inconsistent findings on the impacts of compliance costs on intention to comply have been reported
in prior studies. Ng et al. (2009) noticed that a perceived barrier or inconvenience for practising safe
email had an insignificant impact on the users’ safe email practice. Security response efficacy and self-
efficacy were found to have a direct and significant impact on compliance intentions, whereas re-
sponse cost and security concerns did not appreciably contribute to predicting compliance intentions
(Herath & Rao, 2009b).
Vance et al. (2012), however, detected that compliance cost negatively influenced employees’ compli-
ance intention due to employees considering the inconvenience of following information security
policies a legitimate reason for not complying with those policies. Employees may find security com-
pliance time-consuming and inconvenient as it has the potential to obstruct their daily routine work,
which negatively impacts compliance levels (Dhillon & Torkzadeh, 2006; Vance & Siponen, 2012).
There are inconsistent findings on the impact of compliance cost on security behaviour. Contextual
factors such as organisational support and personal resources may affect the impact of personal re-
sponse cost on compliance intention (Herath & Rao, 2009a).
Sanctions for Non-Compliance and Rewards for Compliance
GDT has been used as a theoretical basis for understanding why employees follow (or do not follow)
their organisation’s information security policies (Hu, Xu, Dinev, & Ling, 2011). GDT emphasises
the use of punishments to deter people from offending, which proposes that individuals assess de-
terrent certainty and severity to determine actions to be taken when a violation of the rules occurs
(Gibbs, 1975). In a security compliance context, organisations might employ security mandates and
disciplinary actions to manage and motivate compliance (Bulgurcu et al., 2010; Herath & Rao, 2009a).
As a result, communications of certainty and severity of penalties for rule-breaking behaviour have
been considered to be effective strategies in preventing employees from violating security policies.
GDT-based security measures are mainly based on fear of punishment as an antecedent to changing
an undesirable behaviour. However, the effectiveness of threats of punishment to achieve security
compliance has been inconsistent. For example, fear of penalties for non-compliance has been re-
ported to have a significant impact on security behaviour (Herath & Rao, 2009a). These studies
showed that if employees perceive high certainties of being caught for violating security policies, they
were more likely to comply; moreover, the certainty of being detected outweighs fear of the punish-
ment’s severity. On the contrary, other studies found that sanctions did not have a significant impact
on actual compliance (Herath & Rao, 2009a; Hu et al., 2011).
Associated with sanctions for non-compliance, rewards can also be used to promote compliance.
Rewards can include tangible or intangible compensations that an organisation gives to an employee
in return for compliance with the security requirements. Compensations may include monetary re-
wards, such as pay rises or bonuses, or nonmonetary rewards, including personal mention, formal
recognition in oral or written assessment reports, and promotions (Bulgurcu et al., 2010).
The granting of rewards for security compliance may not yet be common practice (Guo & Yuan,
2012). Boss, Kirsch, Angermeier, Shingler, and Boss (2009) argued that rewards may increase how mandatory users perceive compliance with security policies to be, which in turn may reinforce security precaution-taking behaviour. Similarly, Pahnila et al. (2007) and Siponen, Mahmood, and Pahnila (2014)
hypothesised that rewards would increase actual security compliance. Both studies, however, found
rewards did not contribute to either how obligatory security compliance was perceived to be (Boss et
al., 2009) or actual compliance (Pahnila et al., 2007).
Reasons that may explain the inconsistent findings of the effectiveness of sanctions and lack of sup-
port for rewards in motivating security compliance are:
- Few organisations implement schemes of sanctions and rewards for security compliance, so the actual effectiveness cannot be measured. In addition, enforcing a sanction- and reward-based approach can have a negative impact on staff cooperation (Guo & Yuan, 2012).
- Boss et al. (2009) explained that, unlike other work tasks, there is little that an individual can do to exceed expected security compliance; hence organisations may not employ compliance rewards.
- Penalising or rewarding a user can be impractical for organisations due to time constraints and difficulty in the description of a concrete evidence trail (Guo & Yuan, 2012).
- Promoting compliance through sanctions could promote a culture of lies, deception, and avoidance of responsibility (Ramachandran, Rao, & Goles, 2008).
The impact of social influences on an individual’s behaviours and beliefs has been widely acknowledged (Cialdini & Goldstein, 2004). Social influences are often referred to as subjective norms
(Ajzen, 1991) and can take the form of introjected motivation (Gagné & Deci, 2005). For instance,
subjective norms refer to the users’ beliefs about the normative expectations and social pressure that
drive people’s intention to perform security behaviours, as posited in the TPB (Ajzen, 1991). Self-
motivation for security compliance is the ideal where people can be trusted to work within relevant
parameters without surveillance, thereby decreasing costs of security monitoring. In the absence of
self-motivation, extrinsic factors and other people (social influences and relatedness) can motivate
people to comply with security requirements. Members of a work environment, such as peers, col-
leagues, or supervisors, can exert social influence on an individual to perform security tasks
(Johnston & Warkentin, 2010). If an employee believes that other important members in the work-
place expect security compliance from him/her, then he/she is more likely to perform appropriate
security tasks (Bulgurcu et al., 2010). Subjective norms are sometimes also referred to as social influ-
ence (Johnston & Warkentin, 2010) or normative beliefs (Bulgurcu et al., 2010). The positive effect
of subjective norms on compliance intention has been reported in several studies (Bulgurcu et al.,
2010; Vance et al., 2012).
Based on the review of the behavioural compliance theories above, several research challenges are
now presented in Table 1.
Table 1. Summary of Behavioural Security Theories based on the TPB

TPB Perceived Behavioural Control
Factors:
1. Key factors: self-efficacy to perform security tasks, attitudes toward compliance, and security practice of other stakeholders.
2. Security compliance can be enhanced by developing self-efficacy, creating a positive attitude toward security tasks, and establishing an organisational safe security culture.
Challenges:
a. A main assumption is that compliance intention would lead to security behaviour.
b. May not accurately capture determinants of actual security behaviour.
c. Need to employ methods of recording true security behaviour that are less reliant on self-reported responses.

TPB Attitude: Protection Motivation Theory
Factors:
1. Key factors: fear of consequences of security threats, effectiveness of response measures.
2. Security compliance can be encouraged by communicating security risk severity and vulnerability, effectiveness of security measures, and training to enhance self-efficacy.
Challenges:
a. Focuses on the nature of security risks.
b. Accurate risk assessment is difficult due to its complexity and is subject to behavioural biases, which affect individuals’ ability to assess a risk objectively and accurately.
c. Short-term effectiveness in changing security behaviour.

TPB Attitude: General Deterrence Theory
Factors:
1. Key factors: fear of severity and likelihood of sanctions for non-compliance.
2. Security compliance can be achieved by employing strict security behaviour monitoring and implementation of disciplinary actions.
Challenges:
a. Focuses on external enforcement.
b. Costly security monitoring can have a negative impact on staff morale.

TPB Attitude: Rational Choice Theory
Factors:
1. Key factors: perceived extrinsic cost and benefits of performing security tasks.
2. Security compliance can be motivated by streamlining security processes, minimising impacts of security tasks on work productivity, and providing resources to facilitate users’ compliance.
Challenges:
a. Security compliance cost may be unavoidable. Lack of immediate benefits of compliance.
b. Lack of understanding of characteristics of security tasks and compliance cost.
c. Low compliance cost still does not guarantee better compliance.
While existing studies employing numerous behavioural theories provide a solid foundation for ex-
plaining employees’ security compliance decisions, a complete knowledge of the phenomenon re-
mains a challenge. Evidence of this incomplete knowledge can be seen in the percentage of explained variance of the compliance outcome variable in existing security compliance models, which varies between 25 and 70 per cent (Sommestad et al., 2014). A complete understanding of security
behaviour is problematic because it can be affected by many environmental and personal factors.
Furnell and Rajendran (2012) identified a mix of job characteristics, and organisational and non-work
factors that all play a role in affecting employees’ security compliance. To complicate the issue fur-
Pham, Brennan, & Richardson
ther, individuals’ personality attributes can also act as a filter of the environmental impact and affect
individual attitudes and behaviour toward security behaviour (Furnell Rajendran, 2012; Pfleeger &
Caputo, 2011).
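The 25 to 70 per cent figure refers to the coefficient of determination (R²) that compliance models report for the outcome variable. As a minimal sketch with synthetic numbers (the data below are invented for illustration), explained variance is computed as:

```python
# Minimal sketch of the explained-variance (R^2) statistic reported for
# compliance models. Data are synthetic and purely illustrative.

def r_squared(observed, predicted):
    """R^2 = 1 - (residual sum of squares / total sum of squares)."""
    mean = sum(observed) / len(observed)
    ss_res = sum((o - p) ** 2 for o, p in zip(observed, predicted))
    ss_tot = sum((o - mean) ** 2 for o in observed)
    return 1 - ss_res / ss_tot

# Observed compliance scores vs. a hypothetical model's predictions:
observed  = [4.0, 5.5, 3.0, 6.0, 4.5]
predicted = [4.2, 5.0, 3.5, 5.8, 4.6]
print(round(r_squared(observed, predicted), 2))  # prints 0.9
```

An R² of 0.25 would mean the model accounts for only a quarter of the variation in compliance behaviour, which is why the reported 25-70 per cent range signals substantial unexplained variance.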
Employees’ unsafe security behaviour has been considered the weakest link in overall security programs. Safe security practices and compliance with security guidelines are essential to minimise security risks caused by users. Current behavioural theories have contributed to a better understanding of how security behaviour can be improved, though that understanding is not yet complete. This paper has reviewed key factors influencing security compliance based on several behavioural theories, identified challenges in successfully applying those factors, and proposed directions for future research. The paper recommends a combined organisational and personal focus that encourages employees to become involved with security activities; nevertheless, the level of emotional and cognitive resources that people bring to performing security tasks may be the key to maintaining expected security behaviour, even in an unfavourable security environment (Crawford, LePine, & Rich, 2010).
Future research should investigate the complex interaction between organisational and personal characteristics so that security programs can be developed that effectively engage employees with security tasks, even in demanding work environments.
Ajzen, I. (1991). The theory of planned behavior. Organizational Behavior and Human Decision Processes, 50(2), 179-211. doi: 10.1016/0749-5978(91)90020-T
Ajzen, I. (2001). Nature and operation of attitudes. Annual Review of Psychology, 52(1), 27-58.
Ajzen, I. (2002). Perceived behavioral control, self-efficacy, locus of control, and the theory of planned behav-
ior. Journal of Applied Social Psychology, 32(4), 665-683.
Bandura, A. (1977). Self-efficacy: Toward a unifying theory of behavioral change. Psychological Review, 84(2), 191-215.
Bandura, A. (1997). Self-efficacy: The exercise of control. New York: Freeman.
Becker, G. S. (1968). Crime and punishment: An economic approach. Journal of Political Economy, 76(2), 169-217.
Boss, S. R., Kirsch, L. J., Angermeier, I., Shingler, R. A., & Boss, R. W. (2009). If someone is watching, I’ll do what I’m asked: Mandatoriness, control, and information security. European Journal of Information Systems, 18, 151-164.
Brennan, L., & Binney, W. (2010). Fear, guilt and shame appeals in social marketing. Journal of Business Research,
63(2), 140-146.
Bulgurcu, B., Cavusoglu, H., & Benbasat, I. (2010). Information security policy compliance: An empirical study
of rationality-based beliefs and information security awareness. MIS Quarterly, 34(3), 523-548.
Cialdini, R. B., & Goldstein, N. J. (2004). Social influence: Compliance and conformity. Annual Review of Psychology, 55, 591-621. doi: 10.1146/annurev.psych.55.090902.142015
Cox, J. (2012). Information systems user security: A structured model of the knowingdoing gap. Computers in
Human Behavior, 28, 1849-1858.
Crawford, E. R., LePine, J. A., & Rich, B. L. (2010). Linking job demands and resources to employee engagement and burnout: A theoretical extension and meta-analytic test. Journal of Applied Psychology, 95(5), 834-848.
Crossler, R. E., Johnston, A. C., Lowry, P. B., Hu, Q., Warkentin, M., & Baskerville, R. (2013). Future directions for behavioral information security research. Computers & Security, 32, 90-101. doi: 10.1016/j.cose.2012.09.010
Dhillon, G., & Backhouse, J. (2001). Current directions in IS security research: Towards socio-organizational perspectives. Information Systems Journal, 11(2), 127-153.
Dhillon, G., & Torkzadeh, G. (2006). Value-focused assessment of information system security in organizations. Information Systems Journal, 16(3), 293-314.
Furnell, S., & Rajendran, A. (2012). Understanding the influences on information security behaviour. Computer
Fraud & Security, 2012(3), 12-15. doi: 10.1016/s1361-3723(12)70053-2
Gagné, M., & Deci, E. L. (2005). Self-determination theory and work motivation. Journal of Organizational Behav-
ior, 26, 331-362.
Gibbs, J. P. (1975). Crime, punishment, and deterrence. New York, NY: Elsevier.
Guo, K. H., & Yuan, Y. (2012). The effects of multilevel sanctions on information security violations: A medi-
ating model. Information & Management, 49, 320-326.
Herath, T., & Rao, H. R. (2009a). Encouraging information security behaviors in organizations: Role of penal-
ties, pressures and perceived effectiveness. Decision Support Systems, 47, 154-165.
doi: 10.1016/j.dss.2009.02.005
Herath, T., & Rao, H. R. (2009b). Protection motivation and deterrence: A framework for security policy com-
pliance in organisations. European Journal of Information Systems, 18, 106-125. doi: 10.1057/ejis.2009.6
Hu, Q., Xu, Z. C., Dinev, T., & Ling, H. (2011). Does deterrence work in reducing information security policy
abuse by employees? Communications of the ACM, 54(6), 54-60.
Ifinedo, P. (2011). Understanding information systems security policy compliance: An integration of the theory of planned behavior and the protection motivation theory. Computers & Security, 31, 83-95. doi: 10.1016/j.cose.2011.10.007
Johnston, A. C., & Warkentin, M. (2010). Fear appeals and information security behaviors: An empirical study.
Management Information Systems Quarterly, 34(3), 549-566.
Maddux, J. E., & Volkmann, J. (2010). Self-efficacy. In Handbook of personality and self-regulation. Blackwell Publishing.
Ng, B. Y., Kankanhalli, A., & Xu, Y. C. (2009). Studying users’ computer security behavior: A health belief perspective. Decision Support Systems, 46, 815-825.
Pahnila, S., Siponen, M., & Mahmood, A. (2007). Employees’ Behavior towards IS security policy compliance. Paper pre-
sented at the 40th Hawaii International Conference on System Sciences.
Pfleeger, S. L., & Caputo, D. D. (2011). Leveraging behavioral science to mitigate cyber security risk. Computers & Security, 31, 597-611.
Ramachandran, S., Rao, S. V., & Goles, T. (2008). Information security cultures of four professions: A comparative study.
Paper presented at the 41st Hawaii International Conference on System Sciences.
Rhee, H. S., Kim, C., & Ryu, Y. U. (2009). Self-efficacy in information security: Its influence on end users’ information security practice behavior. Computers & Security, 28, 816-826.
Rippetoe, P. A., & Rogers, R. W. (1987). Effects of components of protection motivation theory on adaptive
and maladaptive coping with a health threat. Journal of Personality and Social Psychology, 52, 596-604.
Rogers, R. W. (1983). Cognitive and physiological processes in fear appeals and attitude change: A revised theory of protection motivation. In J. T. Cacioppo & R. E. Petty (Eds.), Social psychophysiology: A sourcebook. New York: Guilford Press.
Schneier, B. (2008). The psychology of security. Retrieved from
Siponen, M., Mahmood, M. A., & Pahnila, S. (2014). Employees’ adherence to information security policies: An exploratory field study. Information & Management, 51, 217-224.
Sommestad, T., Hallberg, J., Lundholm, K., & Bengtsson, J. (2014). Variables influencing information security policy compliance: A systematic review of quantitative studies. Information Management & Computer Security, 22(1), 42-75.
Stainback, R. D., & Rogers, R. W. (1983). Identifying effective components of alcohol abuse prevention pro-
grams: Effects of fear appeals, message style and source expertise. International Journal of Addictions, 18.
Stanley, M. A., & Maddux, J. E. (1986). Cognitive processes in health enhancement: Investigation of a com-
bined protection motivation and self-efficacy model. Basic and Applied Social Psychology, 7, 101-113.
Stanton, J. M., Stam, K. R., Mastrangelo, P., & Jolton, J. (2005). Analysis of end user security behaviors. Computers & Security, 24, 124-133.
Sutton, S. R. (1982). Fear-arousing communications: A critical examination of theory and research. In J. R. Eiser
(Ed.), Social psychology and behavioural medicine (pp. 303-337). London: Wiley.
Tanner, J. F. J., Hunt, J. B., & Eppright, D. R. (1991). The Protection Motivation Model: A normative model of
fear appeals. Journal of Marketing, 55(July), 36-45.
Vance, A., & Siponen, M. (2012). IS Security policy violations: A rational choice perspective. Journal of Organiza-
tional and End User Computing, 24(1), 21-41. doi: 10.4018/joeuc.2012010102
Vance, A., Siponen, M., & Pahnila, S. (2012). Motivating IS security compliance: Insights from habit and protection motivation theory. Information & Management, 49, 190-198.
von Solms, R., & von Solms, B. (2004). From policies to culture. Computer & Security, 23, 275-279.
Warkentin, M., Straub, D., & Malimage, K. (2012). Measuring secure behavior: A research commentary. Paper presented at the Annual Symposium on Information Assurance & Secure Knowledge Management, Albany, NY.
Warkentin, M., & Willison, R. (2009). Behavioral and policy issues in information systems security: The insider
threat. European Journal of Information Systems, 18, 101-105. doi: 10.1057/ejis.2009.12
Westaby, J. D. (2005). Behavioral reasoning theory: Identifying new linkages underlying intentions and behavior.
Organizational Behavior and Human Decision Processes, 98(2), 97-120.
Woon, I., Tan, G., & Low, R. (2005). A protection motivation theory approach to home wireless security. Paper presented
at the International Conference on Information Systems (ICIS).
Dr Hiep Pham is a Senior Lecturer in the School of Business Information Technology and Logistics at RMIT University Vietnam. He holds a PhD and EMBA from RMIT University and a Master of Commerce in Advanced Information Systems and Management from the University of New South Wales, Australia. Pham’s research focuses on cyber
security behaviour and management, information management in Logis-
tics & SCM and educational technologies.
Linda Brennan was the Inaugural Professor of Advertising at RMIT University. In the lead-up to becoming a full-time academic, Professor Brennan had an active consulting practice in marketing and strategic research, working with a variety of markets, projects and industries. She has expertise in both qualitative and quantitative research methods and has taught research strategies, research methods and 'market' research for
over 10 years. She has published articles and book chapters on research
methodologies (qual and quant) and has a solid understanding of para-
digms, research philosophies, multi-disciplinary and interdisciplinary research approaches. She is the Editor of Communication, Politics & Culture and Associate Editor for
the Journal of Marketing for Higher Education as well as on the review board of several ranked
journals. She has served on ethics committees in four universities and is a Research Integrity Advisor
at RMIT University. She has supervised, examined and advised many PhD and Masters by Research candidates over the last 15 years.
Joan Richardson is an Associate Professor in the Department of Busi-
ness Information Technology and Logistics at RMIT University, Austral-
ia. She won an ALTC citation (2011) that recognized her particular con-
tribution to improving student satisfaction and student engagement
through the use of emerging technologies in Digital Literacy curriculum.
In addition, she has worked extensively with Pearson Education Australia
as the principal author for texts, e-texts and multi-media resource libraries
since 2000. Innovations include the use of social networking features to
enable peer engagement, SMS to disseminate assessment reminders and
performance feedback, websites, multi-choice tests and communications sent from the learning man-
agement systems to personal mobile devices. Her substantial record of Information Systems (IS) re-
search also includes six PhD completions and more than 75 peer reviewed book chapters, journals
and conference publications. She presents her research publications and professional achievements,
such as accreditation documentation addressing the Skills for the Information Age (SFIA) framework
at national and international conferences.
Over the years, various observers have commented upon the imbalance in organisations' approaches to security. In many cases, significantly more attention appears to be devoted to technical aspects rather than those relating to people and processes. However, if survey findings are to be believed, there are at least signs of practices having improved in more recent times. Despite being exposed to the same policies and related training, employees within an organisation can exhibit very different security behaviours. Professor Steven Furnell and Anish Rajendran of Plymouth University propose a model that more fully identifies the factors influencing security behaviour and compliance. It considers forces that originate within the workplace, alongside various workplace-independent factors that might also affect security behaviour. The role of personality is considered within this model, as is the potential for inconsistencies to arise due to situational factors. Organisations might then use these insights to differentiate security-compliant from non-compliant employees and produce effective mitigation plans.