Security but not for security’s sake: The impact of social
considerations on app developers’ choices
Irum Rauf
The Open University
irum.rauf@open.ac.uk
Dirk van der Linden
University of Bristol
dirk.vanderlinden@bristol.ac.uk
Mark Levine
Lancaster University
m.levine@lancaster.ac.uk
John Towse
Lancaster University
j.towse@lancaster.ac.uk
Bashar Nuseibeh
The Open University; Lero, University
of Limerick
bashar.nuseibeh@open.ac.uk
Awais Rashid
University of Bristol
awais.rashid@bristol.ac.uk
ABSTRACT
We explore a dataset of app developer reasoning to better understand the reasons that may inadvertently promote or demote app developers' prioritization of security. We identify a number of reasons: caring vs. fear of users, the impact of norms, and notions of 'otherness' and 'self' in terms of belonging to groups. Based on our preliminary findings, we propose an interdisciplinary research agenda to explore the impact of social identity (a psychological theory) on developers' security rationales, and how this could be leveraged to guide developers towards making more secure choices.
CCS CONCEPTS
• Security and privacy → Social aspects of security and privacy.
ACM Reference Format:
Irum Rauf, Dirk van der Linden, Mark Levine, John Towse, Bashar Nuseibeh,
and Awais Rashid. 2020. Security but not for security’s sake: The impact
of social considerations on app developers’ choices. In IEEE/ACM 42nd
International Conference on Software Engineering Workshops (ICSEW’20),
May 23–29, 2020, Seoul, Republic of Korea. ACM, New York, NY, USA, 4 pages.
https://doi.org/10.1145/3387940.3392230
1 INTRODUCTION
Developing secure apps is a complex socio-technical activity [6]. From the urgency of getting apps out there, whether to be the first to capitalize on an innovative idea or simply to meet contractual deadlines, the complexities of monetization, and a not-entirely uncommon lack of structure in development processes, there are many things that app developers may be thinking about rather than security [7]. Previous research has shown that across different kinds of app development tasks, developers make secure decisions, but seemingly without having considered security explicitly in their reasoning [13]. This runs counter to the prediction from a social debt framework [10], whereby the accrued consequences of decisions involving the developer and development community eventually impact the software product. We perform an exploratory qualitative
analysis of a dataset [14] covering a diverse range of app developers, focusing in particular on developers who claim to prioritize security, yet do not articulate security issues in explaining these choices. We do so to better understand why they make the decisions they do, and what aspects come into play.
2 DATA & METHOD
We analyzed a recent qualitative data set on 44 mobile software developers' rationales across different software development activities [14]. It provides app developers' prioritization choices (via card-sort or selection tasks) of different options impacting software security for six tasks, and the rationales for the choices they make. Included are (1) setting up an IDE by choosing functionality; (2) fixing source code by deciding which flaws to fix first; (3) deciding where to seek help using an API; (4) deciding whom to involve as testers; (5) deciding what to consider when selecting an advertisement SDK; and (6) deciding what clauses to favor in a software license agreement. The participants were not primed for security and hence were not aware that different options may differently impact software security.
Approach. The rationale analysis process was iterative. We study the subset (N=40) of app developers whose choices indicated a prioritization of security, including both rationales that showed clear security consideration and those that did not indicate any clear reflection about security. Moreover, we consider the different non-functional requirements (NFRs) they prioritized in the data set; only ten of the 40 prioritized security as an NFR. One researcher read the developers' rationales descriptively and coded them. The codes were classified into a set of themes which captured the different identified aspects potentially affecting developers' reasoning. We identified eight new themes in the data.
Limitations. The coding scheme was discussed with another author in order to find disagreements and settle on a final set of themes. We later agreed to exclude one theme, 'security as a value', because of the lack of a common or convergent interpretation of the related rationales. This analysis has certain limitations, most importantly that it is a qualitative analysis of human reasoning. It does not purport to present generalized claims, but serves as an identification of key concepts that are worthy of further in-depth study. Our dataset is available at https://doi.org/10.17605/OSF.IO/3WHD5.
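To make the subset selection above concrete, the following is a minimal sketch in Python of how such a filtering step could look. It assumes a hypothetical flat export of the dataset (a file rationales.csv with made-up columns participant, prioritized_security, prioritized_nfr, task, and rationale); these names are illustrative assumptions only and do not reflect the actual structure of the published appendices.

# Minimal sketch of the subset selection described in Sec. 2.
# Assumes a hypothetical flat export "rationales.csv" with made-up columns:
#   participant, prioritized_security (bool), prioritized_nfr (str), task, rationale
import pandas as pd

df = pd.read_csv("rationales.csv")

# Developers whose task choices indicated a prioritization of security.
secure_choosers = df[df["prioritized_security"]]
print("Developers in subset:", secure_choosers["participant"].nunique())  # expected: 40

# Of those, how many also named security as their most important NFR?
one_row_per_dev = secure_choosers.drop_duplicates("participant")
named_security = (one_row_per_dev["prioritized_nfr"] == "security").sum()
print("Prioritized security as an NFR:", named_security)  # expected: 10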
3 FINDINGS – FACTORS THAT MAY AFFECT SECURITY REASONING
Two key aspects underpin distinctions identified in the eight newly identified themes of developers' rationales:
(1) whether they had a priori prioritized security as an important non-functional requirement in an earlier task of the dataset;
(2) whether their reasoning addressed social aspects, comparing and contrasting users and other developers to themselves.
Across these aspects, both of which present a different lens through which to analyze the rationales, we identified eight recurring themes in developers' rationales which explain how their reasoning may affect their security rationale, summarized in Fig. 1.
Figure 1: The considerations we identified that a developer may have towards themselves, users, and other developers. [Figure shows 'the developer' in relation to 'the user' (care for, fear of) and 'other developers' (trusts in, reluctance to engage), with norms and values ("this is how it's done", "what about my reputation?", "I have to protect myself first") affecting both relationships.]
In what follows, in Sec. 3.1–3.2, we explore how the eight identified themes of developers' reasoning across these two aspects may give insight into their (lack of) reasoning about security.
3.1 Prioritizing security from the start (or not)
We noted some differences between those developers who had a priori prioritized security (10) and those who did not (30). These thematic differences are described below.
Th.1: Caring for users vs. Fearing them: Several developers reasoned about their users in order to decide what decisions to make. A difference that arose was that the developers who prioritized security seemed more driven by caring for their users. For example, one participant argued they “wanted to prioritize things that could help the user” (P5), another that “I would not want my users to experience any negative side effects from the ad library” (P4), to even placing themselves in the ‘shoes’ of the user: “I prioritized based on how ‘extortionist’ or ‘evil’ the clauses would feel to me as a consumer” (P4). Yet, developers who did not prioritize security seemed more driven by negative feelings of losing their users or annoying them: “Too many ads distract may annoy users” (P11), or simply “I don’t want my ads to turn away my users” (P16). Some reasoned about the impact decisions could have on their reputation: “/.../ relying on your users will find you all the bugs, but might lose you your reputation” (P20). These developers appeared to make choices driven by a desire to avoid getting into trouble: “At first [I will] take care for myself and make sure my team is safe /.../” (P21), and “I can never be sure there aren’t bugs in my code so I put anything that protects me in that case at the top” (P17). Feelings of care or fear of negative user response may thus be an important aspect in what drives developers to consider and even act securely, e.g., by considering secure behavior to be more socially responsible.
Th.2: Security as a norm: Literature has shown that developers often adopt security practices through peer influence [15], and subsequently consider them as norms or unwritten standards. Developers who did not a priori prioritize security considered their secure choices to simply be the norm. For example, participants noted that security solutions “look[ed] ok like this” (P29), were a common practice (“It’s common practice, you start from the manual...” (P3)), or were simply “common sense” (P28). This may indicate that developers are not motivated to make secure choices because they are security conscious per se, but rather because certain patterns or ways of doing things have become ingrained, which aligns with efforts by the security community to encourage security practices in developers by making them commonplace.
Th.3: Being on your own or in a team: Being part of a team may provide more structure, which in turn leads to more secure practices and reasoning, simply because there are systems in place such as code review. This, of course, does not necessarily push developers to consider security more explicitly, as they might simply consider it someone else’s responsibility [5], but it demonstrates that our participants at least orientate to the potential scrutiny of others. Looking at the rationales behind some of the insecure choices by developers who prioritized security a priori, we see that developers may shift from secure choices to insecure choices if they do not consider themselves part of a team any more: “If it was a side project, chances are I would be publishing it as-is, to be as useful as possible immediately, offering it without any guarantees” (P4) and “Seems like a right approach for new personal project” (P9). So, having a formal team structure, which means that someone else will see their work, clearly does impact the way participants make decisions. However, developers do not need to be part of a team to think about themselves as members of a group. Participants sometimes thought of themselves as members of broader social groups (e.g., being a developer, or an academic developer), even when they would be working on their own. Thinking about oneself as a member of such a group enhances one’s trust in others, as some developers noted: “[I would] try out the API and discuss with another peer with experience /.../ or ask the developer community” (P9), or simply “I would ask other developers I know” (P27). This indicates that simply imagining themselves as a group member might raise similar concerns about how others in the group might evaluate their work.
Th.4: Trusting other developers: As alluded to above, trust in other developers is an important aspect of how developers reason. Some literature has shown that developers are more likely to trust other developers’ opinions if they are of a similar socio-economic or educational status [15]. We saw that developers, whether they a priori prioritized security or not, often relied on the opinions of other developers, with conscious awareness of whom to trust and why. However, they mostly reached out only after first trying themselves. This motivation to work independently seems influenced by the need to work efficiently, as developers consider asking others to be more time consuming than searching for information themselves, e.g.: “I will ask people I know because it’s easier to communicate efficiently with someone you already know” (P22). Developers may thus trust other developers working on similar things: “/.../ I have fellow devs who have used it” (P16), or who are seen as an authority on the subject: “I will first consult experts /.../” (P31).
3.2 Thinking about ‘others’ and ‘the self’
We noted further differences in developers’ reasoning when they were explicitly thinking about and classifying themselves or others.
Th.5: Thinking about ‘others’: Developers showed they considered different groups, most commonly reasoning about ‘the user’. As noted before, this could be framed with both positive and negative valence. For example, for some participants decision making was motivated by an intention to maximize positive user experience: “I wanted to prioritize things that could help the user” (P5). Some developers considered their users as customers rather than users, a subtle but important difference which reflects the tension between seeing others as a beneficiary or as a source of revenue. For example, one participant noted that “Firstly, I should take care of my business. Secondly, make customers happy” (P24).
When it came to the development and testing process itself, some participants invoked a number of relevant others (including ‘users’). These included friends, colleagues, and family, and sometimes even fellow developers. For example, one participant noted of their tester-seeking strategies: “I would ask a few colleagues and friends to test it first /.../ If I was able to get a user group to test it (can be difficult to arrange) then I would, but this is rare” (P6), while others similarly reasoned about the impact of using those close to themselves: “I’d want it tested well, so a random user group is a good start. Another developer might be a better tester, but that’s only one person. Friends and family are terrible testers, because they don’t want to hurt your feelings” (P20).
Developers thus seem to reference a variety of relevant ‘others’ when discussing security related decisions. However, participants were potentially primed to think of ‘others’ in the choices they were given for the testing task [14], which presented them with a variety of possible sources of testers. The way in which these groups are imagined (for example, whether a user is seen as a user, or more specifically as a customer; or whether another developer is seen as a friend or a colleague) will impact the way in which information is evaluated and priorities are set. Thinking about the relevance of any one of these groups at a particular time is likely to affect the decision making process.
Th.6: Thinking about ‘the self’: Another important aspect of how developers reasoned about their decisions is how they framed themselves. Sometimes it is clear they are thinking about themselves as individuals rather than as members of a social group. This focus on themselves as individuals can be defensive. In other words, they can be motivated to act to protect their own self-interest: “I can never be sure there aren’t bugs in my code so I put [any license clause] that protects me in that case at the top” (P16). Sometimes this focus on themselves as individuals can have a positive motivation. It can reflect a desire for autonomy, and respect for professional relationships with others: “First I’ll use resources I can use without the help from other people (their time is important as well)” (P22).
In yet another case, developers may switch from talking about themselves in the first person to identifying themselves as a member of a social category. They move from a first person pronoun to a collective noun, or rather they qualify their self-description by reference to the social group: “As a developer, I sleep better at night if I have no knowledge of my user’s passwords” (P5). By prefacing an account with a claim to group membership (‘as a developer’) the participant is making a claim to motivation beyond individual self-interest. Their decision making is shaped by the norms and values of a social identity. Engaging in practices which violate these norms would lead to psychological or perhaps moral discomfort.
The same can be said for another developer, who made a slightly more refined claim to group membership, as a specific kind of group member: “As a developer at a university that develops apps for its students to use, this commercial approach is new to me. I guess I would try to be as responsible as I could be with advertising, though” (P6). Here again, the participant is describing the origin of the relevant norms and values (‘to be as responsible as I could be’) and how they are a consequence of belonging to a group, “developer at a university”, with an obligation to a specific other (university students).
Finally, some developers considered themselves as potential members of a user group. In other words, they can imagine themselves as developers and users at one and the same time. This is an important observation as it shows how participants can actively try to switch footing from one identity to another in order to evaluate the impact of decisions they might make: “I prioritized based on how ‘extortionist’ or ‘evil’ the clauses would feel to me as a consumer” (P4). This participant is emphatically imagining themselves as a consumer rather than a user. Thus, participants seem to think about themselves at different levels of inclusivity: sometimes as sovereign individuals and sometimes as members of social groups. The social groups themselves can be drawn in different ways (as a developer, as a developer in a university, and even as a member of an imagined community of consumers). These different ways of thinking about the self can impact security relevant decision-making in different ways at different times.
Th.7: Relying on ‘others’: When seeking help on a confusing API, a variety of social considerations became apparent, rather than straightforward technical or functional considerations. Some participants revealed their desire to work independently: “I like to try and work things out fully first /.../ Then ask a local expert” (P6), while others noted efficiency: “Searching for info in the web is in 99% of cases faster than asking another person” (P11). Trust, efficiency and ease of understanding are key aspects here in understanding why developers reason like this over anything else. Since developers come from all types of backgrounds and development environments, their trust in different resources may variously lead to (in)secure behavior. When developers were asked how they seek testers for their app, they seemed influenced by the trust they have in social connections. For example, one participant noted that “user groups will be the most honest, friends will be the most dedicated” (P16), and others noted specifically the concept of friendliness: “A friendly tester(s) loosely familiar with the concepts of the product will be likely to explore most of the options within the product” (P33). This may indicate that developers prioritize social interaction by how comfortable they are engaging with, and placing trust in, others.
Th.8: Licensing and ‘self’-defensiveness: When developers were asked to consider what clauses to include in a software license agreement, they showed little security consideration in their reasoning. Rather, it seemed developers focused on avoiding getting into trouble, such as one participant noting that “liability is the only one i care about” (P20). Indeed, as another participant noted, “getting sued is worse than having someone copy your work as far as I’m concerned” (P27). This hints at developers thinking of how others might affect them, and being pushed to be defensive, protecting their own interests and ability to ‘be’ an active developer, rather than considering what effect the license might have on the user.
4 TOWARDS A RESEARCH AGENDA
The aspects of developers’ reasoning that we have discussed above all share a key thing: social considerations. We propose that research should investigate, in depth, to what extent social considerations may affect both the reasoning about security and eventual behavior towards security. As Fig. 1’s summary of the relationships and considerations identified in our analysis shows, developers have conflicting relationships with both users and other developers, which manifest in both positive (e.g., caring for, trusting in) and negative (e.g., fearing, being reluctant to engage) ways. Moreover, a particularly important relationship is that of the developer with themselves: the self-reflection, mediated by the norms and values their environment has instilled in them, driving them to reason in a particular way. This indicates that, depending on the environment a developer is situated in, there may be many different ‘kinds’ of typical developers, shaped by their interactions with users, other developers, and norms and values imposed on them by their wider environment. In order to understand how these different kinds of developers can eventually be stimulated to explicitly reason and act upon security as a key priority, acknowledging and dealing with that diversity is key. The way developers compare and contrast themselves with their users and with other developers, and the way those people affect how developers reason, implies that developers’ social identity may be a vital construct for better understanding their reasoning.
The Social Identity Approach (SIA) [3, 4] is the product of four decades of work on the social psychology of the self and its relationship to groups and group processes. Pioneering work by Tajfel [8, 9] and Turner [11, 12] demonstrated that our sense of self is not fixed, but changes as a function of changes in our social context. People can define (and redefine) themselves along a continuum from a more idiosyncratic ‘personal identity’ to a more collective ‘social identity’. At the same time, because multiple group memberships are available to us, our social identities can also dynamically update and adapt in their focus. As social contexts change, or as we begin to think about ourselves in relation to different individuals or groups, our sense of who we are also changes. This is important because our identities (personal and social) shape how we make sense of, and act in, the world. As different aspects of identity become more or less important to us, so do the values that we prioritize, and the degree to which we can influence (and be influenced by) others also changes [1]. The Social Identity Approach thus offers a psychological model that has the potential to explain variation in the way our participants orientate to decisions that have security implications.
We have seen, in our data, examples of participants talking about themselves in both personal and social identity terms. We can also see them talking about different groups to which they can belong, and different groups with which they can interact. While we can see variation in the way they talk about and prioritize decisions that have security implications, we cannot yet make causal inferences between identities and security related decision-making; establishing this is an important next step for future empirical work.
A first exploration into social identity in software development [2] noted that aspects of social identity affect software developers’ behavior, and that this holds several implications for the effectiveness of software development that takes place in teams. However, we also need to understand the wider demographic of app developers working solo, in not well defined teams, and without established organizational support, where the ‘others’ are likely even further removed. Thus, based on our analysis here, we propose several key research questions which need to be answered to better understand the effect of a wider range of developers’ interaction with, and consideration of, others, on the security of the apps they develop:
- Which, if any, social considerations mediate decisions made in software development?
- How do software developers identify with their users?
- How do software developers identify with ‘other’ developers?
- What are the effects of software developers identifying as part of a specific group on their attitude and behavior?
Conclusion. We identified a number of, primarily social, considerations that software developers exhibit in their reasoning about common software development activities, all of which may come in place of explicit security considerations. To understand why this happens, and how we may move these developers towards security, interdisciplinary research is needed. Social identity in particular provides an interesting lens into the why and how of improving the security choices of developers, and hence the security of the resultant apps on which users increasingly rely in their daily lives.
Acknowledgments
This work is partially supported by EPSRC grant EP/P011799/1, Why Johnny doesn’t write secure software? Secure software development by the masses, and SFI grant 13/RC/2094.
REFERENCES
[1] Dominic Abrams, Margaret Wetherell, Sandra Cochrane, Michael A Hogg, and John C Turner. 1990. Knowing what to think by knowing who you are: Self-categorization and the nature of norm formation, conformity and group polarization. British Journal of Social Psychology 29, 2 (1990), 97–119.
[2] Andreas Bäckevik, Erik Tholén, and Lucas Gren. 2019. Social identity in software development. In 2019 IEEE/ACM 12th International Workshop on Cooperative and Human Aspects of Software Engineering (CHASE). IEEE, 107–114.
[3] Rupert Brown. 2020. The social identity approach: Appraising the Tajfellian legacy. British Journal of Social Psychology (2020).
[4] S Alexander Haslam. 2001. Psychology in organizations. London: Sage.
[5] Kai-Uwe Loser and Martin Degeling. 2014. Security and privacy as hygiene factors of developer behavior in small and agile teams. In IFIP International Conference on Human Choice and Computers. Springer, 255–265.
[6] Todd Sedano, Paul Ralph, and Cécile Péraire. 2017. Software development waste. In 2017 IEEE/ACM 39th International Conference on Software Engineering (ICSE). IEEE, 130–140.
[7] Sofia Sherman and Irit Hadar. 2015. Toward defining the role of the software architect. In 2015 IEEE/ACM 8th International Workshop on Cooperative and Human Aspects of Software Engineering. IEEE, 71–76.
[8] Henri Tajfel. 1981. Human groups and social categories: Studies in social psychology. Cambridge: Cambridge University Press.
[9] Henri Tajfel (Ed.). 1978. Differentiation between social groups: Studies in the social psychology of intergroup relations. London: Academic Press.
[10] Damian A Tamburri, Philippe Kruchten, Patricia Lago, and Hans van Vliet. 2013. What is social debt in software engineering?. In 2013 6th International Workshop on Cooperative and Human Aspects of Software Engineering (CHASE). IEEE, 93–96.
[11] John C Turner, Michael A Hogg, Penelope J Oakes, Stephen D Reicher, and Margaret S Wetherell. 1987. Rediscovering the social group: A self-categorization theory. Oxford & New York: Basil Blackwell.
[12] John C Turner, Penelope J Oakes, S Alexander Haslam, and Craig McGarty. 1994. Self and collective: Cognition and social context. Personality and Social Psychology Bulletin 20, 5 (1994), 454–463.
[13] Dirk van der Linden, Pauline Anthonysamy, Bashar Nuseibeh, Thein T. Tun, Marian Petre, Mark Levine, John Towse, and Awais Rashid. 2020. Schrödinger's Security: Opening the Box on App Developers' Security Rationale. In Proceedings of the 42nd International Conference on Software Engineering (ICSE).
[14] Dirk van der Linden et al. 2020. Schrödinger's Security (ICSE 2020) Appendices. http://hdl.handle.net/1983/f43803de-4ade-488f-be1a-a2e8ba30c201. Online; accessed 8 January 2020.
[15] Shundan Xiao, Jim Witschey, and Emerson Murphy-Hill. 2014. Social influences on secure development tool adoption: why security tools spread. In Proceedings of the 17th ACM Conference on Computer Supported Cooperative Work & Social Computing. ACM, 1095–1106.