Viewpoint
The Challenges of Privacy by Design

Heralded by regulators, Privacy by Design holds the promise to solve the digital world's privacy problems. But there are immense challenges, including management commitment and step-by-step methods to integrate privacy into systems.

Sarah Spiekermann

DOI: 10.1145/2209249.2209263
Privacy maintenance and control is a social value deeply embedded in our societies. A global survey found that 88% of people are worried about who has access to their data; over 80% expect governments to regulate privacy and impose penalties on companies that do not use data responsibly. But privacy regulation is not easy. The Internet's current economics as well as national security management benefit from the collection and use of rich user profiles. Technology constantly changes. And data is like water: it flows and ripples in ways that are difficult to predict. As a result, even a well-conceived, general, and sustainable privacy regulation, such as the European Data Protection Directive 95/46/EC, struggles to ensure its effectiveness. Companies regularly test legal boundaries, and many risk sanctions for privacy breaches to avoid constraining their business.
Against this background, the European Commission and other regulatory bodies are looking for a more effective, system- and context-specific balance between citizens' privacy rights and the data needs of companies and governments. The apparent solution proposed by regulators now, but barely specified, is Privacy by Design (PbD). At first sight, the powerful term seems to suggest we simply need to take a few Privacy-Enhancing Technologies (PETs) and add a good dose of security, thereby creating a fault-proof systems landscape for the future. But the reality is much more challenging. According to Ann Cavoukian, the Ontario information and privacy commissioner who first coined the term, PbD stands for a proactive integration of technical privacy principles in a system's design (such as privacy default settings or end-to-end security of personal data) and the recognition of privacy in a company's risk management processes [1]. PbD can thus be defined as "an engineering and strategic management approach that commits to selectively and sustainably minimize information systems' privacy risks through technical and governance controls."
However, a core challenge for PbD is to get organizations' management involved in the privacy strategy. Management's active involvement in the corporate privacy strategy is key because personal data is the asset at the heart of many companies' business models today. High privacy standards can restrict the collection and use of data for further analysis, limit strategic options, and impact a firm's bottom line. Consider advertising revenues boosted by behavioral targeting practices and people's presence on social networking sites: without personal data, such services are unthinkable. PbD proponents hardly embrace these economic facts in their reasoning. In contrast, they take a threat perspective, arguing that low privacy standards can provoke media backlash and lead to costly legal trials around privacy breaches. And indeed, distrust caused by privacy breaches is probably the only real blemish on the image of technology companies such as Google or Facebook. Brands are a precious company asset, the most difficult to build and the most costly to maintain. Hence, brand managers should be keen to avoid privacy risks. Equally, recent data breach scandals have forced CEOs to quit.

Despite these developments, many managers still do not understand that a sustainable strategy for one of their company's core assets—personal data—requires them to actively manage this asset. Managing personal data means optimizing its strategic use, quality, and long-term availability. Unfortunately, few of today's managers want to take on this new challenge. Instead, they derive what they can from the information bits they get and leave the privacy issue as a nuisance that is better left to be fixed by their lawyers.
But even if managers took up the privacy challenge and incorporated the active governance of personal data into their companies' strategic asset management, they would not be able to determine the right strategy without their IT departments: PbD requires the guts and ingenuity of engineers. As the term implies, the design of systems needs to be altered or focused to technically embrace the protection of people's data. Consequently, privacy must be on engineers' requirements radar from the start of a new IT project. It needs to enter the system development life cycle at such an early point that architectural decisions around data processing, transfer, and storage can still be made. Managers and engineers (as well as other potential stakeholders) need to assess the privacy risks they are willing to take and jointly decide on technical and governance controls for those risks they are not willing to bear.
Privacy by Design Challenges
Even when both managers and engineers are committed to PbD, more challenges must be overcome:
• Privacy is a fuzzy concept and is thus difficult to protect. We need to come to terms on what it is we want to protect. Moreover, conceptually and methodologically, privacy is often confounded with security. We need to start distinguishing security from privacy to know what to address with what means.
• No agreed-upon methodology supports the systematic engineering of privacy into systems. System development life cycles rarely leave room for privacy considerations.
• Little knowledge exists about the tangible and intangible benefits and risks associated with companies' privacy practices.
How can these challenges be overcome? A Privacy Impact Assessment (PIA) Framework recently created for RFID technology (see http://ec.europa.eu/information_society/policy/rfid/pia/index_en.htm) has been called a "landmark for PbD" because it offers some answers: the PIA Framework suggests concrete privacy goals and describes a method to reach them. Pragmatically, it recommends that organizations use
the specific legislative privacy principles of their region or sector, or the OECD privacy guidelines, as a starting point to determine privacy protection goals. In Europe, for example, the European Data Protection Directive 95/46/EC or its successor is the natural starting point. It includes the following privacy goals:
• Safeguarding personal data quality through data avoidance, purpose-specific processing, and transparency vis-à-vis data subjects.
• Ensuring the legitimacy of personal and sensitive data processing.
• Complying with data subjects' right to be informed, to object to the processing of their data, and to access, correct, and erase personal data.
• Ensuring confidentiality and security of personal data.
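Such legal goals only become engineering requirements once they are written down in a form a project can track. A minimal sketch of such a mapping, assuming we simply restate the Directive's goals as a requirements checklist (the class and goal names below are my own, purely illustrative):

```python
from dataclasses import dataclass, field

@dataclass
class PrivacyGoal:
    """One protection goal, restated as a checkable engineering requirement."""
    name: str
    requirement: str
    satisfied_by: list = field(default_factory=list)  # controls that address it

# Hypothetical restatement of the Directive's four goals as requirements.
goals = [
    PrivacyGoal("data_quality",
                "Collect only what the stated purpose needs; process it only for "
                "that purpose; stay transparent vis-a-vis data subjects."),
    PrivacyGoal("legitimacy",
                "Process personal and sensitive data only on a valid legal basis."),
    PrivacyGoal("subject_rights",
                "Support the rights to be informed, to object, and to access, "
                "correct, and erase personal data."),
    PrivacyGoal("confidentiality",
                "Keep personal data confidential and secure."),
]

for g in goals:
    status = "covered" if g.satisfied_by else "OPEN"
    print(f"[{status}] {g.name}: {g.requirement}")
```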
Security and privacy in this view are clearly distinguished. Security means the confidentiality, integrity, and availability of personal data are ensured. From a data protection perspective, security is one of several means to ensure privacy. A good PbD is unthinkable without a good Security by Design plan; the two approaches are in a "positive sum" relationship. That said, privacy is about the scarcity of personal data creation and the maximization of individuals' control over their personal data. As a result, some worry that PbD could undermine law enforcement techniques that use criminals' data traces to find and convict them. More research and international agreement in areas such as anonymity revocation are certainly needed to demonstrate this need not be the case even if we have privacy-friendly systems.
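The distinction can be made concrete in code: security controls protect whatever personal data exists, while privacy controls limit what data comes into existence and links back to a person. A rough sketch, assuming a salted-hash pseudonymization scheme and invented field names (the "encryption" here is a placeholder; a real system would use a vetted cryptographic library):

```python
import hashlib

SALT = b"per-deployment-secret"  # assumption: a deployment-specific salt

def pseudonymize(identifier: str) -> str:
    """Privacy control: replace a direct identifier with a stable pseudonym."""
    return hashlib.sha256(SALT + identifier.encode()).hexdigest()[:16]

def minimize(record: dict, allowed: set) -> dict:
    """Privacy control: data scarcity -- drop every field the purpose does not need."""
    return {k: v for k, v in record.items() if k in allowed}

def encrypt(data: bytes, key: bytes) -> bytes:
    """Security control: confidentiality of the data that does exist.
    Placeholder XOR cipher for illustration only -- not real cryptography."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

record = {"email": "alice@example.com", "birth_date": "1990-01-01", "purchase": "book"}
stored = minimize(record, allowed={"purchase"})          # privacy: keep one field
stored["user"] = pseudonymize(record["email"])           # privacy: no raw identifier
ciphertext = encrypt(repr(stored).encode(), key=b"k3y")  # security: protect at rest
print(stored, len(ciphertext))
```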
After privacy goals are clearly defined, we must identify how to reach them. The PIA Framework mentioned earlier is built on the assumption that a PbD methodology could largely resemble security risk assessment processes such as those of NIST or ISO/IEC 27005. These risk assessment processes identify potential threats to each protection goal. These threats and their probabilities constitute a respective privacy risk. All threats are then systematically mitigated by technical or governance controls. Where this cannot be done, remaining risks are documented to be addressed later.
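A compact sketch of what such a threat-by-goal assessment loop might look like; the threats, likelihood estimates, and controls below are invented for illustration and are not taken from the PIA Framework's actual threat catalogue:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Threat:
    description: str
    goal: str                      # protection goal it endangers
    likelihood: float              # estimated probability, 0..1
    impact: int                    # severity, 1 (low) .. 3 (high)
    control: Optional[str] = None  # mitigating technical/governance control

    @property
    def risk(self) -> float:
        return self.likelihood * self.impact

register = [
    Threat("location data reused for ad targeting", "purpose binding", 0.6, 3,
           control="erase raw locations after 24h; aggregate before analysis"),
    Threat("insider browses customer records", "confidentiality", 0.3, 3,
           control="role-based access control plus audit logging"),
    Threat("re-identification of 'anonymized' logs", "data quality", 0.2, 2),
]

# Threats without a control remain as documented residual risks.
residual = [t for t in register if t.control is None]
for t in sorted(residual, key=lambda t: t.risk, reverse=True):
    print(f"RESIDUAL risk={t.risk:.1f}: {t.description} (goal: {t.goal})")
```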
As in security engineering, PbD controls heavily rely on systems' architectures [2]. Privacy scholars still put too much focus on information practices only (such as Web site privacy policies). Instead, they should further investigate how to build systems in client-centric ways that maximize user control and minimize network or service provider involvement. Where such privacy-friendly architectures are not feasible (often for business reasons), designers can support PbD by using technically enforceable default policies ("opt-out" settings) or data scarcity policies (erasure or granularity policies), data portability, and user access and delete rights. Where such technical defaults are not feasible, concise, accurate, and easy-to-understand notices of data-handling practices and contact points for user control and redress should come into play.
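What "technically enforceable default policies" could mean in practice is easiest to show as configuration. A sketch under the assumption that the most protective value wins unless the user explicitly relaxes it (the schema and values are illustrative assumptions, not a standard):

```python
from dataclasses import dataclass, asdict
from datetime import timedelta

@dataclass(frozen=True)
class PrivacyDefaults:
    behavioral_tracking: bool = False          # opt-out default: off until opted in
    profile_visibility: str = "private"        # most protective visibility setting
    location_granularity_km: float = 5.0       # coarse location unless relaxed
    retention: timedelta = timedelta(days=90)  # data scarcity: erase after 90 days
    data_export_enabled: bool = True           # data portability stays available

def effective_settings(user_choices: dict) -> dict:
    """Start from protective defaults; only explicit user choices override them."""
    settings = asdict(PrivacyDefaults())
    settings.update(user_choices)
    return settings

print(effective_settings({"behavioral_tracking": True}))  # user explicitly opted in
```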
A challenge, however, is that system development life cycles and organizational engineering processes do not consider such practices. So far, privacy is simply not a primary consideration for engineers when designing systems. This gap raises many questions: When should privacy requirements first enter the system development life cycle? Who should be responsible? Given that privacy controls impact business goals, who can actually decide on appropriate measures? Must there be ongoing privacy management and practices monitoring? If organizations purchase standard software solutions or outsource operations, pass data to third parties, or franchise their brands, who is responsible for customer privacy?
Conclusion
For privacy to be embedded in the system development life cycle and hence in organizational processes, companies must be ready to embrace the domain. Unfortunately, we still have too little knowledge about the real damage that is being done to brands and a company's reputation when privacy breaches occur. The stock market sees some negligible short-term dips, but people flock to data-intensive services (such as social networks); so far, they do not sanction companies for privacy breaches. So why invest in PbD measures? Will there be any tangible benefits from PbD that justify the investment? Would people perhaps be willing to pay for advertisement-free, privacy-friendly services? Will they incur switching costs and move to competitive services that are more privacy friendly? Would the 83% of U.S. consumers who claim they would stop doing business with a company that breaches their privacy really do so? We need to better understand these dynamics as well as the current changes in the social perception of what we regard as private.
But research on the behavioral economics of privacy has clearly demonstrated that regardless of what people say, they make irrational privacy decisions and systematically underestimate long-term privacy risks. And this is not only the case for privacy-seeking individuals, but also for managers who are making PbD decisions for their companies.
Therefore, I welcome that PIAs are proposed to become mandatory in the new European data protection legislation. However, they must be accompanied by a clear set of criteria for judging their quality as well as sanctions for noncompliance.
Most important, as this Viewpoint makes clear: PIAs need to be made mandatory for the designers of new technologies—the IBMs and SAPs of the world—and not just for data controllers or processors, who often get system designs off the shelf without a say. Making PIAs mandatory for system designers could be a great step toward PbD and support compliance with the policies defined in Europe, in U.S. sectoral privacy laws, as well as in the Safe Harbor Framework.
Only if we force those companies that design systems, their management and their engineers, to embrace such process-driven, bottom-up ways to embed laws and ethics into code can we really protect the core values of our Western liberal democracies and constitutions.
References
1. Cavoukian, A. Privacy by Design Curriculum 2.0, 2011; http://privacybydesign.ca/publications/.
2. Spiekermann, S. and Cranor, L.F. Engineering privacy. IEEE Transactions on Software Engineering 35, 1 (Jan./Feb. 2009), 67–82.
Sarah Spiekermann (sspieker@wu.ac.at) is the head of the Institute for Management Information Systems at the Vienna University of Economics and Business, Vienna, Austria.
Copyright held by author.