Chapter

GDPR and the Concept of Risk: The Role of Risk, the Scope of Risk and the Technology Involved


Abstract

The prominent position of risk in the GDPR has raised questions as to the meaning this concept should be given in the field of data protection. This article acknowledges the value of extracting information from the GDPR and using this information as a means of interpreting risk. The ‘role’ that risk holds in the GDPR, as well as the ‘scope’ given to the concept, are both examined and provide the reader with valuable insight into the legislature’s intentions with regard to the concept of risk. The article also underlines the importance of taking into account new technologies used in personal data processing operations. Technologies such as the IoT, AI and algorithms present characteristics (e.g. complexity, autonomy in behaviour, processing and generation of vast amounts of personal data) that influence our understanding of risk in data protection in various ways.

Article
Full-text available
To create fair and accountable AI and robotics, we need precise regulation and better methods to certify, explain, and audit inscrutable systems.
Conference Paper
Full-text available
The topic of the risk-based approach to data protection has stirred quite some controversy, with the main criticism arguing that it runs directly counter to the fundamental-rights nature of the right to personal data protection. Given the latter, and following the opinion of the Article 29 Working Party, the General Data Protection Regulation (GDPR) has adopted a risk-based approach that is limited to matters of compliance. This presentation explores what exactly is meant by such a compliance-oriented risk-based approach, and in particular how it can nonetheless take into account the whole spectrum of the data subjects' fundamental rights and freedoms affected by data processing operations.
Article
Full-text available
Mature information societies are characterised by mass production of data that provide insight into human behaviour. Analytics (as in big data analytics) has arisen as a practice to make sense of the data trails generated through interactions with networked devices, platforms and organisations. Persistent knowledge describing the behaviours and characteristics of people can be constructed over time, linking individuals into groups or classes of interest to the platform. Analytics allows for a new type of algorithmically assembled group to be formed that does not necessarily align with classes or attributes already protected by privacy and anti-discrimination law or addressed in fairness- and discrimination-aware analytics. Individuals are linked according to offline identifiers (e.g. age, ethnicity, geographical location) and shared behavioural identity tokens, allowing for predictions and decisions to be taken at a group rather than individual level. This article examines the ethical significance of such ad hoc groups in analytics and argues that the privacy interests of algorithmically assembled groups in inviolate personality must be recognised alongside individual privacy rights. Algorithmically grouped individuals have a collective interest in the creation of information about the group, and actions taken on its behalf. Group privacy is proposed as a third interest to balance alongside individual privacy and social, commercial and epistemic benefits when assessing the ethical acceptability of analytics platforms.
Chapter
Full-text available
In this chapter I identify three problems affecting the plausibility of group privacy and argue in favour of their resolution. The first problem concerns the nature of the groups in question. I shall argue that groups are neither discovered nor invented, but designed by the level of abstraction (LoA) at which a specific analysis of a social system is developed. Their design is therefore justified insofar as the purpose, guiding the choice of the LoA, is justified. This should remove the objection that groups cannot have a right to privacy because groups are mere artefacts (there are no groups, only individuals) or that, even if there are groups, it is too difficult to deal with them. The second problem concerns the possibility of attributing rights to groups. I shall argue that the same logic of attribution of a right to individuals may be used to attribute a right to a group, provided one modifies the LoA and now treats the whole group itself as an individual. This should remove the objection that, even if groups exist and are manageable, they cannot be treated as holders of rights. The third problem concerns the possibility of attributing a right to privacy to groups. I shall argue that sometimes it is the group and only the group, not its members, that is correctly identified as the correct holder of a right to privacy. This should remove the objection that privacy, as a group right, is a right held not by a group as a group but rather by the group’s members severally. The solutions of the three problems supports the thesis that an interpretation of privacy in terms of a protection of the information that constitutes an individual—both in terms of a single person and in terms of a group—is better suited than other interpretations to make sense of group privacy.
Conference Paper
Full-text available
Users share massive amounts of personal information and opinion with each other and different service providers every day. In such an interconnected setting, the privacy of individual users is bound to be affected by the decisions of others, giving rise to the phenomenon which we term as interdependent privacy. In this paper we define online privacy interdependence, show its existence through a study of Facebook application permissions, and model its impact through an Interdependent Privacy Game (IPG). We show that the arising negative externalities can steer the system into equilibria which are inefficient for both users and platform vendor. We also discuss how the underlying incentive misalignment, the absence of risk signals and low user awareness contribute to unfavorable outcomes.
Chapter
Full-text available
Algorithms (particularly those embedded in search engines, social media platforms, recommendation systems, and information databases) play an increasingly important role in selecting what information is considered most relevant to us, a crucial feature of our participation in public life. As we have embraced computational tools as our primary media of expression, we are subjecting human discourse and knowledge to the procedural logics that undergird computation. What we need is an interrogation of algorithms as a key feature of our information ecosystem, and of the cultural forms emerging in their shadows, with a close attention to where and in what ways the introduction of algorithms into human knowledge practices may have political ramifications. This essay is a conceptual map to do just that. It proposes a sociological analysis that does not conceive of algorithms as abstract, technical achievements, but suggests how to unpack the warm human and institutional choices that lie behind them, to see how algorithms are called into being by, enlisted as part of, and negotiated around collective efforts to know and be known.
Article
The Internet of Things (IoT) requires pervasive collection and linkage of user data to provide personalised experiences based on potentially invasive inferences. Consistent identification of users and devices is necessary for this functionality, which poses risks to user privacy. The General Data Protection Regulation (GDPR) contains numerous provisions relevant to these risks, which may nonetheless be insufficient to ensure a fair balance between users’ and developers’ interests. A three-step transparency model is described based on known privacy risks of the IoT, the GDPR’s governing principles, and weaknesses in its relevant provisions. Eleven ethical guidelines are proposed for IoT developers and data controllers on how information about the functionality of the IoT should be shared with users above the GDPR’s legally binding requirements. Two use cases demonstrate how the guidelines apply in practice: IoT in public spaces and connected cities, and connected cars.
Preprint
Many have called for algorithmic accountability: laws governing decision-making by complex algorithms, or AI. The EU’s General Data Protection Regulation (GDPR) now establishes exactly this. The recent debate over the right to explanation (a right to information about individual decisions made by algorithms) has obscured the significant algorithmic accountability regime established by the GDPR. The GDPR’s provisions on algorithmic accountability, which include a right to explanation, have the potential to be broader, stronger, and deeper than the preceding requirements of the Data Protection Directive. This Essay clarifies, largely for a U.S. audience, what the GDPR actually requires, incorporating recently released authoritative guidelines.
Article
Article 29 Working Party guidelines and the case law of the CJEU facilitate a plausible argument that in the near future everything will be or will contain personal data, leading to the application of data protection to everything: technology is rapidly moving towards perfect identifiability of information; datafication and advances in data analytics make everything (contain) information; and in increasingly ‘smart’ environments any information is likely to relate to a person in purpose or effect. At present, the broad notion of personal data is not problematic and even welcome. This will change in future. When the hyperconnected onlife world of data-driven agency arrives, the intensive compliance regime of the General Data Protection Regulation (GDPR) will become ‘the law of everything’, well-meant but impossible to maintain. By then we should abandon the distinction between personal and non-personal data, embrace the principle that all data processing should trigger protection, and understand how this protection can be scalable.
Article
The goal of this contribution is to understand the notion of risk as it is enshrined in the General Data Protection Regulation (GDPR), with a particular focus on Art. 35 providing for the obligation to carry out data protection impact assessments (DPIAs), the first risk management tool to be enshrined in EU data protection law, and which therefore contains a number of key elements for grasping the notion. The adoption of this risk-based approach has not come without a number of debates and controversies, notably on the scope and meaning of the risk-based approach. Yet what has remained, to date, outside the debate is the very notion of risk itself, which underpins the whole risk-based approach. The contribution uses the notions of risk and risk analysis as tools for describing and understanding risk in the GDPR. One of the main findings is that GDPR risk is about "compliance risk" (i.e., the lower the compliance, the higher the consequences for the data subjects' rights). This stance is in direct contradiction with a number of positions arguing for a strict separation between compliance and risk issues. This contribution instead sees issues of compliance and of risk to the data subjects' rights and freedoms as deeply interconnected. The conclusion will use these discussions as a basis to address the long-standing debate on the differences between privacy impact assessments (PIAs) and DPIAs. It will also warn that ultimately the way risk is defined in the GDPR is somewhat irrelevant: what matters most is the methodology used and the type of risk at work therein.
Article
The importance of the concept of risk and risk management in the data protection field has grown explosively with the adoption of the General Data Protection Regulation (2016/679). The article explores the concept and the role of risk, as well as associated risk regulation mechanisms in EU data protection law. It shows that with the adoption of the General Data Protection Regulation there is evidence of a two-fold shift: first on a practical level, a shift towards risk-based data protection enforcement and compliance, and second a shift towards risk regulation on the broader regulatory level. The article analyses these shifts to enhance the understanding of the changing relationship between risk and EU data protection law. The article also discusses associated potential challenges when trying to manage multiple and heterogeneous risks to the rights and freedoms of individuals resulting from the processing of personal data.
Article
A Regulatory Mariage de Figaro: Risk Regulation, Data Protection, and Data Ethics – Volume 8, Issue 1 – Alessandro Spina
Accountability is the ability to provide good reasons in order to explain and to justify actions, decisions and policies for a (hypothetical) forum of persons or organisations. Since decision-makers, both in the private and in the public sphere, increasingly rely on algorithms operating on Big Data for their decision-making, special mechanisms of accountability concerning the making and deployment of algorithms in that setting become gradually more urgent. In the upcoming General Data Protection Regulation, the importance of accountability and closely related concepts, such as transparency, as guiding protection principles, is emphasised. Yet, the accountability mechanisms inherent in the regulation cannot be appropriately applied to algorithms operating on Big Data and their societal impact. First, algorithms are complex. Second, algorithms often operate on a random group level, which may pose additional difficulties when interpreting and articulating the risks of algorithmic decision-making processes. In light of the possible significance of the impact on human beings, the complexities and the broader scope of algorithms in a big data setting call for accountability mechanisms that transcend the mechanisms that are now inherent in the regulation.
Chapter
This chapter focuses on big data analytics and, in this context, investigates the opportunity to consider informational privacy and data protection as collective rights. From this perspective, privacy and data protection are not interpreted as referring to a given individual, but as common to the individuals that are grouped into various categories by data gatherers. The peculiar nature of the groups generated by big data analytics requires an approach that cannot be exclusively based on individual rights. The new scale of data collection entails the recognition of a new layer, represented by groups’ need for the safeguard of their collective privacy and data protection rights. This dimension requires a specific regulatory framework, which should be mainly focused on the legal representation of these collective interests, on the provision of a mandatory multiple-impact assessment of the use of big data analytics and on the role played by data protection authorities.
Chapter
Accountability made its formal debut in the field of international data protection more than 30 years ago, when it was adopted as a data protection principle in the Organization for Economic Cooperation and Development (OECD) Guidelines.4 As of late, the policy discourse on the regulation of data protection has been rife with references to accountability. Most notably, in 2010, the Article 29 Data Protection Working Party issued an Opinion on the principle of accountability in which it elaborated upon the possibility of including a general provision on accountability in the revised Data Protection Directive.5 The European Commission has also made a reference to the possibility of introducing a principle of accountability in its subsequent Communication outlining a strategy for modernising the EU data protection framework.6 Within the context of these documents, the introduction of an accountability principle is seen mainly as a way to help ensure ‘that data controllers put in place effective policies and mechanisms to ensure compliance with data protection rules’.7 While this objective is in line with previous iterations of the principle of accountability, important nuances exist among the various documents which have promulgated accountability as a data protection principle.
Book
Nearly two decades after the EU first enacted data protection rules, key questions about the nature and scope of this EU policy, and the harms it seeks to prevent, remain unanswered. The inclusion of a Right to Data Protection in the EU Charter has increased the salience of these questions, which must be addressed in order to ensure the legitimacy, effectiveness and development of this Charter right and the EU data protection regime more generally. The Foundations of EU Data Protection Law is a timely and important work which sheds new light on this neglected area of law, challenging the widespread assumption that data protection is merely a subset of the right to privacy. By positioning EU data protection law within a comprehensive conceptual framework, it argues that data protection has evolved from a regulatory instrument into a fundamental right in the EU legal order and that this right grants individuals more control over more forms of data than the right to privacy. It suggests that this dimension of the right to data protection should be explicitly recognised, while identifying the practical and conceptual limits of individual control over personal data. At a time when EU data protection law is sitting firmly in the international spotlight, this book offers academics, policy-makers, and practitioners a coherent vision for the future of this key policy and fundamental right in the EU legal order, and how best to realise it.
Article
We live in an age of “big data.” Data have become the raw material of production, a new source of immense economic and social value. Advances in data mining and analytics and the massive increase in computing power and data storage capacity have expanded by orders of magnitude the scope of information available to businesses and government. Data are now available for analysis in raw form, escaping the confines of structured databases and enhancing researchers’ abilities to identify correlations and conceive of new, unanticipated uses for existing information. In addition, the increasing number of people, devices, and sensors that are now connected by digital networks has revolutionized the ability to generate, communicate, share, and access data. Data create enormous value for the world economy, driving innovation, productivity, efficiency and growth. At the same time, the “data deluge” presents privacy concerns which could stir a regulatory backlash, dampening the data economy and stifling innovation. In order to craft a balance between beneficial uses of data and individual privacy, policymakers must address some of the most fundamental concepts of privacy law, including the definition of “personally identifiable information”, the role of individual control, and the principles of data minimization and purpose limitation. This article emphasizes the importance of providing individuals with access to their data in usable format. This will let individuals share the wealth created by their information and incentivize developers to offer user-side features and applications harnessing the value of big data. Where individual access to data is impracticable, data are likely to be de-identified to an extent sufficient to diminish privacy concerns. In addition, organizations should be required to disclose their decisional criteria, since in a big data world it is often not the data but rather the inferences drawn from them that give cause for concern.
Chapter
We live at a time when the issues related to the protection of personal data feature a markedly contradictory approach – indeed, a veritable social, political and institutional schizophrenia. There is increased awareness of the importance of data protection as regards not only the protection of the private lives of individuals but their very freedom. This approach is reflected by many national and international documents, lastly the Charter of Fundamental Rights of the European Union, where data protection is recognised as a fundamental, autonomous right. Still, it is increasingly difficult to respect this general assumption, because internal and international security requirements, market interests and the re-organisation of the public administration are heading towards the diminution of relevant safeguards, or pushing essential guarantees towards disappearance.
Kuner, C., et al.: 'The Challenge of "Big Data" for Data Protection', 2 International Data Privacy Law 47 (2012)
Lenaerts, K., Gutiérrez-Fons, J.A.: 'To Say What the Law of the EU Is: Methods of Interpretation and the European Court of Justice', EUI Working Papers AEL 2013/9 (2013)
Cadwalladr, C., Graham-Harrison, E.: Revealed: 50 million Facebook profiles harvested for Cambridge Analytica in major data breach. The Guardian, 17 March 2018
Quelle, C.: 'The "risk revolution" in EU data protection law: we can't have our cake and eat it, too'. In: Leenes, R., van Brakel, R., Gutwirth, S., De Hert, P. (eds.) Data Protection and Privacy: The Age of Intelligent Machines, 1st edn, vol. 10. Hart Publishing (2017)
Case Google Spain SL, Opinion of AG Jääskinen [4], para 10: "the present preliminary reference is affected by the fact that when the Commission proposal for the Directive was made in 1990 the internet in the present sense of the www did not exist and nor were there any search engines. […] nobody could foresee how it would revolutionise the world".
Commission, Communication from the Commission to the European Parliament, the European Council, the Council, the European Economic and Social Committee and the Committee of the Regions, Artificial Intelligence for Europe, COM(2018) 237 final. https://ec.europa.eu/digital-single-market/en/news/communication-artificial-intelligence-europe
Commission, Communication from the Commission to the European Parliament and the Council, Stronger protection, new opportunities – Commission guidance on the direct application of the General Data Protection Regulation as of 25 May 2018, COM(2018) 43 final. https://ec.europa.eu/commission/sites/beta-political/files/data-protection-communication-com.2018.43.3_en.pdf
Commission, Commission Staff Working Document: Liability for Emerging Digital Technologies, accompanying the Communication from the Commission to the European Parliament, the European Council, the Council, the European Economic and Social Committee and the Committee of the Regions, Artificial Intelligence for Europe, SWD(2018) 137 final (2018). https://ec.europa.eu/digital-single-market/en/news/european-commission-staff-working-document-liability-emerging-digital-technologies
Council of Europe, Internet and Electoral Campaigns – Study on the use of Internet in electoral campaigns, DGI(2017)11. https://rm.coe.int/use-of-internet-in-electoral-campaigns-/16807c0e24
Council of Europe, Algorithms and Human Rights: Study on the Human Rights dimensions of automated data processing techniques (in particular algorithms) and possible regulatory implications, DGI(2017)12. https://edoc.coe.int/en/internet/7589-algorithms-and-human-rights-study-on-the-human-rights-dimensions-of-automated-data-processing-techniques-and-possible-regulatory-implications.html
EDPS (European Data Protection Supervisor), Opinion 5/2018, 'Preliminary Opinion on Privacy by Design', 31 May 2018
EDPS (European Data Protection Supervisor), Opinion of the EDPS on the Communication from the Commission to the European Parliament, the Council, the Economic and Social Committee and the Committee of Regions – "A comprehensive approach on personal data protection in the European Union", Brussels, 14 January 2011. https://edps.europa.eu/sites/edp/files/publication/11-01-14_personal_data_protection_en.pdf
Joined Cases C-293/12 and C-594/12, Digital Rights Ireland and Seitlinger and Others, ECLI:EU:C:2014:238 (2014)