GDPR: The End of Google and Facebook or a New Paradigm in Data Privacy?

Houser, K. A. and Voss, W. G. (forthcoming 2018). GDPR: The End of Google and Facebook or a New Paradigm in Data Privacy? Richmond Journal of Law & Technology 25-1.

Kimberly A. Houser, Assistant Professor of Legal Studies, Oklahoma State University, 335 Spears College of Business, Stillwater, OK 74074, 405-744-9430, k.houser@okstate.edu

W. Gregory Voss, Associate Professor of Business Law, Toulouse Business School, 1 place Alphonse Jourdain – CS 66810, 31068 Toulouse Cedex 7, France, +33-561-294-746, g.voss@tbs-education.fr
Abstract
EU Data Protection Authorities have been vigorously enforcing regional and national data protection law against U.S. tech companies in recent years, but few changes have been made to those companies’ business model of exchanging free services for personal data. With the Cambridge Analytica debacle revealing how insufficient American privacy law is, we now find ourselves questioning whether the General Data Protection Regulation (GDPR) is not the onerous 99-article regulation to be feared, but rather a creation years ahead of its time. This paper will explain how the differences in U.S. and EU privacy and data protection law and ideology have led to a wide divergence in enforcement actions, and what U.S. companies will need to do in order to legally process the data of their users in the EU. The failure of U.S. tech companies to fulfill the requirements of the GDPR, which has extraterritorial application and becomes applicable on May 25, 2018, could result in massive fines (up to $4 billion using the example of Google). The GDPR will mandate a completely new business model for these U.S. tech companies, which have been operating for well over a decade under very loose restrictions under U.S. law. Will the GDPR be the end of Google and Facebook, or will it be embraced as the gold standard of how companies ought to operate?
Table of Contents

GDPR: The End of Google and Facebook?
Introduction
I. Current EU and U.S. privacy and data security law
   A. EU privacy and data security law
      1. 95 Directive
      2. Safe Harbor
   B. U.S. privacy and data security law
      1. Federal privacy law
      2. State data protection law
   C. Difference between EU and U.S. laws
II. Enforcement actions against U.S. tech companies
   A. EU enforcement actions
      1. Google
      2. Facebook
         a) Schrems (Safe Harbor case)
         b) Facebook cookies cases - CNIL
   B. U.S. data privacy law enforcement actions
      1. Google
      2. Facebook
   C. Difference between EU and U.S. enforcement actions
III. General Data Protection Regulation
   A. New/expanded concepts
      1. Regulation v. directive
      2. Increased penalties
      3. Expanded definition of personal data
      4. Extraterritoriality
      5. Ongoing requirements/culture change
   B. Important provisions of the GDPR
      1. Applicability to controllers and processors
      2. Right to be forgotten
      3. Right to data portability
      4. Lawful basis for processing
      5. Data protection officer
      6. Affirmative consent
      7. Data protection by design
      8. Impact assessments
      9. Profiling
      10. Security requirements
      11. Data breach notification requirements
   C. Steps for compliance with the GDPR
   D. Cross-border data transfers
      1. Privacy Shield
      2. Model contract clauses
      3. Binding corporate rules (BCRs)
IV. Conclusion
Introduction

Recently, the world watched in both shock and amusement as Mark Zuckerberg tried to explain the Facebook business model to hopelessly out-of-touch U.S. Senators. Simply stated, Facebook and Google provide a free service to users in exchange for the use of their data.[1] The information is collected, categorized, and analyzed in order to provide extremely targeted advertisements, the bread and butter of these giant tech companies’ business model. The advertisers then gain access to this data.[2] Neither Google nor Facebook charges users for access to its platform,[3] but both charge for access to the user profiles created.[4] Although your name is not provided to these advertisers, a unique identifier is.[5] Although it is in Facebook and Google’s financial interest to keep the contents of your unique identifier proprietary, data mining of their sites does occur, and your data and unique identifier (if not your name, address, and most recent purchase) are collected and shared.[6]

While this business model is legal in the U.S., the way these tech companies operate has long been a point of contention for European regulators. The FTC has brought only a handful of actions against companies such as Facebook and Google, but it is limited in what it can do because of the lack of omnibus privacy and data security legislation.[7] The FTC is the administrative agency charged with protecting consumers against deceptive and unfair trade practices. Most of the actions center on how these companies engaged in deceptive and unfair practices by misrepresenting their use and sharing of their users’ data.
[1] See Chris Jay Hoofnagle & Jan Whittington, Free: Accounting for the Costs of the Internet’s Most Popular Price, 61 UCLA L. REV. 606, 628 (2014).
[2] Mark Zuckerberg indicates that the data is not sold; however, apps apparently do gain access to the data, which can be used to target individuals on other platforms. Thus, although the data is not sold, access to the data is. Other tech companies, such as Twitter, do in fact sell the data. In addition, the information online can also be mined by third parties. Selina Wang, Twitter Sold Data Access to Cambridge Analytica-Linked Researcher, BLOOMBERG (Apr. 30, 2018), https://www.bloomberg.com/news/articles/2018-04-29/twitter-sold-cambridge-analytica-researcher-public-data-access.
[3] Cf. Scott Cleland, Why Google Is Not a Platform, FORBES (Oct. 19, 2011), https://www.forbes.com/sites/scottcleland/2011/10/19/why-googles-not-a-platform/#554e45ef6bbe.
[4] G. S. Hans, Privacy Policies, Terms of Service, and FTC Enforcement: Broadening Unfairness Regulation for a New Era, 19 MICH. TELECOMM. & TECH. L. REV. 163, 164 (2012), http://repository.law.umich.edu/mttlr/vol19/iss1/5.
[5] Each smartphone, for example, is uniquely differentiated from the other millions by its Google Advertising ID (GAID), so that you may receive targeted ads on your phone while Google collects the information given by your phone, such as location. Mitchell Reichgut, Advertiser ID Tracking and What It Means for You, FORBES (May 16, 2016), https://www.forbes.com/sites/onmarketing/2016/05/16/advertiser-id-tracking-and-what-it-means-for-you/#5500800e18bf.
[6] It is also clear that this data can be vulnerable to outside parties, as the Cambridge Analytica situation indicates.
[7] See, e.g., the consent decrees between the Federal Trade Commission (FTC) and, respectively, Facebook and Google, which both first came out in 2011: Press Release, Fed. Trade Comm’n, Facebook Settles FTC Charges (Nov. 29, 2011), https://www.ftc.gov/news-events/press-releases/2011/11/facebook-settles-ftc-charges-it-deceived-consumers-failing-keep; Press Release, Fed. Trade Comm’n, FTC Approves Final Settlement with Facebook (Aug. 10, 2012), https://www.ftc.gov/news-events/press-releases/2012/08/ftc-approves-final-settlement-facebook; Facebook, Inc., Docket No. C-4365, File No. 092-3184 (Fed. Trade Comm’n July 27, 2012) (complaint), https://www.ftc.gov/sites/default/files/documents/cases/2012/08/120810facebookcmpt.pdf; Google, Inc., Docket No. C-4336, File No. 102-3136 (Fed. Trade Comm’n Oct. 13, 2011) (agreement containing consent order), https://www.ftc.gov/sites/default/files/documents/cases/2011/10/111024googlebuzzdo.pdf. See also the 2011 consent decree between the FTC and Twitter: Twitter, Inc., Docket No. C-4316, File No. 092-3093 (Fed. Trade Comm’n Mar. 2, 2011) (decision and order), https://www.ftc.gov/sites/default/files/documents/cases/2011/03/110311twittercmpt.pdf.
On the other side of the Atlantic, EU Data Protection Authorities (DPAs) have been actively and consistently enforcing regional and national privacy laws against these companies. Hundreds of cases have been brought against U.S. tech companies by these local DPAs,[8] but while most of these actions have resulted in findings that the companies violated privacy and data security law, the consequence has predominantly been small fines, due to the limits in the local regulations adopted pursuant to the European Data Protection Directive 95/46/EC (95 Directive).[9] This may all change on May 25, 2018, when the EU’s General Data Protection Regulation (GDPR)[10] becomes applicable.[11] The GDPR, which replaces the 95 Directive, will allow European DPAs to fine companies up to the higher of €20,000,000 or 4 percent of their global turnover for the most serious category of data protection violations,[12] potentially increasing maximum fines to over $1 billion for a company such as Facebook and over $3 billion for one such as Google.
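To put rough numbers on these maximums, the following is a back-of-the-envelope sketch. The revenue figures are outside assumptions, broadly in line with the two companies’ 2017 annual reports rather than figures taken from this paper:

\[
\begin{aligned}
\text{maximum fine} &= \max\left(\text{€}20{,}000{,}000,\ 0.04 \times \text{worldwide annual turnover}\right)\\
\text{Facebook:}\quad 0.04 \times \$40.7\text{ billion} &\approx \$1.6\text{ billion}\\
\text{Google (Alphabet):}\quad 0.04 \times \$110.9\text{ billion} &\approx \$4.4\text{ billion}
\end{aligned}
\]

In both cases the 4 percent prong far exceeds the €20,000,000 floor, which is consistent with the “over $1 billion” and “over $3 billion” estimates above.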
[8] See Baker McKenzie, Data Protection Enforcement, http://enforcement.bakermckenzie.com/.
[9] In 2016, Google was fined €100,000 by the French data protection agency, the Commission Nationale de l’Informatique et des Libertés (CNIL), for failing to apply Europe’s “right to be forgotten” law stemming from a 2014 ruling by the European Court of Justice (ECJ), which gave citizens the right to have internet search engines remove inaccurate or insufficient information about them from search results. In 2017, Facebook was fined €150,000 by the CNIL for violating the French Data Protection Act by collecting users’ “personal data” (the European term that is similar to, but more expansive than, the U.S. term “personally identifiable information” (PII)) and using a cookie to obtain behavioral information, without adequately informing users. CNIL, FACEBOOK Sanctioned for Several Breaches of the French Data Protection Act (May 16, 2017), https://www.cnil.fr/en/facebook-sanctioned-several-breaches-french-data-protection-act.
[10] Regulation 2016/679 of 27 April 2016 on the Protection of Natural Persons with Regard to the Processing of Personal Data and on the Free Movement of Such Data, and Repealing Directive 95/46/EC (General Data Protection Regulation), 2016 O.J. (L 119) 1 [hereinafter GDPR].
[11] Id. arts. 99(2) & 94(1).
[12] Id. arts. 83(5) & 83(6).
Although the stated reason for the passage of the GDPR is to harmonize laws across member states and to give users more control over their data, it seems likely that this regulation is also intended to hold all companies in the tech field to the same standards. Because of the lax privacy and data security laws in the U.S., tech companies like Google and Facebook have become behemoths: Facebook with 80% of the market share of social media platforms[13] and Google with 90% of the market share of search engines.[14] There is a perception that these American companies have an unfair advantage because of the lax privacy laws in the U.S. as compared to the EU. Members of the European Commission have indicated that the extraterritoriality of the GDPR will eliminate the unfair advantage that these U.S. tech companies have enjoyed with respect to their privacy practices and virtually unregulated use of data, at least with respect to users in Europe, opening the way for European tech companies to compete on a level playing field.

The reason for the differences in these laws[15] and enforcement actions stems from the vastly different ideologies behind American and European data protection laws, which need to be understood in order to fully interpret European privacy and data protection laws. Because the EU is setting a much higher standard in privacy and data protection law,[16] it is essential that U.S. companies take action now to comply with these requirements or potentially lose the ability to operate in the EU based on their current business model.[17] It is also likely that the Snowden revelations concerning the U.S. government’s massive surveillance program, which led to the invalidation of the Safe Harbor Framework that companies had been relying on to allow cross-border transfers of personal data from the EU to the U.S., contributed to the shoring up of EU privacy and data security law and its application to players outside of the EU.
[13] Social Media Stats Worldwide, Apr. 2017 - Apr. 2018, STATCOUNTER, http://gs.statcounter.com/social-media-stats. This is down from 88% one year earlier, most likely due in part to the exposure of the connection between Facebook and Cambridge Analytica.
[14] Search Engine Market Share Worldwide, Apr. 2017 - Apr. 2018, STATCOUNTER, http://gs.statcounter.com/search-engine-market-share. Unlike Facebook, Google has seen no meaningful decline in its market share over the past year.
[15] For a discussion of this in the context of sensitive consumer data and cloud computing, see Nancy J. King & V.T. Raja, What Do They Really Know About Me in the Cloud? A Comparative Law Perspective on Protecting Privacy and Security of Sensitive Consumer Data, 50 AM. BUS. L.J. 413 (2013); with respect to the definition of personal data (and personally identifiable information), see Paul M. Schwartz & Daniel J. Solove, Reconciling Personal Information in the United States and European Union, 102 CALIF. L. REV. 877 (2014). For a more general comparative view, see James Q. Whitman, The Two Western Cultures of Privacy: Dignity Versus Liberty, 113 YALE L.J. 1151 (2004); see also Paul M. Schwartz, The EU-U.S. Privacy Collision: A Turn to Institutions and Procedures, 126 HARV. L. REV. 1966 (2013).
[16] See Schwartz, supra note 15, at 1966-67 (noting the considerable impact of European law and the “relative lack of American influence”); see also DANIEL J. SOLOVE & PAUL M. SCHWARTZ, INFORMATION PRIVACY LAW 1098 (5th ed. 2015) (“Outside of Europe, other countries from around the world are moving toward adopting comprehensive privacy legislation on the European model”); GRAHAM GREENLEAF, ASIAN DATA PRIVACY LAWS: TRADE AND HUMAN RIGHTS PERSPECTIVES 7 (pbk. ed. 2017) (2014) (pointing out the “major influence” of European data privacy standards worldwide, including Asia, and the “increasingly isolated position” of the United States); and Griffin Drake, Navigating the Atlantic: Understanding EU Data Privacy Compliance Amidst a Sea of Uncertainty, 91 S. CAL. L. REV. 163, 175 (2017) (noting the trend toward following EU-style legislation but highlighting the examples of the United States and China as bucking this trend).
Data privacy is an important global social and economic issue. According to a PwC survey, 92% of American companies consider compliance with the GDPR a top priority as it becomes applicable on May 25, 2018;[18] however, compliance will require a massive paradigm shift for American companies trading in data. This paper will help enable a greater understanding of the differences between American and European privacy standards and what this will mean for U.S. companies, using Google and Facebook cases as examples.

This paper will discuss (I) current privacy and data security law in the EU and U.S., (II) prototypical enforcement actions against Google and Facebook for violating privacy and data security laws, demonstrating the different handling in each of the two blocs, (III) a summary of the provisions of the GDPR, and (IV) guidance for compliance with the GDPR and what it means for U.S. tech companies.
I. Current EU and U.S. privacy and data security law
The right to data protection is one of the fundamental rights in the EU, and such rights are considered inalienable.[19] This concept “appears to be grounded in the concept of human dignity,”[20] which highlights one of the differences between U.S. and EU law.[21] Although some EU member states began enacting privacy laws in the 1970s, the first attempt to harmonize laws throughout the EU was the 95 Directive, which will be replaced by the GDPR. The U.S., on the other hand, does not have any overarching federal privacy statute and handles privacy and data security on a sectoral basis.[22]
[17] See Paul M. Schwartz, The EU-U.S. Privacy Collision: A Turn to Institutions and Procedures, 126 HARV. L. REV. 1966 (2013).
[18] GDPR Compliance Top Data Protection Priority for 92% of US Organizations in 2017, According to PwC Survey, PWC (Jan. 23, 2017), https://www.pwc.com/us/en/press-releases/2017/pwc-gdpr-compliance-press-release.html.
[19] See ORLA LYNSKEY, THE FOUNDATIONS OF EU DATA PROTECTION LAW 240-244 (2015).
[20] Id. at 242.
[21] See generally Whitman, supra note 15.
A. EU privacy and data security law
1. 95 Directive
The European Data Protection Directive 95/46/EC was adopted by the EU to protect the privacy of personal data collected for or about citizens of the EU, especially as it relates to the processing, use, or exchange of such data.[23] It was based on recommendations proposed by the Organisation for Economic Co-operation and Development (OECD) and was designed to harmonize data protection laws and establish rules for the transfer of personal data to “third countries” outside of the Union. It resulted in the creation of data protection authorities (DPAs) in each of the EU member states and charged them with creating and enforcing regulations to meet the privacy and data security requirements of the 95 Directive. Overall, the directive closely matches the recommendations of the OECD and the core concepts of privacy as a fundamental human right.[24]

The OECD recommendations are founded on eight principles, which are numbered starting with seven in the original text:

7. There should be limits to the collection of personal data and any such data should be obtained by lawful and fair means and, where appropriate, with the knowledge or consent of the data subject.
8. Personal data should be relevant to the purposes for which they are to be used, and, to the extent necessary for those purposes, should be accurate, complete and kept up-to-date.
9. The purposes for which personal data are collected should be specified not later than at the time of data collection and the subsequent use limited to the fulfilment of those purposes or such others as are not incompatible with those purposes and as are specified on each occasion of change of purpose.
10. Personal data should not be disclosed, made available or otherwise used for purposes other than those specified in accordance with [the previous principle] except:
   a) with the consent of the data subject; or
   b) by the authority of law.
11. Personal data should be protected by reasonable security safeguards against such risks as loss or unauthorised access, destruction, use, modification or disclosure of data.
12. There should be a general policy of openness about developments, practices and policies with respect to personal data. Means should be readily available of establishing the existence and nature of personal data, and the main purposes of their use, as well as the identity and usual residence of the data controller.
13. An individual should have the right:
   a) to obtain from a data controller, or otherwise, confirmation of whether or not the data controller has data relating to him;
   b) to have communicated to him, data relating to him within a reasonable time; at a charge, if any, that is not excessive; in a reasonable manner; and in a form that is readily intelligible to him;
   c) to be given reasons if a request made under subparagraphs (a) and (b) is denied, and to be able to challenge such denial; and
   d) to challenge data relating to him and, if the challenge is successful, to have the data erased, rectified, completed or amended.
14. A data controller should be accountable for complying with measures which give effect to the principles stated above.[25]
[22] Paul M. Schwartz, The Value of Privacy Federalism, in SOCIAL DIMENSIONS OF PRIVACY 324, 324-27 (Beate Roessler & Dorota Mokrosinska eds., 2015), as cited in Margot E. Kaminski, When the Default Is No Penalty: Negotiating Privacy at the NTIA, 94 DENVER UNIV. L. REV. 923, 924 n.1 (2016), http://static1.1.sqspcdn.com/static/f/276323/27253932/1474324079427/93.4_925+Kaminski.pdf?token=AaHDdG6%2FWtSPEnn7g4p1FFmVpoo%3D.
[23] Protection of Personal Data, EUR-LEX (Mar. 8, 2014), https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=LEGISSUM%3Al14012.
[24] Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the Protection of Individuals with Regard to the Processing of Personal Data and on the Free Movement of Such Data, 1995 O.J. (L 281) 31 [hereinafter 95 Directive].
The enforcement actions discussed in this paper were all brought under regulations adopted by member states to comply with the 95 Directive. The 95 Directive also limited the transfer of data outside of the EU to entities located in countries whose laws provided levels of protection for the data adequate in comparison to the protection afforded in the EU.[26]
2. Safe Harbor
The United States and the European Union are each other’s largest trade and investment partners, with trade in goods and services amounting to over $1 trillion per year.[27] One of the provisions of the 95 Directive limited the transfer of personal data outside of the EU to countries with an adequate level of protection of personal data,[28] unless a derogation applied.[29] Because information regarding European citizens could be transferred to, stored in, or processed in the United States, there was a concern that the lax privacy laws in the United States were insufficient to protect European citizens from harm. Because the U.S. did not meet the required standard of protection, the Safe Harbor agreement was negotiated between the two blocs to allow U.S. companies to transfer personal data to the U.S.[30] The Safe Harbor agreement provided that if U.S. companies receiving data transfers agreed to comply with the standards contained therein and self-certified as compliant, they were safe from data protection law enforcement action by European DPAs; enforcement actions for noncompliance would instead be taken by the FTC.[31]

[25] OECD Guidelines on the Protection of Privacy and Transborder Flows of Personal Data, ORG. FOR ECON. COOPERATION & DEV., http://www.oecd.org/sti/ieconomy/oecdguidelinesontheprotectionofprivacyandtransborderflowsofpersonaldata.htm.
[26] Because U.S. laws were considered inadequate, a cross-border transfer document had to be negotiated between the EU and the U.S. It was known as the Safe Harbor and is discussed in the next section.
[27] In 2016, the figure that results from adding exports and imports between the United States and the European Union in both merchandise and services is €1.04 trillion. See European Commission, Directorate-General for Trade, USA – Trade Statistics – Overview, Nov. 22, 2017, http://trade.ec.europa.eu/doclib/docs/2006/september/tradoc_111704.pdf. Converted into dollars at the Treasury reporting rate of exchange as of year-end 2016 (https://www.fiscal.treasury.gov/fsreports/rpt/treasRptRateExch/itin-12-2016.pdf), the figure is roughly $1.096 trillion.
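As a quick arithmetic check of the conversion in footnote [27], the implied year-end 2016 exchange rate, inferred from the footnote’s own figures rather than stated in it, is roughly €0.949 per U.S. dollar:

\[
\text{€}1.04\text{ trillion} \div 0.949\ \text{€}/\$ \approx \$1.096\text{ trillion}
\]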
In recent years, cross-border data flows have expanded exponentially due to the widespread use of the internet. The 95 Directive was intended to address concerns with the flow of information to companies outside of the European Union.[32] It defines the “data controller” as the entity that determines the purposes and means of the collection and processing of the information[33] and the “processor” as the party that processes the information on behalf of the controller.[34] As a practical matter, when speaking of cross-border transfers under the 95 Directive, the controller is usually located within the European Union and is transferring personal information outside of the European Union for processing. The controller is clearly bound to respect the laws of the European Union, but the receiving American companies have argued that the laws do not apply to the processor located in the United States.[35] The Safe Harbor remained in place until the European Court of Justice invalidated the agreement, in large part due to the Snowden revelations regarding the U.S. government’s monitoring and secret collection of information.[36] Its replacement, the Privacy Shield, is discussed in Section III.D.1.
[28] 95 Directive art. 25(1).
[29] Id. art. 26.
[30] See Klint Finley, Thank (Or Blame) Snowden for Europe’s Big Privacy Ruling, WIRED (Oct. 6, 2015, 9:06 PM), https://www.wired.com/2015/10/tech-companies-can-blame-snowden-data-privacy-decision/.
[31] Nonetheless, the EU data controller was still subject to the provisions of the 95 Directive and as such subject to the jurisdiction of one or more DPAs in the European Union (which could include the EU subsidiary of a U.S. firm, or could even extend to a U.S. firm found to have an establishment in the European Union, in certain circumstances).
[32] Indeed, within the EU, the 95 Directive and the GDPR are meant to ensure the free flow of data given the harmonized level of protection throughout the EU. GDPR art. 1(3).
[33] The 95 Directive defines “controller” as the “natural or legal person, public authority, agency or any other body which alone or jointly with others determines the purposes and means of the processing of personal data; …” 95 Directive art. 2(d). The corresponding definition in the GDPR remains largely the same. GDPR art. 4(7).
[34] The 95 Directive defines the processor as a “natural or legal person, public authority, agency or any other body which processes personal data on behalf of the controller.” 95 Directive art. 2(e). The corresponding definition in the GDPR is virtually identical. See GDPR art. 4(8).
B. U.S. privacy and data security law
1. Federal privacy law
Unlike EU data protection law, U.S. privacy law is handled on a sectoral basis. The handling and processing of personal data is regulated by both the states and the federal government but, for the most part, the regulation relates to the specific category of information at issue. The categories covered under federal law are healthcare data (under the Health Insurance Portability and Accountability Act, HIPAA),[37] financial data (under the Gramm-Leach-Bliley Act, GLBA),[38] children’s information (under the Children’s Online Privacy Protection Act, COPPA),[39] students’ personal information (under the Family Educational Rights and Privacy Act, FERPA),[40] and consumer information (under the Fair Credit Reporting Act, FCRA),[41] but, significantly, these statutes were enacted prior to significant personal use of the internet.[42] The main regulatory body addressing privacy breaches is the FTC.[43]
[35] Protection of Personal Data, supra note 23.
[36] Europeans are much more aware of the dangers of government collection of information due to World War II being fought on European ground. According to a Congressional Research Service report, “in the United States, what the European Commission (the EU’s executive) refers to as the ‘collecting and processing of personal data’ is allowed unless it causes harm or is expressly limited by U.S. law. In Europe, by contrast, processing of personal data is prohibited unless there is an explicit legal basis that allows it (citations omitted).” MARTIN A. WEISS & KRISTIN ARCHICK, CONG. RESEARCH SERV., R44257, U.S.-EU DATA PRIVACY: FROM SAFE HARBOR TO PRIVACY SHIELD 1 (2016), https://fas.org/sgp/crs/misc/R44257.pdf.
[37] 29 U.S.C. § 1001.
[38] 15 U.S.C. § 6801.
[39] 15 U.S.C. § 6501.
[40] 20 U.S.C. § 1232g.
[41] 15 U.S.C. § 1681.
[42] HIPAA was enacted in 1996, GLBA in 1999, COPPA in 1998, FERPA in 1974, and FCRA in 1970. By way of comparison, Facebook was not launched until 2004.
[43] See Enforcing Privacy Promises, FED. TRADE COMM’N, https://www.ftc.gov/news-events/media-resources/protecting-consumer-privacy/enforcing-privacy-promises (last visited May 21, 2018). The Division of Privacy and Identity Protection is a division of the Bureau of Consumer Protection of the FTC responsible for privacy and data security issues. Division of Privacy and Identity Protection, FED. TRADE COMM’N, http://www.ftc.gov/bcp/bcppip.shtm (last visited May 21, 2018). Actions against tech companies for violating users’ privacy and data security rights are brought under the FTC’s power to investigate both deception and unfair practices. For a thorough explanation of the FTC’s enforcement standards, see G. S. Hans, supra note 4.
Although it was not specifically charged to enforce privacy policies, the FTC is responsible for taking action against companies engaged in unfair and deceptive trade practices.[44] Although there is no federal legal requirement in the U.S. for internet service providers to maintain privacy policies informing users of how their information will be used, nor are companies required to obtain permission to use the data, companies that do supply privacy policies can be subject to action for failing to comply with them or for otherwise misleading the public.[45]

One of the reasons why the FTC got involved in regulating privacy issues at the time it did was that the 95 Directive was coming into effect, and it would require the U.S. to have “adequate” privacy protections before Europeans’ data could be transferred to America. Because there was no enforcement body in the U.S. to comply with the enforcement requirements of the Safe Harbor agreement, Congress convinced the FTC to take on this role.[46] It was not until March 2012 that the FTC came out with its Recommendations for Businesses and Policymakers to address the issue of consumer privacy.[47] While this report provides guidance for businesses, it does not mandate any particular action.[48] As privacy scholars Solove and Hartzog point out, there is virtually no case law on these privacy enforcement actions because nearly all FTC enforcement actions have resulted in settlements between the FTC and the companies investigated.[49] This also means that the companies seldom had to admit to any wrongdoing. These privacy enforcement actions are discussed in Section II.B herein.
[44] Enforcing Privacy Promises, FED. TRADE COMM’N, https://www.ftc.gov/news-events/media-resources/protecting-consumer-privacy/enforcing-privacy-promises (last visited May 21, 2018). The FTC is also charged with enforcing other federal laws relating to consumer privacy and security.
[45] Daniel J. Solove & Woodrow Hartzog, The FTC and the New Common Law of Privacy, 114 COLUM. L. REV. 583 (2014).
[46] CHRIS JAY HOOFNAGLE, FEDERAL TRADE COMMISSION PRIVACY LAW AND POLICY 145-192 (2016).
[47] FED. TRADE COMM’N, PROTECTING CONSUMER PRIVACY IN AN ERA OF RAPID CHANGE (Mar. 2012), https://www.ftc.gov/sites/default/files/documents/reports/federal-trade-commission-report-protecting-consumer-privacy-era-rapid-change-recommendations/120326privacyreport.pdf.
[48] The report does, however, acknowledge the need for Congress to set baseline privacy protection laws. See id.
[49] Daniel J. Solove & Woodrow Hartzog, supra note 45.
2. State data protection law
Although the FTC has brought a number of actions involving data breaches by companies, most of these fall under one of the above-mentioned sector-specific statutes or under Section 5 of the FTC Act regarding unfair and deceptive trade practices. Very few class-action cases have made it to court, due to the difficulty of showing harm from a data breach. One statute that has been used successfully to certify a class action is the Stored Communications Act.[50] The FTC Act does not permit private causes of action.[51] Facebook settled a class action suit for $9.5 million in Lane v. Facebook, Inc., 696 F.3d 811 (9th Cir. 2012), with respect to its ill-fated Beacon program, which automatically posted users’ offsite platform activities (such as the purchase of theater tickets) to their Facebook news feeds.[52] In Lane, the plaintiffs alleged that Facebook had violated both California and federal law, including the Electronic Communications Privacy Act, the Computer Fraud and Abuse Act, the Video Privacy Protection Act, California’s Computer Crime Law, and the California Consumer Legal Remedies Act. However, it should be noted that most cases for data breaches and tort actions have not been as successful because many courts require a showing of harm.[53] In Spokeo, Inc. v. Robins, the Supreme Court indicated that the plaintiff lacked standing because of the absence of actual injury due to the data breach.[54] The majority of cases regarding data breaches are pursued under state law.[55]
[50] Google settled Gaos v. Google Inc., No. 5:10-CV-4809 EJD (N.D. Cal. 2013), for $8.5 million and In re Google Buzz Privacy Litig., No. C 10-00672 JW, 2011 WL 7460099 (N.D. Cal. June 10, 2011), for approximately $8.5 million.
[51] See, e.g., Days Inn of Am. Franchising, Inc. v. Windham, 699 F. Supp. 1581 (N.D. Ga. 1988).
[52] Beacon was shut down after two years, in 2009.
[53] Fourteen states permit a private cause of action, but most limit recovery to actual damages. See Baker Hostetler, Data Breach Charts, Nov. 2017, at 22-23, https://www.bakerlaw.com/files/Uploads/Documents/Data%20Breach%20documents/Data_Breach_Charts.pdf. See also Matthew Renda, Claims That Google Shares Data Look Shaky, COURTHOUSE NEWS SERV. (Aug. 24, 2016), https://www.courthousenews.com/claims-that-google-shares-data-look-shaky/.
All 50 states have enacted data breach notification statutes, following the lead of California’s 2003 statute.[56] California’s statute requires notification to individuals if their personal information has been released. It defines “personal information” as a person’s first name or initial and last name combined with any one or more of the following: social security number; driver’s license number or state identification card number; or account number, or credit or debit card number, in combination with any required security code, access code, or password that would permit access to an individual’s financial account; as well as medical information and health insurance information.[57] The state statutes vary widely on what constitutes a data breach and on when and whether users need to be notified.[58]
C. Difference between EU and U.S. laws
Because companies operate across borders online,[59] it is vital that they understand the privacy and data protection laws of the countries with which they do business.[60] The United States and the European Union are each other’s largest trade and investment partners.[61]
[54] 136 S. Ct. 1540 (2016). Robins, the plaintiff, alleged that Spokeo, a “people search engine,” disclosed incorrect information about him in violation of the Fair Credit Reporting Act.
[55] For a detailed list of such state data breach notification laws, see DANIEL J. SOLOVE & PAUL M. SCHWARTZ, PRIVACY LAW FUNDAMENTALS 2017, at 205-213 (IAPP 2017).
[56] Daniel Solove, Breach Notification Laws Now in All 50 States, TEACHPRIVACY (Apr. 7, 2018), https://teachprivacy.com/breach-notification-laws-now-in-all-50-states/.
[57] Confidentiality of Medical Information Act, A.B. 1298, 2007-699 Cal. Adv. Legis. Serv. 1298 (Deering), codified at CAL. CIV. CODE § 1798.29(e)(4)-(5) (West 2010).
[58] Dana Lesemann, Once More Unto the Breach: An Analysis of Legal, Technological and Policy Issues Involving Data Breach Notification Statutes, 4 AKRON INTELL. PROP. J. 203, 213 (2010), https://ssrn.com/abstract=1671082.
[59] The United States and the European Union remain each other’s largest trade and investment partners, with trade in goods and services amounting to over $1 trillion per year. MARTIN A. WEISS & KRISTIN ARCHICK, CONG. RESEARCH SERV., R44257, U.S.-EU DATA PRIVACY: FROM SAFE HARBOR TO PRIVACY SHIELD 4 (2016). In addition, cross-border data flows between the United States and Europe are the highest in the world—almost double the data flows between the United States and Latin America and 50% higher than data flows between the United States and Asia. See Joshua P. Meltzer, The Importance of the Internet and Transatlantic Data Flows for U.S. and EU Trade and Investment (Brookings, Global Economy & Development, Working Paper 79, 2014), as cited in MARTIN A. WEISS & KRISTIN ARCHICK, supra note 36, at 4.
[60] Andre R. Jaglom, Liability On-Line: Choice of Law and Jurisdiction on the Internet, or Who’s in Charge Here?, 16 INT’L L. PRACTICUM 66 (2003), http://www.thsh.com/documents/liabilityon-line.pdf.
However, with respect to privacy, there are striking differences between European and American privacy laws.[62] While the European Union’s focus is on protecting human rights and social issues, the U.S. seems to be concerned with providing a way for companies collecting information[63] to use it while balancing the privacy rights that consumers expect.[64]

Although data protection and privacy are important issues for consumers, as the Cambridge Analytica hearings demonstrated, the U.S. does not provide adequate privacy and data security protection. While Facebook’s founder was brought to task for “allowing” Cambridge Analytica to happen, the Senate demonstrated a complete lack of understanding of modern technology and is woefully ill-equipped to create adequate privacy and data security laws. It seems that Congress is relying on the FTC to use Section 5 of the FTC Act (regarding deceptive and unfair trade practices) to monitor and enforce privacy and data security in the U.S.

The vast differences between U.S. and EU privacy law directly relate to the differences in the respective ideologies behind these laws.[65] While the U.S. Constitution does not mention a right to privacy,[66] it is expressly included in the Charter of Fundamental Rights of the European Union.[67]
[61] For the amount of their trade, see note 27 supra.
[62] Edward R. Alo, EU Privacy Protection: A Step Towards Global Privacy, 22 MICH. ST. INT’L L. REV. 1095, 1110-1111 (2013), http://digitalcommons.law.msu.edu/cgi/viewcontent.cgi?article=1155&context=ilr.
[63] According to a Congressional Research Service report, “The U.S. Department of Commerce recently reiterated that the large-scale collection, analysis, and storage of personal information is central to the Internet economy; and that regulation of online personal information must not impede commerce.” GINA STEVENS, CONG. RESEARCH SERV., R41756, PRIVACY PROTECTIONS FOR PERSONAL INFORMATION ONLINE 2 (2011).
[64] For a discussion of these issues, see GRY HASSELBALCH & PERNILLE TRANBERG, DATA ETHICS: THE NEW COMPETITIVE ADVANTAGE 132-133 (2016), https://dataethics.eu/en/data-monopolies-value-clashes/.
[65] Alo, supra note 62, at 1110-1111.
The Charter of Fundamental Rights, whose rights, freedoms, and principles were recognized as having the same value as the European Union treaties[68] (the Treaty on European Union[69] and the Treaty on the Functioning of the European Union[70]), and its treatment of personal data protection as a fundamental right represent the vast difference between how the United States and the European Union view personal information and, as a result, the policy behind their respective privacy laws. For example, in the European Union, unless some other legal basis for processing personal data applies (e.g., when processing is necessary for performance of a contract to which the data subject is a party),[71] companies that provide online services to residents of the European Union can be required to obtain documented “hard consent” from customers before processing and storing their data.[72] This is the exact opposite of what U.S. companies do. Rather than requiring users to opt in to the sharing of their personal information, people using their services must actively opt out or, in the alternative, stop using the service.[73] In addition, U.S. law does not prevent companies from sharing the information they collect with third parties, provided such activities are disclosed.[74] This means consent is not required under U.S. law for secondary uses of data. As the next section will demonstrate, U.S. companies have been running afoul of European privacy law for over a decade due to a business model which does not work with European privacy law.
[66] As pointed out by one scholar, “The word ‘privacy’ does not appear in the United States Constitution. Yet concepts of private information and decisionmaking are woven through the entire document, and courts have developed a substantial jurisprudence of constitutional privacy.” WILLIAM MCGEVERAN, PRIVACY AND DATA PROTECTION LAW 3 (2016). See also ELLEN ALDERMAN & CAROLINE KENNEDY, THE RIGHT TO PRIVACY xiii (1995). This having been said, one scholar reminds us that “it was a matter of general agreement, in the 1890s, that the Constitution prohibited prosecutors and civil plaintiffs from rummaging through private papers in search of sexual secrets or anything else.” JEFFREY ROSEN, THE UNWANTED GAZE 5 (2000). Two commentators speak of “information privacy,” contrasting it with “decisional privacy,” the latter of which has been at the heart of Supreme Court cases. “Information privacy law is an interrelated web of tort law, federal and state constitutional law, federal and state statutory law, evidentiary privileges, property law, contract law, and criminal law.” DANIEL J. SOLOVE & PAUL M. SCHWARTZ, INFORMATION PRIVACY LAW 2 (5th ed. 2015). In 1890, Warren and Brandeis made the argument in the Harvard Law Review that the right of privacy is implied by the Constitution and derived from both common law and the concepts of “the right to be left alone” and the right to keep personal information out of the public domain. Samuel D. Warren & Louis D. Brandeis, The Right to Privacy, 4 HARV. L. REV. 193 (1890). This right to privacy has been adopted by the Supreme Court and throughout the states. William M. Beaney, The Constitutional Right to Privacy in the Supreme Court, 1962 SUP. CT. REV. 212-251.
[67] Article 8(1) of the Charter of Fundamental Rights of the European Union provides that: “Everyone has the right to the protection of personal data concerning him or her.” Charter of Fundamental Rights of the European Union art. 8(1), 2000 O.J. (C 364) 1, 10. In addition to protecting the personal information of those in the European Union, it also protects their right to private or family life. Id. art. 7, at 10.
[68] Consolidated Version of the Treaty on European Union art. 6(1), Oct. 26, 2012, 2012 O.J. (C 326) 13.
[69] Id.
[70] Consolidated Version of the Treaty on the Functioning of the European Union, Oct. 26, 2012, 2012 O.J. (C 326) 47.
[71] See the various legal bases for processing under GDPR art. 6 in Section III.B.4 infra.
[72] See Allison Grande, Cybersecurity Policy to Watch for the Rest of 2017, LAW360 (July 12, 2017), https://www.law360.com/articles/937323/cybersecurity-policy-to-watch-for-the-rest-of-2017.
[73] For example, the GLBA only requires that financial institutions provide customers an opportunity to opt out of the sharing of their nonpublic personally identifiable customer information with non-affiliated third parties. The proposed FCC rule, FCC Report and Order 16-148 (Nov. 2, 2016), https://www.documentcloud.org/documents/3733807-FCC-Rule.html, which was nullified in May 2017, would have required institutions to receive affirmative consent (opt in) to share information. See also Anne T. McKenna, Pass Parallel Privacy Standards or Privacy Perishes, 65 RUTGERS L. REV. 1041 (2013), http://elibrary.law.psu.edu/fac_works/350/.
II. Enforcement actions against U.S. tech companies
A. EU enforcement actions
Over the past decade, EU member state DPAs have brought enforcement actions against major U.S. technology companies for privacy law violations. In this section we consider several of these cases, focusing on those involving Google and Facebook as representative of the types of actions brought under privacy law.
1. Google
a) Google Street View Privacy Case
In 2007, Google launched its Street View program in the U.S., whereby vehicles were fitted with cameras and other equipment that took panoramic photographs along roadways to complement its Google Maps app. It turned out that aside from photographing images of houses and businesses along these roadways, the vehicles picked up GPS data, Wi-Fi network names, and possibly content from open wireless networks.[75] Google eventually admitted to having “been mistakenly collecting samples of payload data from open networks,” which may have included e-mail, text, photographs, or even websites that people were viewing, while its Street View cars were traveling around taking photographs.[76]
[74] Brian Fung & Craig Timberg, The FCC Just Passed Sweeping New Rules to Protect Your Online Privacy, WASH. POST (Oct. 27, 2016), https://www.washingtonpost.com/news/the-switch/wp/2016/10/27/the-fcc-just-passed-sweeping-new-rules-to-protect-your-online-privacy/?utm_term=.8e1da637b0c3.
[75] See VIKTOR MAYER-SCHÖNBERGER & KENNETH CUKIER, BIG DATA 109 (Mariner Books 2014) (2013).
This collection of data was discovered after German authorities asked to audit the cars[77] because homeowners feared that images of their domiciles could lead to burglary. As a result, Google agreed to allow houses to be blurred out of images on request (by “opt out”).[78] At the time, all 27 EU member states had created data protection laws derived from the 95 Directive. Although the laws varied from jurisdiction to jurisdiction, they all prohibited the interception of personal data, and some member states made it a criminal offense.[79] Germany issued a fine of $189,000 against Google as a result of this data privacy violation. The capturing of data by Google Street View resulted in similar EU member state DPA enforcement actions in Europe[80] and in various other nations worldwide.[81] In April 2011, Belgium settled with Google for €150,000 to close charges over the company’s unauthorized collection of private data from unencrypted Wi-Fi networks by Google Street View.[82] The French Commission Nationale de l’Informatique et des Libertés (CNIL), the DPA in France, sanctioned Google Street View’s collection of personal data on March 17, 2011, in what was then a record fine of €100,000.[83]

The failure by Google to provide adequate information to the data subjects about the processing of their data also violated French law.[84] A report regarding the CNIL press release on the case indicated that “inspections carried out by the CNIL in late 2009 and early 2010 demonstrated that vehicles (Google Street View cars used for Google Maps services) deployed on the French territory collected and recorded not only photographs but also data transmitted by individuals’ wireless Wi-Fi networks without their knowledge (citation omitted).”[85]
[76] See, e.g., Maggie Shiels, Google Admits Wi-Fi Data Collection Blunder, BBC NEWS (May 15, 2010), http://news.bbc.co.uk/2/hi/technology/8684110.stm.
[77] Id.
[78] VIKTOR MAYER-SCHÖNBERGER & KENNETH CUKIER, supra note 75, at 154.
[79] Andrea Ward & Paul Van den Bulck, Differing Approaches to Data Protection/Privacy Enforcement and Fines, Through the Lens of Google Street View, IAPP: THE PRIVACY ADVISOR (June 1, 2013), https://iapp.org/news/a/2013-06-01-differing-approaches-to-data-protection-privacy-enforcement-and/.
[80] Press Release, Federal Data Protection and Information Commissioner (FDPIC), Judgment in Google Street View Case: Court Finds in Favour of the FDPIC (Apr. 4, 2011), https://www.edoeb.admin.ch/edoeb/en/home/data-protection/Internet_und_Computer/online-services/google-street-view.html.
[81] See Investigations of Google Street View, EPIC.ORG, https://epic.org/privacy/streetview/ (last visited Jan. 13, 2018).
[82] See Andrea Ward & Paul Van den Bulck, supra note 79.
[83] The CNIL found that SSID information and MAC addresses collected by Google allowed identification of data subjects if combined with other location data collected, and that the same was true of “payload data” that Google had inadvertently collected as well. The significance of such data allowing identification is that it would then be considered “personal data,” meaning that EU data protection law obligations, as implemented in member state law, would apply to its processing. Myria Saarinen (Latham & Watkins LLP), France’s CNIL Announces a Record Fine of €100,000, LEXOLOGY (Mar. 28, 2011), https://www.lexology.com/library/detail.aspx?g=967da607-a9a1-41f9-a082-5fb0dbbed204.
[84] Id.
The Netherlands DPA, the College Bescherming Persoonsgegevens, issued an order against Google on March 23, 2011 in connection with violations related to Google Street View, as a result of an investigation which indicated that the company had used its Street View vehicles to collect data

   on more than 3.6 million Wi-Fi routers in the Netherlands, both secured and unsecured, during the period March 4, 2008, to May 6, 2010, and had also calculated a geolocation for each router. Such acts constituted a violation of the PDPA [the Dutch personal data protection act]. According to a DPA press release, “MAC [media access control] addresses combined with a calculated geolocation constitute personal data in this context, because the data can provide information about the owner of the WiFi router in question” (citation omitted).[86]

The order could have resulted in a fine of up to €1 million against Google in this instance,[87] but Google was able to avoid it by complying.[88] Google was also recently fined $1.4 million by Italy for Google Street View’s privacy violations.[89]

b) Google privacy policy case

The first set of EU DPA privacy policy enforcement actions against Google came as a result of revisions made to the company’s privacy policy in March 2012.[90] Google indicated that it was going to consolidate the privacy policies of its some 70 products into one. However, this new policy would allow it to share data between companies and products.[91]
[85] Nicole Atwill, Online Privacy Law: France, LIBRARY OF CONGRESS (June 2012, last updated June 5, 2015), https://www.loc.gov/law/help/online-privacy-law/france.php.
[86] Wendy Zeldin, Online Privacy Law: Netherlands, LIBRARY OF CONGRESS (June 2012, last updated June 5, 2015), https://www.loc.gov/law/help/online-privacy-law/netherlands.php.
[87] See Press Release, College bescherming persoonsgegevens (CBP) (Dutch DPA), Dutch DPA Issues Several Administrative Orders Against Google (Apr. 19, 2011), https://autoriteitpersoonsgegevens.nl/en/news/dutch-dpa-issues-several-administrative-orders-against-google.
[88] See Stephen Gardner, Dutch DPA Concludes That Google Is in Breach of Data Protection Act, BNA (Dec. 2, 2013), https://www.bna.com/dutch-dpa-concludes-n17179880411/#!.
[89] Google Pays a 1 Million Euro Fine Imposed by the Italian DPA Because of Google’s Street View Service, GARANTE PER LA PROTEZIONE DEI DATI PERSONALI (Apr. 3, 2018), http://www.garanteprivacy.it/web/guest/home/docweb/-/docweb-display/docweb/3041085.
[90] Letter from Article 29 Data Prot. Working Party to Larry Page, Chief Exec. Officer, Google Inc. 1 (Oct. 16, 2012), http://goo.gl/kL1Wg6.
In response to this change, the
Article 29 Data Protection Working Party (WP 29), which is an influential advisory group that
includes representatives of EU member state DPAs,
92
made recommendations to Google for
modifications to its practices and its privacy policy to bring them into compliance.
93
When Google
failed to make the recommended changes to its practices and policy,
94
a number of DPAs
brought enforcement actions against Google for violating member state law which
required data controllers to obtain user consent to the sharing of their information
across companies and products, among other violations.
95
France ordered Google to (1) define the specific purposes of processing users’ personal data,
(2) define retention periods for personal data not to exceed the period necessary for the purposes
collected, and (3) inform users and obtain consent prior to storing cookies on their devices.
96
Similarly, the five other DPAs cited Google for failing to comply with similar provisions of their
data protection legislation in relation to Google’s data collection practices.
97
Google’s failure to
comply resulted in a €150,000 fine by the CNIL.
98
In addition, Google was ordered to cease its noncompliant personal data processing and to publish a notice on its French home page for 48 hours detailing the order and the fine.
99
91
Mark Hachman, Google Overhauls, Consolidates Privacy Policies, PC MAG (Jan. 24, 2012 8:59PM EST) ,
https://www.pcmag.com/article2/0,2817,2399308,00.asp.
92
WP29 was created under 95 Directive art. 29.
93
See Letter from Article 29 Data Prot. Working Party, supra note 90.
94
See CNIL, Google privacy policy: WP29 proposes a compliance package, CNIL (Sept. 25, 2014),
https://www.cnil.fr/en/google-privacy-policy-wp29-proposes-compliance-package.
95
The enforcement action was taken by the DPAs of France, Germany, Italy, the Netherlands, Spain, and the United
Kingdom. See W. Gregory Voss, European Union Data Privacy Law Developments, 70 BUS. LAW. 253, 254 (2014). In
addition, the UK’s DPA ordered Google to sign a consent decree improving the way it collected personal information in
the UK. Brian Davidson, UK—Google To Change Privacy Policy After ICO Investigation, IAPP: THE PRIVACY
ADVISORY (Feb. 24, 2015), https://iapp.org/news/a/ukgoogle-to-change-privacy-policy-after-ico-investigation/.
96
Lance Whitney, France orders Google to change its privacy policies, CNET (June 20, 2013 6:33 AM PDT),
https://www.cnet.com/news/france-orders-google-to-change-its-privacy-policies/.
97
Id.
98
Id.
99
Délibération n°2013-420 de la formation restreinte prononçant une sanction pécuniaire à l’encontre de la société X
(Deliberation No. 2013-420 of the Sanctions Committee of CNIL Imposing a Financial Penalty
Against Company X) 28 (Jan. 3, 2014), http://goo.gl/exjL12. (The decision has now been rendered anonymous and
Google Inc. is referred to as “Société X” (Company X).) This fine came out just after Google was fined for violating
Spanish data protection laws (Dec. 18, 2013). A total fine of €900,000 was imposed by the Spanish DPA. See Legal Grounds of the Decision R/02892/2013 of 18 December 2013 on the Infringement Proceedings Instigated by the Spanish Data Protection Agency Against the Entities Google Inc. and Google Spain, S.L., http://goo.gl/9gTpOi. The Dutch DPA also found that Google's combining of personal data breached the Dutch data protection act. College Bescherming Persoonsgegevens [Dutch DPA], Investigation into the combining of personal data by Google, Report of Definitive Findings (Nov. 11, 2013) (English translation), https://autoriteitpersoonsgegevens.nl/sites/default/files/downloads/mijn_privacy/en_rap_2013-google-privacypolicy.pdf. Subsequently (on December 15, 2014), the Dutch DPA imposed an incremental penalty payment (an order to stop violations of the law or face a penalty) on Google on this basis, and if it did not take prescribed corrective measures by the end of February 2015, it was to risk being subject to a fine of up to €15 million. Press Release, College Bescherming Persoonsgegevens [Dutch DPA], CBP issues sanction to Google for infringements privacy policy (Dec. 15, 2014), https://autoriteitpersoonsgegevens.nl/en/news/cbp-issues-sanction-google-infringements-privacy-policy. The Dutch DPA later found that Google had indeed taken action, and did not have to make a penalty payment at the time, but still had to obtain proper unambiguous consent from users to the combining of their personal data. Press Release, College Bescherming Persoonsgegevens [Dutch DPA], Privacy campaign Google following possible sanction of the Dutch DPA (July 9, 2015), https://autoriteitpersoonsgegevens.nl/en/news/privacy-campaign-google-following-possible-sanction-dutch-dpa.
The second set of enforcement actions resulting from the 2012 privacy policy changes was instituted by the Italian DPA in 2014 for violation of Italian law; it ordered Google to provide more "effective information notices" to its users
100
and to obtain prior consent from its users for the
processing of their personal information.
101
This included users of both Gmail and Google Search. It was discovered that Google was processing information in users' Gmail accounts for the purposes of behavioral advertising
102
by using cookies and engaging in other profiling activities in order to
create targeted ads (by analyzing the user’s navigation behaviors).
103
The order also set forth time frames within which Google was to respond to data deletion requests by authenticated users holding Google accounts, where the user requested deletion of his or her data in accordance with data protection law.
104
100
Decision Setting Forth Measures Google Inc. Is Required to Take to Bring the Processing of
Personal Data Under Google’s New Privacy Policy into Line with the Italian Data Protection Code (July 10, 2014), para.
1, http://goo.gl/EgAT1x.
101
Decision Setting Forth Measures Google Inc. Is Required to Take to Bring the Processing of
Personal Data Under Google’s New Privacy Policy into Line with the Italian Data Protection Code (July 10, 2014),
http://goo.gl/EgAT1x.
102
Decision Setting Forth Measures Google Inc. Is Required to Take to Bring the Processing of
Personal Data Under Google’s New Privacy Policy into Line with the Italian Data Protection Code (July 10, 2014), para.
2, http://goo.gl/EgAT1x.
103
Id. It should be noted that the definition of "processing" (or "processing of personal data") is very broad in the 1995
Directive: “any operation or set of operations which is performed upon personal data, whether or not by automatic
means, such as collection, recording, organization, storage, adaptation or alteration, retrieval, consultation, use,
disclosure by transmission, dissemination or otherwise making available, alignment or combination, blocking, erasure or
destruction.” 1995 Directive, art. 2 (b).
104
Id. The decision specifically carves out “right to be forgotten” deletion requests, and in the
grounds for the decision, the Italian DPA refers to the Court of Justice of the European Union
Case C-131/12, Judgment of the Court (Grand Chamber), Google Spain SL v. Agencia Española de Protección de Datos
(AEPD) (May 13, 2014), 2014 E.C.R. 317, eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:62012CJ0131, as well as guidelines then to come from the Article 29 Data Protection Working Party for treatment of "right to be forgotten" deletion requests, stating that "the Garante will limit itself, at this stage, to issuing specific instructions with regard to data deletion requests lodged by . . . users holding Google accounts . . . and will limit the scope of application of this decision to data deletion requests that concern features other than Google Search." See id. at dec. para. 5. The Italian DPA further required that "the relevant data should be deactivated over the initial 30 days." Id. at dec. para. 3(a). It was further stipulated that "during the period in question the only processing operation allowed in respect of the relevant data shall be the recovering of lost information," while encryption must be used, or "where necessary" anonymization techniques, in order to protect the data against unauthorized access. Id. at dec. para. 3(b).
In 2014, the Hamburg DPA, acting for Germany, also issued an order noting Google’s
violations of German data protection law with respect to its data processing activities and user
profiling, such as the use of the substantial information Google collects about users’ habits combined
with other information Google obtains, such as location data.
105
Then on September 23, 2014, the
WP29 confirmed the findings of a meeting between the WP29, Google, and the above-mentioned
DPAs summarizing what Google was ordered to do and leaving open the possibility of adding
additional requirements at a later date.
106
What is significant about these actions is that they demonstrate the commitment to data protection in the European Union. They reiterate the WP 29 and EU DPAs' positions that, under EU data protection law, notices must be given for each separate service provided and that the sharing of information between different services without user consent is prohibited, and they emphasize the data retention time limit requirements and the importance of complying with the DPAs' regulations and orders.
105
Press Release, Der Hamburgische Beauftragte für Datenschutz und Informationsfreiheit [Hamburg Comm’r of Data
Prot. & Freedom of Info.], Major Changes in Google’s Data Processing Required—Data Protection Commissioner Issues
Administrative Order (Sept. 30, 2014), http://goo.gl/3kjBYk.
For example it may be possible
• to compile detailed travel profiles by evaluating location data,
• to detect specific interests and preferences by evaluating search engine use,
• to assess the user’s social and financial status, their whereabouts and many other of their habits by
analysing the collected data and
• to infer information such as friend relationships, sexual orientation and relationship status.
106
Letter from Article 29 Data Prot. Working Party to Larry Page, Chief Exec. Officer, Google Inc.
(Sept. 23, 2014), available at http://goo.gl/5lxDe2. The appendix, which deals with the issues of information
requirements (including those for specific services such as YouTube, Google Analytics, and DoubleClick), user controls,
and data retention policies, is at http://goo.gl/t7sLSZ.
c) Google Spain - “Right to Be Forgotten”
The Google Spain case involved an action by Mr. Mario Costeja González after Google refused to remove a link to a 1998 newspaper article regarding a real estate foreclosure that was part of social security debt collection activities against Costeja González. Costeja González argued that the old link contained obsolete information and was prejudicial to him. He brought the case before the Spanish DPA (AEPD), which upheld his complaint against Google Spain SL and Google Inc. Google Spain SL and Google Inc. challenged the AEPD's decision in the Spanish Audiencia
Nacional (National High Court), which then referred relevant questions about the 95 Directive to the
Court of Justice of the European Union (ECJ).
107
The ECJ ruled in Costeja González's favor
indicating that an individual’s objection to a search engine’s link to personal information would
require the weighing of the public’s interest in the information, the relevance or obsolescence of the
information, and the individual’s right to keep sensitive data out of the public eye.
108
In response to
the court’s order, at the end of May 2014 Google set up an online form for exercising the right to
delist.
109
In addition, in the fall of 2014 Google set up an advisory council to help it in determining
when to honor delisting requests and held hearings in major EU capital cities.
110
The advisory
107
For a discussion of this case, see W. Gregory Voss, The Right to Be Forgotten in the European Union: Enforcement
in the Court of Justice and Amendment to the Proposed General Data Protection Regulation, 18 J. INTERNET L. 3, 3-5
(July 2014).
108
Case C-131/12, supra note 104. See Stephanie Condon, Google 'right to be forgotten' case goes to top EU court,
ZDNET (July 19, 2017 18:05 GMT (19:05 BST)), https://www.zdnet.com/article/google-right-to-be-forgotten-case-goes-
to-top-eu-court/. One scholar has noted how the Google Spain case is in direct contravention of U.S. ideology especially
as it concerns the public’s right to know and how the ECJ’s opinion contains logical inconsistencies. Robert C. Post,
Data Privacy and Dignitary Privacy: Google Spain, the Right to Be Forgotten, and the Construction of the Public
Sphere, 67 DUKE L. J. 981 (2018),
https://scholarship.law.duke.edu/cgi/viewcontent.cgi?referer=&httpsredir=1&article=3928&context=dlj.
109
See Google sets up 'right to be forgotten' form after EU ruling, BBC (May 30, 2014),
http://www.bbc.com/news/technology-27631001. As of May 17, 2018, Google had received 680,578 delisting requests,
examined 2,535,275 URLs after delisting requests, and had deleted 951,349 (44%) URLs from search results.
Transparency Report, GOOGLE.COM, https://transparencyreport.google.com/eu-privacy/overview (last visited on May 17,
2018).
110
Jef Ausloos, Forget, erase and delist, but don’t forget the broader issue, INTERNET POLICY REVIEW (Jan. 22, 2015),
https://policyreview.info/articles/news/forget-erase-and-delist-dont-forget-broader-issue/353. See also Advisory
Council, GOOGLE, https://archive.google.com/advisorycouncil/ (last visited on May 23, 2018).
council’s report suggested that since 95% of internet searches in Europe used country-specific
domains (a figure supplied by Google), that Google would be compliant if it removed the
information from the country domain at issue.
111
Google alleged that it had complied with the 2014
ruling by removing the results from the country-code top-level domains (ccTLDs) of the
search engines corresponding to the affected countries in Europe (e.g., .de for Germany, .es for
Spain, .fr for France, etc.). However, WP29 indicated that the guidelines
112
required Google to remove the information from all searches and all domains. On May 15, 2015, the CNIL issued an
order for Google to completely remove the information from all of the possible searches and
domains.
113
Google replied that “no one country should have the authority to control what content
someone in a second country can access” and that the CNIL did not have global authority to issue
such an order.
114
The ECJ has yet to issue its decision on whether Google must remove the complained-of data from searches worldwide or just in Europe.
115
2. Facebook
a) Schrems (Safe Harbor case)
This case was brought by Maximillian Schrems, an Austrian citizen, with respect to Facebook's cross-border transfer of data. As a Facebook user, Schrems had his personal data transferred from servers in Ireland to servers in the U.S. The 95 Directive only permits cross-border transfers if the receiving country can ensure an adequate level of protection by reason of domestic law or international agreements. Because the Snowden disclosures revealed that U.S. law offered no
real protection against surveillance by the U.S. government with respect to data transferred there,
Schrems filed an action with the Irish Data Protection Commissioner (DPC). The DPC dismissed
111
LUCIANO FLORIDI ET AL., THE ADVISORY COUNCIL TO GOOGLE ON THE RIGHT TO BE FORGOTTEN 19 (2015),
https://static.googleusercontent.com/media/archive.google.com/en//advisorycouncil/advisement/advisory-report.pdf.
112
Art. 29 Data Prot. Working Party, Guidelines on the Implementation of the Court of Justice of the European Union
Judgment on “Google Spain and Inc. v. Agencia Española de Protección de Datos (AEPD) and Mario Costeja González”
(Nov. 26, 2014) (WP 225).
113
95 Directive art. 25.
114
Peter Fleischer, Implementing a European, not global, right to be forgotten, GOOGLE EUROPE BLOG (July 30, 2015),
https://europe.googleblog.com/2015/07/implementing-european-not-global-right.html.
115
See Stephanie Condon, supra note 108.
Schrems’ case indicating that the transfer was permitted under the Safe Harbor agreement
116
between the U.S. and Europe. After Schrems appealed, the case was sent to the ECJ which in 2015
invalidated the Safe Harbor agreement that U.S. companies had relied upon in transferring data from
Europe to the U.S.
117
The ECJ found the Safe Harbor agreement invalid in 2015 because the personal data transferred by Facebook Ireland Ltd to servers belonging to its parent company, Facebook Inc., in the U.S. did not receive adequate protection due to "the significant over-reach" of, inter alia, the National Security Agency's surveillance.
118
Specifically, the Court of Justice of the
EU further ruled that the Safe Harbor framework was invalid for several reasons: it allowed for
government interference of the directive’s protections, it did not provide legal remedies for
individuals who seek to access data related to them or to have their data erased or amended, and it
prevented national supervisory authorities from appropriately exercising their powers.
119
b) Facebook cookies cases - CNIL
In 2013, France published rules confirming that the use of cookies requires the user’s
consent. It was thereafter discovered that Facebook had been placing cookies on both users’ and
116
The 95 Directive was intended to provide guidance to the member states and harmonize privacy laws throughout
Europe. It required the member states to create laws to protect citizens’ information following the terms of the 95
Directive, although each member state could determine how to do that, and provided that personal data could not be
transferred to other countries unless those countries had similar protections in place. Because the U.S. was not
considered to have these protections, the Safe Harbor was created as discussed in I.A.2. herein. The only other countries
that qualified were Andorra, Argentina, Canada (for commercial organizations), Faeroe Islands, Guernsey, Israel, Isle of
Man, Jersey, New Zealand, Switzerland, and Uruguay. See Adequacy of the protection of personal data in non-EU
countries, EUR. COMM’N, https://ec.europa.eu/info/law/law-topic/data-protection/data-transfers-outside-eu/adequacy-
protection-personal-data-non-eu-countries_en (last visited on May 23, 2018).
117
Court of Justice of the European Union, Press Release No. 117/15 (October 6, 2015),
https://curia.europa.eu/jcms/upload/docs/application/pdf/2015-10/cp150117en.pdf.
118
See the discussion of the facts in Case C-362/14 Judgment of the Court (Grand Chamber), Maximillian Schrems v
Data Protection Commissioner (Oct. 6, 2015), ECLI:EU:C:2015:650,
http://curia.europa.eu/juris/celex.jsf?celex=62014CJ0362&lang1=en&type=TXT&ancre=, paras. 28-30. According to
the NSA, the U.S. Intelligence Community relied on Section 702 of the Foreign Intelligence Surveillance Act to compel
providers to facilitate surveillance on specific foreign targets located outside the U.S. for the purpose of acquiring critical
intelligence on issues ranging from international terrorism to cybersecurity. See Expanded Look - "Section 702" Saves
Lives, Protects the Nation and Allies, NSA (Dec. 12, 2017), https://www.nsa.gov/news-features/news-stories/2017/702-
saves-lives-protects.shtml.
119
Case C-362/14, supra note 118.
visitors’ browsers without informing them. On May 16, 2017, the CNIL announced that it had fined
jointly Facebook Inc. and Facebook Ireland €150,000 for violating the French Data Protection Act,
by collecting massive amounts of users’ “personal data” and using cookies to obtain behavioral
information, without adequately informing the users.
120
The €150,000 fine was the maximum that
was allowed under such law at the time the CNIL’s investigation began in 2014.
121
The Contact Group of the Data Protection Authorities of the European Union (Contact
Group)
122
which was formed in 2014 to address this issue, asserted that their respective national data
protection laws do apply to the processing of personal data of users and non-users by Facebook
123
consistent with case law from the European Court of Justice (the cases of Google Spain, Weltimmo
and Amazon)
124
and Article 4(1)(a) of the 95 Directive.
125
Although Facebook disputes their
authority,
126
the Contact Group maintained its position in its statement. The DPAs
120
FACEBOOK Sanctioned for Several Breaches of the French Data Protection Act, CNIL
(May 16, 2017), https://www.cnil.fr/en/facebook-sanctioned-several-breaches-french-data-protection-act.
121
Loi no. 2016-1321 du 7 octobre 2016 pour une République numérique [Act no. 2016-1321 of Oct. 7, 2016 for a
Digital Republic], JOURNAL OFFICIEL DE LA REPUBLIQUE FRANÇAISE [J.O.] [OFFICIAL GAZETTE OF FRANCE], Oct. 8,
2016, art. 65(I), https://www.legifrance.gouv.fr/jo_pdf.do?id=JORFTEXT000033202746. As a result of an amendment
made by France’s Digital Republic Act, the French Data Protection Act later authorized fines for data protection
violations of up to €3 million. Loi n° 78-17 du 6 janvier 1978 relative à l'informatique, aux fichiers et aux libertés [Act
No. 78-17 of Jan. 6, 1978 on Information Technology, Data Files and Civil Liberties], as amended, art. 47,
https://www.legifrance.gouv.fr/affichTexte.do?cidTexte=JORFTEXT000000886460&fastPos=1&fastReqId=377767711
&categorieLien=cid&oldAction=rechTexte.
122
The Contract Group consists of the DPAs from the Netherlands, France, Spain, Hamburg (on behalf of Germany) and
Belgium.
123
Common Statement by the Contact Group of the Data Protection Authorities
of The Netherlands, France, Spain, Hamburg and Belgium (May 16, 2017) [hereinafter Common Statement]
https://autoriteitpersoonsgegevens.nl/sites/default/files/atoms/files/common_statement_16_may_2017.pdf.
124
Case C-131/12 Judgment of the Court (Grand Chamber), supra note 104; Case C-230/14 Judgment of the Court (Third Chamber), Weltimmo s.r.o. v Nemzeti Adatvédelmi és Információszabadság Hatóság (ECLI:EU:C:2015:639)
(Oct. 1, 2015); and Case C-191/15 Judgment of the Court (Third Chamber) Verein für Konsumenteninformation v.
Amazon EU Sàrl (Amazon), (ECLI:EU:C:2016:612) (July 28, 2016),
http://curia.europa.eu/juris/celex.jsf?celex=62015CJ0191&lang1=en&type=TXT&ancre.
125
95 Directive art. 4(1)(a).
126
Facebook believes that the Irish data protection authority, not the French CNIL or any other EU data regulator, can
enforce data protection laws against them because their European headquarters are located in Dublin. Samuel Gibbs,
Facebook facing privacy actions across Europe as France fines firm €150k, THE GUARDIAN, (May 16, 2017 16.23 BST)
https://www.theguardian.com/technology/2017/may/16/facebook-facing-privacy-actions-across-europe-as-france-fines-
firm-150k.
pointed to the presence of multiple Facebook offices in the European Union and their targeting of
advertising to users and non-users in the European Union.
127
Investigations were also conducted by Belgium, the Netherlands, Germany and Spain for
data privacy violations around the tracking of users and non-users and the use of user data for
targeted advertising. In February 2018, a Belgian court ordered Facebook to stop breaking privacy
laws by tracking people on third-party websites or risk a fine of €250,000 a day or up to €125
million if it did not comply with the court’s judgment, which Facebook is reported to be
appealing.
128
In September 2017, the Spanish DPA fined Facebook a total of €1.2 million for its violations.
129
In May 2017, the Dutch DPA concluded that Facebook had violated privacy law and
the DPA reserved the right to impose sanctions later.
130
In February 2018, the German regional court
in Berlin ruled that Facebook failed to provide enough information to users to gain informed consent
and that Facebook’s pre-checked opt-in boxes violated German privacy and data protection law.
131
The violations relate to the “quality of the information provided to users, the validity of
consent and the processing of personal data for advertising purposes.”
132
The French authorities
indicated that Facebook was using cookies to collect browsing data of internet users without their
knowledge or consent.
133
Facebook has argued that it is only subject to the privacy and data protection laws of Ireland, where its European subsidiary is located. Although the ECJ has not yet issued its opinion, an Advocate General of the Court of Justice (an adviser to the court) has indicated that
127
See Common Statement, supra note 123.
128
See Robert-Jan Bartunek, Facebook loses Belgian privacy case, faces fine of up to $125 million, REUTERS (Feb. 16,
2018 4:00 PM), https://www.reuters.com/article/us-facebook-belgium/facebook-loses-belgian-privacy-case-faces-fine-
of-up-to-125-million-idUSKCN1G01LG.
129
The Spanish DPA fines Facebook for violating data protection regulations, AGENCIA ESPAÑOLA DE PROTECCIÓN DE
DATOS (Sept. 11, 2017),
http://www.agpd.es/portalwebAGPD/revista_prensa/revista_prensa/2017/notas_prensa/news/2017_09_11-iden-
idphp.php.
130
Dutch data protection authority: Facebook violates privacy law, AUTORITEIT PERSOONSGEGEVENS (May 16, 2017),
https://autoriteitpersoonsgegevens.nl/en/news/dutch-data-protection-authority-facebook-violates-privacy-law.
131
Germany court rules Facebook personal data usage illegal, JURIST (Feb. 12, 2018),
http://www.jurist.org/paperchase/2018/02/germany-court-rules-facebook-personal-data-usage-illegal.php
132
See Samuel Gibbs, supra note 126.
133
Id.
social networks are subject to the oversight of other EU member states when they have a presence in
such a state or are collecting data from individuals in those states.
134
B. U.S. Data Privacy Law Enforcement Actions
Although U.S. actions against U.S. tech companies have been relatively rare, this section will compare U.S. enforcement activities arising from the same or similar conduct that gave rise to the European DPA actions.
1. Google
a) Google street view
In May 2010, after Congress became aware of European regulators' investigations of Google Street View, which had collected massive amounts of Wi-Fi data, it asked the FTC to investigate.
135
In 2011, the FTC dropped its investigation into Google Street View, even as countries
in the EU were assessing fines against Google for violating their privacy laws, because Google
assured the FTC that it had stopped this practice.
136
According to the FTC:
To this end, we note that Google has recently announced improvements to its internal
processes to address some of the concerns raised above, including appointing a director of
privacy for engineering and product management; adding core privacy training for key
employees; and incorporating a formal privacy review process into the design phases of new
initiatives. The company also publicly stated its intention to delete the inadvertently collected
payload data as soon as possible. Further, Google has made assurances to the FTC that the
company has not used and will not use any of the payload data collected in any Google
product or service, now or in the future. This assurance is critical to mitigate the potential
harm to consumers from the collection of payload data. Because of these commitments, we
are ending our inquiry into this matter at this time. (citations omitted.)
137
Around the same time, the Federal Communications Commission (FCC) opened an
investigation into Google’s Street View activities after EPIC filed a complaint, asking the FCC to
134
Natasha Lomas, In Europe, Facebook’s data law buffer looks to be on borrowed time, TECHCRUNCH (Oct. 24, 2017),
https://techcrunch.com/2017/10/24/in-europe-facebooks-data-law-buffer-looks-to-be-on-borrowed-time/.
135
EPIC v. FTC (Google Street View), EPIC.ORG, https://epic.org/privacy/streetview/foia_1/default.html (last visited on
May 23, 2018).
136
FTC: Investigating Google Street View is a “waste of summer,” EPIC.ORG (Jan. 20, 2011),
https://epic.org/2011/01/ftc-investigating-google-stree.html.
137
Letter from Fed. Trade Comm’n to Albert Gidari, Esq., Perkins Coie LLP (Counsel to Google) (Oct. 27, 2010),
https://www.ftc.gov/sites/default/files/documents/closing_letters/google-inquiry/101027googleletter.pdf.
investigate violations of Section 705 of the Communications Act, which adds additional restrictions to the Federal Wiretap Act's prohibition on the unauthorized interception of communication "by wire or radio."
138
Section 705 requires establishing both the interception and use of a communication,
whereas the Wiretap Act is violated by interception alone. Although the FCC fined Google $25,000
for obstructing its investigation, it never made a final determination that the collection of Wi-Fi data violated federal law. Google was not required to turn over the intercepted data, which it alleged to be a trade secret, and the key witness asserted his Fifth Amendment right against self-incrimination.
139
Thus, no action holding Google accountable for its Street View activities was ever concluded in the U.S., in contrast to the European findings that such activities violated the law.
b) Google Buzz/Safari
In 2010, Google launched the social media platform Google Buzz, which allowed users to share information via posts that could be designated public or private. Google then prepopulated the platform with users' email addresses and names, as well as the names and email addresses of their contacts. The FTC considered this an unfair practice because Google had previously represented in its Gmail privacy policy that the information provided to create a Gmail account would only be used for email.
140
In addition, the posts were made public by default, contrary to Google's privacy policy, which indicated that Google would seek a user's consent prior to using his or her information for a purpose other than that for which it was initially collected.
141
As is typical with consent decrees, Google agreed
138
47 U.S.C. § 605 (1996).
139
Charles Arthur, Google fined by FCC over Street View, THE GUARDIAN, (Apr. 16, 2012 08.05 BST),
https://www.theguardian.com/technology/2012/apr/16/google-fined-fcc-street-view.
140
Google, Inc.; Analysis of Proposed Consent Order to Aid Public Comment [File No. 102 3136], 76 FED. REG. 65, 66
(F.T.C. Apr. 5, 2011), https://www.ftc.gov/sites/default/files/documents/cases/2011/04/110405googlebuzzfrn.pdf.
141
“Part I of the proposed order prohibits Google from misrepresenting the privacy and confidentiality of any ‘‘covered
information,’’ as well as the company’s compliance with any privacy, security, or other compliance program, including
but not limited to the U.S.-EU Safe Harbor Framework. . . Part II of the proposed order requires Google to give Google
users a clear and prominent notice and to obtain express affirmative consent prior to sharing the Google user’s
information with any third party in connection with a change, addition or enhancement to any product or service, where
such sharing is contrary to stated sharing practices in effect at the time the Google user’s information was collected. . .
Part III of the proposed order requires Google to establish and maintain a comprehensive privacy program that is
reasonably designed to: (1) Address privacy risks related to the development and management of new and existing
products and services, and (2) protect the privacy and confidentiality of covered information. The privacy program must
be documented in writing and must contain privacy controls and procedures appropriate to Google’s size and
complexity, the nature and scope of its activities, and the sensitivity of covered information. . . Part IV of the proposed order requires that Google obtain within 180 days, and on a biennial basis thereafter for twenty (20) years, an assessment and report from a qualified, objective, independent third-party professional, certifying, among other things, that: it has in place a privacy program that provides protections that meet or exceed the protections required by Part III of the proposed order; and its privacy controls are operating with sufficient effectiveness to provide reasonable assurance that the privacy of covered information is protected." Id.
that it would not do this again.
142
But in 2012, Google agreed to pay a $22.5 million fine to the FTC
for violating the consent decree by representing to users of Apple's Safari browser that it would not place cookies on their browsers or use them for targeted advertisements, when it in fact did.
143
c) Google Privacy Policy
In 2012, Google’s changes to its privacy policy (allowing Google to combine personal
information from one of its services with another), which resulted in a fine by the CNIL and others
as mentioned above, prompted a complaint by EPIC in federal court in the District of Columbia demanding that the FTC enforce its consent decree with Google, which required Google to expressly permit users to opt out prior to Google sharing information with third parties.
144
The court dismissed EPIC's complaint, indicating that the FTC had discretion over which actions to bring.
145
When
Google tried to change its privacy policy in 2016 permitting the combination of DoubleClick’s data
with its own, Consumer Watchdog filed a complaint with the FTC asking it to investigate this
change.
146
This change was promoted to the public as giving users greater control over their data but failed to expressly inform them that Google would be combining users' PII with DoubleClick's browsing
142
Press Release, Fed. Trade Comm’n, FTC Charges Deceptive Privacy Practices in Googles Rollout of Its Buzz Social
Network (Mar. 30, 2011), https://www.ftc.gov/news-events/press-releases/2011/03/ftc-charges-deceptive-privacy-
practices-googles-rollout-its-buzz.
143
Press Release, Fed. Trade Comm’n, Google Will Pay $22.5 Million to Settle FTC Charges it Misrepresented Privacy
Assurances to Users of Apple's Safari Internet Browser (Aug. 9. 2012), https://www.ftc.gov/news-events/press-
releases/2012/08/google-will-pay-225-million-settle-ftc-charges-it-misrepresented.
144
Casey Johnston, Privacy group demands FTC force Google to roll back privacy policy changes, ARS TECHNICA (Feb.
9, 2012 6:44 PM), https://arstechnica.com/gadgets/2012/02/privacy-group-demands-ftc-force-google-to-roll-back-
privacy-policy-changes/.
145
FTC Chairman: Google Users Face a "brutal choice" -- Europeans: "Google's new policy does not meet the
requirements of the European Directive on Data Protection.", EPIC.ORG (Feb. 28, 2012), https://epic.org/2012/02/ftc-
chairman-google-users-face.html.
146
Complaint, In The Matter Of Google Inc.’s Change In Data Use Policies, Consumer Watchdog and Privacy Rights
Clearing House (F.T.C. Dec. 16, 2016), http://www.consumerwatchdog.org/resources/ftc_google_complaint_12-5-
2016docx.pdf.
data.
147
In August 2017, EPIC filed a complaint with the FTC against Google for using
credit card information to evaluate the success of ads without giving consumers a reasonable way to
opt out of the collection and use of their information.
148
There has been no investigation or
resolution by the FTC as of this time.
2. Facebook
a) Facebook privacy policy
In 2011, the FTC brought an action against Facebook because of changes it made to its website, which contradicted what it had told its users. Although users could specify in their privacy settings that their data would only be shared with "Only Friends," it was in fact shared with third parties. Although Facebook denied this, the FTC alleged, and included in its consent decree, that advertisers had been made privy to personally identifiable information of Facebook users when those users clicked on an ad in their feed. In addition, the complaint alleged that user content could still be accessed even after a user deleted his or her account. The FTC also concluded that Facebook did not comply with the US-EU Safe Harbor Framework despite certifying that it did.
149
147
Cynthia J. Larose & Michael B. Katz (Mintz, Levin, Cohn, Ferris, Glovsky and Popeo, P.C.), 2017 Federal Trade
Commission and Google Complaint, NAT’L L. REV. (Jan. 4, 2017), https://www.natlawreview.com/article/2017-federal-
trade-commission-and-google-complaint.
148
George Lynch, Privacy Group FTC Privacy Petition Challenges Googles Ad Program, BLOOMBERG NEWS, at
https://www.bna.com/privacy-group-ftc-n73014462591/. See Complaint, In the Matter of Google, Inc., The Electronic
Privacy Information Center (EPIC) (F.T.C. July 31, 2017),
https://epic.org/privacy/ftc/google/EPIC-FTC-Google-Purchase-Tracking-Complaint.pdf.
149
“The FTC complaint lists a number of instances in which Facebook allegedly made promises that it did not keep:
• In December 2009, Facebook changed its website so certain information that users may have designated as
private – such as their Friends List – was made public. They didn't warn users that this change was coming, or get their
approval in advance.
• Facebook represented that third-party apps that users' installed would have access only to user information that
they needed to operate. In fact, the apps could access nearly all of users' personal data – data the apps didn't need.
• Facebook told users they could restrict sharing of data to limited audiences – for example with "Friends Only."
In fact, selecting "Friends Only" did not prevent their information from being shared with third-party applications their
friends used.
• Facebook had a "Verified Apps" program & claimed it certified the security of participating apps. It didn't.
• Facebook promised users that it would not share their personal information with advertisers. It did.
• Facebook claimed that when users deactivated or deleted their accounts, their photos and videos would be
inaccessible. But Facebook allowed access to the content, even after users had deactivated or deleted their accounts.
• Facebook claimed that it complied with the U.S.- EU Safe Harbor Framework that governs data transfer
between the U.S. and the European Union. It didn't.” Press Release, Fed. Trade Comm’n, Facebook Settles FTC Charges
That It Deceived Consumers By Failing To Keep Privacy Promises (Nov. 29, 2011), https://www.ftc.gov/news-
events/press-releases/2011/11/facebook-settles-ftc-charges-it-deceived-consumers-failing-keep.
31
On August 10, 2012, the FTC entered into a consent decree with Facebook regarding the
charges that Facebook deceived consumers by telling them their information on Facebook was
private and then allowing it to be shared and made public. The settlement required Facebook to take
several steps: giving consumers clear notice and obtaining their consent before sharing their
information beyond the privacy settings, maintaining a privacy program to protect consumer
information, and obtaining privacy audits from an independent third party.
150
In August 2016, the
Electronic Privacy Information Center (EPIC) filed a complaint against Facebook with the FTC for
transferring previously collected WhatsApp user data to Facebook for targeted advertising purposes.
This was alleged to be an unfair and deceptive trade practice because, at the time WhatsApp collected
the data, the privacy policy did not mention that it could be transferred to Facebook.
151
The FTC has not
made a ruling as of this time and is “carefully reviewing” EPIC’s complaint.
152
More recently, the FTC is investigating Facebook's handling of user information in light of Cambridge Analytica's access to the data of 50 to 90 million Facebook users, which may have affected the last presidential election.
153
Mark Zuckerberg testified in front of 44 members of the Senate the week of April 10, 2018. Facebook was previously called before Congress in October 2017 to answer questions on how it may have been involved in the spreading of fake news prior to the election.
154
C. Difference between EU and U.S. enforcement actions
150
FTC Approves Final Settlement With Facebook, supra note 7.
151
Complaint, In the Matter of WhatsApp, Inc., The Electronic Privacy Information Center (EPIC) and The Center for
Digital Democracy (CDD) (F.T.C. Aug. 29, 2016),
https://www.democraticmedia.org/sites/default/files/field/public/2016/epic-cdd-ftc-whatsapp-complaint-2016.pdf.
152
In re WhatsApp, EPIC.ORG, https://epic.org/privacy/internet/ftc/whatsapp/#Acquisition (last visited on May 23, 2018).
153
Tiffany Hsu & Ceclia Kang, Demands Grow for Facebook to Explain Its Privacy Policies, N.Y. TIMES (Mar. 26,
2018), https://nyti.ms/2pIJY12.
154
Austin Carr, Senators Grill Facebook, Twitter & Google On Fake News: “Your Power Scares Me,” FAST COMPANY
(Oct. 31, 2017), https://www.fastcompany.com/40489793/senators-grill-facebook-twitter-google-on-fake-news-your-
power-scares-me.
What is most telling about the FTC enforcement actions is that besides fines and promises
made by Google and Facebook, there appears to have been no further monitoring of their actions,
contrary to the consent decrees, which require biennial audits to ensure compliance with the orders.
Although EPIC continues to bring suits attempting to force the FTC to enforce these settlement
decrees, the courts have consistently held that EPIC cannot force a discretionary agency action. In
addition, the FTC itself seems hesitant to insert itself into these companies' affairs.
155
For example,
PricewaterhouseCoopers was retained by the FTC to audit Facebook's privacy practices to ensure
compliance with its 2011 consent decree. The audit, provided to EPIC pursuant to a FOIA request,
indicated that “In our opinion, Facebook’s privacy controls were operating with sufficient
effectiveness to provide reasonable assurance to protect the privacy of covered information [through
February 2017].” This audit was conducted after a significant amount of profile data was provided to
Cambridge Analytica. The FTC provided only portions of this audit, claiming the trade secret
exemption to FOIA. According to Marc Rotenberg, Executive Director of EPIC, “It’s troubling that
the FTC seems unwilling to bring any legal action against either Facebook or Google to enforce
privacy settlements.”
156
From the EU enforcement actions that we have reviewed above, several lessons may be
gleaned for compliance under EU data protection regulation. First, we have seen that EU data
protection law has extraterritorial effect, a subject we will be addressing further in the context of the
GDPR in Section III below. The jurisdiction of the EU member state DPAs may extend to U.S.
technology companies, and thus the latter must take this into consideration in their compliance
efforts. Second, the Google Street View cases have pointed out the importance of incorporating
“privacy by default and design,” a concept discussed further below, from the start to avoid violations
155
Kirk Victor, FTC failed to enforce Facebook consent decree, critics charge amid firestorm, MLEX MARKET INSIGHT
(Apr. 2, 2018), https://mlexmarketinsight.com/contact-us/ftcwatch/selected-2017-articles/ftc-failed-to-enforce-facebook-
consent-decree,-critics-charge-amid-firestorm.
156
Thomas Claburn, Facebook privacy audit by auditors finds everything is awesome!, THE REGISTER (Apr. 21, 2018
00:09), https://www.theregister.co.uk/2018/04/21/facebook_privacy_audit_finds_everything_is_awesome/.
from happening. This is especially important in the context of future potential fines and EU member
state DPAs’ powers under the GDPR. We have seen one example where good privacy-enhancing
design by Google could have avoided the problems that its Street View service encountered. In
effect, Google failed to take into account the harms its Street View data collection could have caused.
157
In addition, these cases highlight the importance of understanding the broad definition of “personal
data” under EU legislation, and the necessity of taking that into consideration when devising
compliance programs and designing new products and services.
Next, the Google Privacy Policy actions underscored the importance of engaging the relevant
EU member state DPAs prior to taking action, such as instituting a new user policy. In that case,
Google had presented the DPAs with more or less a “fait accompli,” which led to problems.
Furthermore, respect for EU data protection principles based originally on OECD guidelines, such as
requirements of transparency (including providing information/notices to the data subject as to
processing personal data, another broad term under EU law), was shown to be crucial. Other lessons
from these cases include the necessity of complying with the purpose limitation (with the necessary
definition of the purpose of collection and processing), and the limiting of the time period for
retention of data so that data is not kept indefinitely. In addition, the importance of having
procedures instituted in order to comply with requests for the exercise of data subject rights, such as
responding to data deletion requests, was made evident in the Google Spain case. There, Google
was forced to rapidly institute procedures after the ECJ decision, establishing an online link-deletion
request form. Moreover, as was seen in the Facebook Privacy case, adequate information must be
provided to the user and his or her consent should be obtained before placing a cookie on a user’s
device.
158
157
There is a conflict as to whether Google intended to collect wifi data or whether it was a bug in its software. See
David Kravets, An Intentional Mistake: The Anatomy of Google’s Wi-Fi Sniffing Debacle, WIRED (May 2, 2012 07:18
PM), https://www.wired.com/2012/05/google-wifi-fcc-investigation/.
158
This general concept of prior informed consent was enshrined in the ePrivacy Directive (Directive 2002/58/EC) by
the 2009 amendments to it (Directive 2009/136/EC), and this is expected to be modified by a proposed ePrivacy Regulation. See W. Gregory Voss, First the GDPR, Now the Proposed ePrivacy Regulation, 21 J. INTERNET L. 3, 3 & 5-6 (2017).
The Google Spain “right to be forgotten” (“right to delisting”) case served to extend an
existing right, which does not exist in the United States – the right to deletion and/or correction of
personal data when inaccurate or obsolete – to require the delisting of links to such data on the
Internet by search engines, when requested by data subjects, after a balancing of the public's interest in such information with the privacy rights of the relevant data subject. Furthermore, the
French DPA maintains that such delisting must be applied to Internet domains worldwide, not just
EU domains.
The Facebook cross-border data transfer case involving the invalidation of the Safe Harbor
was very enlightening in many respects. First, it demonstrated that a data subject has the right to
access his or her personal data held by the technology company, as Schrems exercised this right to
obtain his data from Facebook. Second, the case showed the impact of U.S. mass surveillance on
arrangements between the EU and the U.S. as discussed above in Section II.A.2(a) and as evidenced
in the Privacy Shield negotiations (see Section III C. 1 below). Finally, the ECJ’s decision
highlighted the importance attributed to data protection as a fundamental right of individuals in EU
courts. Companies relying on the Privacy Shield should bear this in mind.
With respect to anticipated actions, however, it is very likely, because of the wide publicity surrounding Cambridge Analytica, that the FTC will take some sort of action against Facebook.
159
Christopher Wylie, who previously worked at Cambridge Analytica, was one of the designers
involved in using data from Facebook to create psychological profiles of voters both within the U.S.
and Britain. These profiles were then used to target political ads which are alleged to have
influenced both the U.S. presidential election and the Brexit vote. Although Cambridge Analytica has been the subject of investigations in both countries – Robert Mueller's in the U.S., and those of the
159
It is not possible, however, to anticipate the result of such an investigation as Facebook has already argued that it
complied with the 2011 consent decree. It does seem that Facebook stopped Cambridge Analytica's access once the breach was
discovered but did not notify the FTC. This will most likely be the primary issue.
Electoral Commission and the Information Commissioner's Office in the UK, both triggered in February 2017 by an Observer article – the extent of Facebook's knowledge and involvement is only now being questioned. The focus will be on whether Facebook violated the FTC 2011 consent
decree.
160
The DPAs in the EU are already calling for hearings with Zuckerberg and other industry
officials to discuss potential violations of their privacy and data security laws. According to The New
York Times, Cambridge Analytica collected data on
users’ identities, friend networks and “likes.” The idea was to map personality traits based on
what people had liked on Facebook, and then use that information to target audiences with
digital ads. Researchers in 2014 asked users to take a personality survey and download an
app, which scraped some private information from their profiles and those of their friends,
activity that Facebook permitted at the time and has since banned. The technique had been
developed at Cambridge University. . . Dr. Kogan [a professor at Cambridge] built his own
app and in June 2014 began harvesting data for Cambridge Analytica.
161
A hearing will be necessary because there is some dispute as to whether this was a data breach or whether the data was provided to Cambridge Analytica for academic research. The 95 Directive
does not include a provision on data breach notification. This will be a new requirement under the
GDPR. Many member states, however, will find this to be a violation of their data protection regulations because, regardless of how Cambridge Analytica came to possess this data from Facebook, Facebook did not obtain consent for this use. The UK is especially interested in this issue as it could cast doubt on the legitimacy of the Brexit vote, which was already unpopular.
162
160
When asked why Facebook didn't disclose the Cambridge Analytica issue to the external company that did the audit,
Facebook pointed us to an exchange between US Representative Bob Latta and Mark Zuckerberg during the House
hearing, wherein the Facebook chief responded: "[O]ur view is that this -- what a developer did -- that they represented
to us that they were going to use the data in a certain way, and then, in their own systems, went out and sold it -- we do
not believe is a violation of the consent decree." Facebook Deputy Chief Privacy Officer Rob Sherman also said in a
statement: "We remain strongly committed to protecting people's information. We appreciate the opportunity to answer
questions the FTC may have." Mariella Moon, FTC-mandated audit cleared Facebook's privacy policies in 2017,
ENGADGET (Apr. 20, 2018), https://www.engadget.com/2018/04/20/ftc-audit-cleared-facebook/.
161
Kevin Granville, Facebook and Cambridge Analytica: What You Need to Know as Fallout Widens, N.Y. TIMES (Mar.
19, 2018), https://www.nytimes.com/2018/03/19/technology/facebook-cambridge-analytica-explained.html.
162
Stephanie Bodoni, Zuckerberg Summoned by U.K. Parliament as Data Firm Faces Searches, BLOOMBERG NEWS
(Mar. 20, 2018 18:06 UTC+1), https://www.bloomberg.com/news/articles/2018-03-20/u-k-data-chief-seeks-to-search-
cambridge-analytica-over-breach.
The EU has a robust set of privacy and data security laws, which will be strengthened with
the applicability of the GDPR on May 25, 2018. While there have been hundreds of enforcement
actions taken against U.S. tech companies in recent years,
163
the low maximum fines permitted under
the laws created pursuant to the 95 Directive have not been substantial enough to force change in the
way these tech companies collect and utilize data. This will change with the extraterritorial
jurisdiction and enormous fines possible under the GDPR. The FTC is the main agency concerned
with privacy and data security in the U.S., but actions against U.S. tech companies have been few
and far between. The penalties imposed in the few actions taken are also not significant enough to
force change.
164
While EU actions are public, the FTC investigations and mandatory audits of
Facebook and Google are kept private.
165
Without an overarching federal privacy and data security
law, the U.S. relies on state action and the limited power of the FTC to protect consumers from
deceptive and unfair practices.
III. General Data Protection Regulation
In 2012, the European Commission formally initiated the updating of the 95 Directive, which had provided the basis for EU member states' national privacy laws. Although the 95 Directive was quite advanced when implemented in 1995, advances in technology and certain of its shortcomings led to the proposal of a new regulation. The proposal was designed to harmonize data protection laws
throughout Europe,
166
enhance data transfer rules outside of the EU, and provide greater control
over one’s personally identifiable information. After several years of discussions, the European
163
In addition to violations of privacy and data protection law, member states in the EU have brought actions against
U.S. tech companies for anti-trust, labor, national security, and tax law violations. See Jeff John Roberts, Why Google,
Facebook, and Amazon Should Worry About Europe, FORTUNE (Jul. 20, 2017), http://fortune.com/2017/07/20/google-
facebook-apple-europe-regulations/.
164
See Margot E. Kaminski, supra note 22.
165
See G. S. Hans, supra note 4, at 191.
166
The GDPR applies not only to the EU member states but also to the EFTA States of Iceland, Lichtenstein and
Norway. Bernd Schmidt, The Applicability of the GDPR within the EEA, TECHNICALLY LEGAL (Feb. 9, 2018),
https://planit.legal/blog/en/the-applicability-of-the-gdpr-within-the-eea/.
Parliament approved the GDPR on April 14, 2016.
167
The main changes that are of concern to
American companies are the extra-territorial application, significantly increased administrative
sanctions, additional rights provided to data subjects, data breach notification requirements,
limitations on profiling, and the introduction of compliance mechanisms (including massive record-
keeping requirements).
168
According to a PwC survey, 92% of American companies consider
compliance with the General Data Protection Regulation (GDPR) a top priority as it becomes
applicable on May 25, 2018.
169
A. New/expanded concepts
Much of the GDPR is based on the 95 Directive. While this will make the transition easier
for companies located in the EU, it will require a significant change in understanding regarding data
usage for U.S. companies. The GDPR immediately addresses four problems. First, it harmonizes the
laws across the EU. Second, it gives DPAs the tools they need to enforce privacy laws with the
increase in maximum fines that can be assessed. Third, it will allow DPAs to go after companies
located outside of the EU. Fourth, it expands the definition of personal data to cover advances in
what machines may be able to collect in the future.
In general, the GDPR is intended to apply to organizations outside of the EU when its
territorial and material scope conditions are met, explicitly covers a wider range of data, and
includes requirements for processors in addition to controllers. It sets out these specific requirements in 99 articles. The following sections discuss extraterritoriality, the expansion of to whom and to what the regulation applies, and the main data protection principles.
167
Shara Monteleone, Legislative Train Schedule, EUROPEAN PARLIAMENT (May 20, 2018),
http://www.europarl.europa.eu/legislative-train/theme-area-of-justice-and-fundamental-rights/file-general-data-
protection-regulation.
168
See id., at 222-230.
169
Press Release, PwC, GDPR Compliance Top Data Protection Priority for 92% of US Organizations in 2017 (Jan. 23,
2017), https://www.pwc.com/us/en/press-releases/2017/pwc-gdpr-compliance-press-release.html.
1. Regulation v. directive
While the 95 Directive was a directive, the GDPR is a regulation. Regulations have binding legal force throughout the EU and are directly applicable in every member state. A directive, by contrast, describes a result that every member state must achieve, but each member state is free to decide how to incorporate the directive's goal into its national laws.
170
The 95 Directive had a number of
weaknesses, which were addressed in the GDPR. A 2009 report by the RAND Corporation, sponsored by the UK's Information Commissioner's Office, indicated that the 95 Directive resulted in inconsistencies among the member states' laws, as each could determine on its own how the goals of the Directive were to be implemented.
171
In addition, enforcement actions were inconsistent and
could result in multiple jurisdictions bringing actions for the same or similar violation. The rules on
cross-border transfers were outdated because of advances in technology, such as cloud storage. The
definitions of controllers and processors were incomplete.
Because the Regulation applies directly in each member state without the need for national implementing legislation, it addresses the inconsistency problem of the 95 Directive, as well as providing the likelihood that a company will only have to deal with one DPA.
172
2. Increased penalties
One of the weaknesses of the 95 Directive was the low maximum fines permitted by Member
State laws. The GDPR corrects this by substantially increasing penalties for violations of the
regulation to up to 4% of annual global turnover or €20 million (whichever is greater).
173
According to its Form 10-K filed with the U.S. Securities and Exchange Commission, Facebook
170
Regulations, Directives and other acts, EUROPEAN UNION, https://europa.eu/european-union/eu-law/legal-acts_en.
171
NEIL ROBINSON ET AL., REVIEW OF THE EUROPEAN DATA PROTECTION DIRECTIVE (2009),
https://ico.org.uk/media/about-the-ico/documents/1042349/review-of-eu-dp-directive.pdf.
172
The concept of a "One-Stop-Shop" is to provide a single, uniform decision-making process in circumstances in which
multiple regulators have responsibility for regulating the same activity performed by the same organization in different
Member States. The WP29 has issued Guidelines on Lead DPAs (WP 244) which provide further clarity on how to
determine which DPA is the lead DPA for a given controller. Art. 29 Data Prot. Working Party, Guidelines for
identifying a controller or processor’s lead supervisory authority (Dec. 13, 2016) (WP 244),
http://ec.europa.eu/information_society/newsroom/image/document/2016-51/wp244_en_40857.pdf.
173
GDPR art. 83.
had revenue of $27.638 billion in 2016,
174
which, when a fine is calculated on that basis, could bring sanctions for its
future personal data violations under the GDPR of conceivably up to $1.105
billion. Google’s 2017 revenue was $109.65 billion,
175
which could potentially result in a fine of
$4.386 billion. The examples given in the summary of the GDPR on the EUGDPR website include
“not having sufficient customer consent to process data or violating the core of Privacy by Design
concepts.” Failure to keep adequate records, failure to notify the supervisory authority of a data
breach, and failure to conduct a privacy impact assessment are also subject to a substantial penalty of
up to 2% of annual global turnover or €10 Million (whichever is greater).
176
This would significantly
increase the motivation of tech companies to comply with the new law.
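The arithmetic behind these figures is straightforward. As a minimal illustration, the following Python sketch computes the two Article 83 fine ceilings from annual global turnover; it treats the dollar-denominated revenue figures and the euro-denominated flat caps as a single currency purely for simplicity of comparison.

```python
def gdpr_fine_ceiling(annual_global_turnover: float, severe: bool = True) -> float:
    """Maximum fine under GDPR art. 83: the greater of a flat cap
    (20M / 10M) or a percentage of annual global turnover (4% / 2%).
    Turnover and the flat cap are treated as one currency here,
    purely for illustration."""
    flat_cap = 20_000_000 if severe else 10_000_000
    pct = 0.04 if severe else 0.02
    return max(flat_cap, pct * annual_global_turnover)

# Facebook's 2016 revenue ($27.638B) and Google's 2017 revenue ($109.65B):
print(gdpr_fine_ceiling(27.638e9))   # 1105520000.0, i.e., ~$1.105 billion
print(gdpr_fine_ceiling(109.65e9))   # 4386000000.0, i.e., ~$4.386 billion
```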
3. Expanded definition of personal data
Another issue addressed is the updating of the definition of personal data to account for advances in
technology. The GDPR expands the definition of “personal data” by adding genetic identity and GPS
data, although it for the most part reiterates the 95 Directive definition:
any information relating to an identified or identifiable natural person (‘data subject’); an
identifiable natural person is one who can be identified, directly or indirectly, in particular by
reference to an identifier such as a name, an identification number, location data, an online
identifier or to one or more factors specific to the physical, physiological, genetic, mental,
economic, cultural or social identity of that natural person;
177
Recital (26) to the GDPR also clarifies that personal data that has been pseudonymized
remains personal data subject to the requirements of the GDPR.
178
Article 4(5) of the GDPR defines
174
Facebook, Inc., Annual Report Pursuant to Section 13 of 15(d) of the Securities Exchange Act 1934 (Form 10-K) for
the fiscal year ended Dec. 31, 2016, Comm’n File No.: 001-35551, at 31,
https://www.sec.gov/Archives/edgar/data/1326801/000132680117000007/fb-12312016x10k.htm.
175
Google's revenue worldwide from 2002 to 2017 (in billion U.S. dollars), STATISTA,
https://www.statista.com/statistics/266206/googles-annual-global-revenue/.
176
GDPR art. 83(5)-(6).
177
GDPR art. 4(1).
178
Although anonymized data is not. “Although similar, anonymization and pseudonymization are two distinct
techniques that permit data controllers and processors to use de-identified data. The difference between the two
techniques rests on whether the data can be re-identified. Recital 26 of the GDPR defines anonymized data as “data
rendered anonymous in such a way that the data subject is not or no longer identifiable.” Although circular, this
definition emphasizes that anonymized data must be stripped of any identifiable information, making it impossible to
derive insights on a discreet individual, even by the party that is responsible for the anonymization. When done properly,
pseudonymization as “the processing of personal data in such a way that the data can no longer be
attributed to a specific data subject without the use of additional information.”
179
It remains subject
to the GDPR because it can be re-identified with additional information, whereas, anonymized data
is not subject to the GDPR.
180
Pseudonymization has the advantage of permitting data processors to
use personal data with less risk to the rights of the users because the data cannot be tied to an
identifiable person without additional information. For this reason, the GDPR provides that
pseudonymization may be taken into account as one of a set of possible factors by controllers to
“ascertain whether processing for another purpose is compatible with the purpose for which the
personal data are initially collected.”
181
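To make the distinction concrete, the following Python sketch illustrates one common pseudonymization technique, keyed tokenization, under assumed names; it is offered only as an illustration of the Article 4(5) concept, not as a compliance recipe.

```python
import hmac, hashlib

# The secret key is the "additional information" of GDPR art. 4(5):
# anyone holding it can re-identify the data, so it must be kept
# separately under technical and organisational safeguards.
SECRET_KEY = b"keep-this-key-separate-from-the-dataset"  # hypothetical

def pseudonymize(identifier: str) -> str:
    """Replace a direct identifier with a stable keyed token (HMAC-SHA256).
    The same input always yields the same token, so records can still be
    linked for analysis without exposing the underlying identity."""
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()

record = {"email": "user@example.com", "purchases": 12}
record["email"] = pseudonymize(record["email"])
# The dataset alone no longer identifies the data subject, but it remains
# personal data under recital 26 because re-identification is possible
# with the key -- unlike anonymization, which must be irreversible.
```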
4. Extraterritoriality
One of the main concerns, or rather points of contention, that U.S. companies have with the
GDPR is its extra-territorial scope. While initially companies understood that they would be subject
to European law if they had a “data controller” in the EU,
182
they are not wholly on board with the
idea that the GDPR applies to them if they do not. From a practical perspective, the focus of the new
regulation’s territorial scope is mainly where the user is located, not where the processor is,
although either may lead to the application of EU law.
183
anonymization places the processing and storage of personal data outside the scope of the GDPR. The Article 29
Working Party has made it clear, though, that true data anonymization is an extremely high bar, and data controllers
often fall short of actually anonymizing data.” Matt Wes, Looking to comply with GDPR? Here's a primer on
anonymization and pseudonymization, IAPP (Apr. 25, 2017), https://iapp.org/news/a/looking-to-comply-with-gdpr-heres-
a-primer-on-anonymization-and-pseudonymization/.
179
“By rendering data pseudonymous, controllers can benefit from new, relaxed standards under the GDPR. For
instance, Article 6(4)(e) permits the processing of pseudonymized data for uses beyond the purpose for which the data
was originally collected. Additionally, the GDPR envisions the possibility that pseudonymization will take on an
important role in demonstrating compliance under the GDPR. Both Recital 78 and Article 25 list pseudonymization as a
method to show GDPR compliance with requirements such as Privacy by Design.” Id.
180
There are a number of scholars in the U.S. who have argued that re-identification of anonymized information is
possible. See, e.g., Boris Lubarsky, Re-Identification of “Anonymized” Data, 1 GEO. L. TECH. REV. 202 (2017),
https://www.georgetownlawtechreview.org/re-identification-of-anonymized-data/GLTR-04-2017/.
181
GDPR art. 6(4). GDPR recital 78 and art. 25 both identify pseudonymization as a possible way to show GDPR
compliance with requirements such as Privacy by Design.
182
The GDPR defines a “controller” as “the natural or legal person, public authority, agency or other body which, alone
or jointly with others, determines the purposes and means of the processing of personal data; ….” GDPR art. 4(7). A
“processor” may process personal data “on behalf of the controller.” GDPR art. 4(8).
183
The part of the GDPR relevant to the user’s location follows:
“2. This Regulation applies to the processing of personal data of data subjects who are in the Union by a controller
or processor not established in the Union where the processing activities are related to:
Previous law was ambiguous in its description of who outside of the EU was subject to the
provisions of the 95 Directive and the Member State laws.
184
The GDPR makes clear that anyone
processing the data of data subjects in the EU, regardless of whether or not they have an office in the EU,
is subject to the regulation.
185
The 95 Directive had a form of extraterritorial effect through the
limiting of cross-border personal data transfers from the European Union to countries found to have
an adequate level of data protection.
186
In addition, the 95 Directive applied to non-EU data controllers
if the controller either had an establishment on the territory of an EU member state where the data
processing was carried out, or where equipment on the territory of a member state was used for the
processing.
187
With respect to personal data, Article 4 of the 95 Directive provided that the applicable law
is the law of the member state in which the data controller has an establishment and where
the processing takes place.
188
If the non-EU data controller did not have such an establishment, it
could still be subject to EU law if it made use of equipment on the territory of a member state for its
data processing (other than for mere “transit” of the data).
189
In this case, it was required to designate
a representative established in the territory of such member state.
190
As an illustration, the Court of
Justice of the European Union found that it had jurisdiction under the 95 Directive in the now-
famous Google Spain “right to be forgotten” (or “right to delisting”) case,
191
with respect to the
California-headquartered (and Delaware-incorporated) corporation Google Inc. and its search
(a) the offering of goods or services, irrespective of whether a payment of the data subject is required, to such data
subjects in the Union; or
(b) the monitoring of their behaviour as far as their behaviour takes place within the Union.” GDPR art. 3(2).
184
See, e.g., Dan Jerker B. Svantesson, The Extraterritoriality of EU Data Privacy Law --- Its Theoretical Justification
and Its Practical Effect on U.S. Businesses, 50 STAN. J. INT’L L. 53 (2014) (proposing the definition of extraterritorial
jurisdiction as one which seeks to control or directly affect an object’s activities outside such state’s territory. Id., at 60).
185
GDPR art. 3(2).
186
95 Directive art. 25(1).
187
95 Directive art. 4(1)(a). Svantesson comments that this provision provides “considerable scope for
extraterritoriality,” especially given the possibility for EU member states to adopt broad ranges of views as to what
constitutes being “established.” See Dan Jerker B. Svantesson, supra note 185, at 66.
188
EU General Data Protection Regulation – Key Changes, DLA PIPER, https://www.dlapiper.com/en/uk/focus/eu-data-
protection-regulation/key-changes/ (last visited on May 23, 2018).
189
95 Directive art. 4(1)(c). It has been highlighted that this provision, which bases jurisdiction on the use of
“equipment” and not on nationality or residency “can lead to conflicts of law.” See CHRISTOPHER KUNER,
TRANSBORDER DATA FLOWS AND DATA PRIVACY LAW 124-125 (2013).
190
95 Directive art. 4(2).
191
Case C-131/12, supra note 109.
engine, as Google Inc. had Google Spain SL as an establishment in the European Union, the latter
of which raised funds through advertising used to finance the search engine.
192
The GDPR does, in fact, go much further. Not only does it apply to processing in the context
of activities of the establishment of a data controller or a processor in the European Union,
193
but it
also applies to the processing of personal data of data subjects in the European Union by controllers
or processors not established in the European Union, if such processing relates to the offering of
goods or services (whether free or for-payment) to such data subjects, or involves the monitoring of
their behavior, insofar as such behavior takes place in the European Union.
194
However, one
commentator claims that the requirement of appointing a representative in such circumstances,
pursuant to GDPR Article 27, might lead to stable arrangements, in which case the establishment
clause of Article 3(1) might be triggered so as to be “likely to even absorb the remaining field of
application of the two alternatives posed under Article 3(2).”
195
This extraterritorial effect may also
be extended by practice through what has been called the “Brussels effect”: “the GDPR is likely to
de facto influence the setting of global standards for online data protection significantly by virtue of
its territorial scope, as data controllers can be expect to adjust their compliance according to the
highest level of data protection required from them.”
196
It bears repeating that many legal systems, including that of the U.S., extend the reach of their laws
outside of their territorial boundaries through long-arm statutes and the concept of minimum contacts.
In fact, many U.S. states enforce their breach notification laws on companies headquartered in other
192
See W. Gregory Voss, The Right to Be Forgotten in the European Union: Enforcement in the Court of Justice and
Amendment to the Proposed General Data Protection Regulation, 18 J. INTERNET L. 3, 4 (2014).
193
GDPR art. 3(1).
194
GDPR art. 3(2).
195
Merlin Gömann, The New Territorial Scope of EU Data Protection Law: Deconstructing a Revolutionary
Achievement, 54 COMMON MKT. L. REV. 567, 575 (2017).
196
Id., at 568, citing Anu Bradford, The Brussels Effect, 107 NW. U. L. REV. 1 (2012).
states (or outside of the United States, for that matter), if the entity “conducts business” in the
state.
197
Extra-territorial application of U.S. antitrust law is well established.
198
There are a number of
U.S. statutes relating to antitrust that have extraterritorial jurisdiction. With respect to the Sherman
Act, the court in U.S. v. Alcoa found jurisdiction over acts that occurred outside of the U.S. but had
consequences within U.S. borders.
199
In January 2017, the FTC and DOJ updated the 1995 Antitrust
Guidelines for International Enforcement and Cooperation to clarify and broaden the scope of
enforcement of U.S. antitrust laws against foreign entities.
200
In addition, a number of decisions in
recent years have interpreted the Foreign Trade Antitrust Improvements Act as applying U.S.
antitrust law to anticompetitive conduct occurring outside of the U.S.
201
The update makes clear the
expanded scope of the extraterritorial application of U.S. antitrust law. Thus, it would be difficult to
argue that the EU cannot enforce laws for conduct occurring outside of its territorial boundaries.
202
197
Timothy M. Banks, The Long-Arm of Data Protection and Data Production Laws, IAPP (May 20, 2014),
https://iapp.org/news/a/the-long-arm-of-data-protection-and-data-production-laws/.
198
Richard W. Beckler & Matthew H. Kirtland, Extraterritorial Application of U.S. Antitrust Law: What Is a “Direct,
Substantial, and Reasonably Foreseeable Effect” Under the Foreign Trade Antitrust Improvements Act?, 38 TEX. INTL.
L. J. 11 (2003), http://www.tilj.org/content/journal/38/num1/Beckler-Kirtland11.pdf.
199
148 F.2d 416, 443 (2d Cir. 1945).
200
Antitrust Guidelines for International Enforcement and Cooperation, U.S. DEP’T JUST. & FED. TRADE COMM’N (Jan.
13, 2017), https://www.justice.gov/atr/internationalguidelines/download.
201
Jennifer B. Patterson & Terri A. Mazur, Recent Developments in the Extraterritorial Reach of the US Antitrust Laws,
ARNOLD & PORTER (Aug. 13, 2014),
https://www.arnoldporter.com/en/perspectives/publications/2014/08/20140813_recent_developments_in_extrater_10866
.
202
This extraterritorial effect might be criticized as “regulatory overreach” or as an “intrusion into State sovereignty,”
which are not new concerns, even in the data protection context. Merlin Gömann, supra note 196, at 568. Yet, not only
do various grounds for jurisdiction exist under international custom, but extraterritoriality effects are allowed in other
areas such as international economic law. See, e.g., Dan Jerker B. Svantesson, supra note 185, at 79-87. One scholar
states that, “Whereas the enactment of extraterritorial legislation was once viewed as the preserve of the United States
and as provoking the wrath of the EU; today—so the argument goes---extraterritoriality is a phenomenon that is both
tolerated by the EU and that is increasingly practiced in its name” (citations omitted). Examples of these include Anti-
trust or competition law (such as regulations of the United States and the European Union), securities laws (including the
Sarbanes-Oxley Act), and taxes, among others. See JOHN H. JACKSON, SOVEREIGNTY, THE WTO, AND CHANGING
FUNDAMENTALS OF INTERNATIONAL LAW 22 (2006). Nonetheless, Scott argues that “while the EU only very rarely
enacts extraterritorial legislation, it makes frequent recourse to a mechanism that may be labeled “territorial extension,”
allowing it to govern activities not on its territory. Joanne Scott, Extraterritoriality and Territorial Extension in EU
Law?, 62 AM. J. COMP. L. 87, 88 (2014). Furthermore, she indicates that there “are countless U.S. measures that give
rise to territorial extension.” Id., at 89. Scott defines “territorial extension” as “The application of a measure is triggered
by a territorial connection but in applying the measure that regulator is required, as a matter of law, to take into account
conduct or circumstances abroad.” Id., at 90. Moreover, according to the U.S. Department of Commerce’s Bureau of
Economic Analysis, the fines and penalties resulting from application of domestic law to foreign corporations in the
areas of antitrust (and EU competition law), anti-bribery legislation and other (primarily financial) legislation have
Despite this, it is likely that U.S. tech companies will seek to avoid extraterritorial
jurisdiction of the GDPR. In Facebook Belgium v. Belgian Privacy Commission
203
Facebook argued
that Facebook Ireland was the sole data controller with respect to Belgian data subjects, and that
only the Irish Privacy Commission, to the exclusion of the Belgian Privacy Commission, had
jurisdiction. The initial suit against Facebook argued that its non-user tracking mechanisms
violated EU and Belgian privacy laws. The Belgian Privacy Commission asked the Court to stop
Facebook from placing cookies on non-users’ browsers without “sufficient and adequate
information” about Facebook’s practices and how it uses the data, and to cease collecting data
through the datr cookie placed by its social plug-ins. Facebook argued that neither Belgian nor EU law applied in
this situation because it had met the establishment test allowing Ireland to have jurisdiction over data
processing issues. Although Facebook lost this argument in the lower court,
204
the Court of Appeal
of Brussels overruled (quashed) the lower court’s decision and held that the Belgian courts lacked
jurisdiction over Facebook because its European headquarters were located in Ireland.
205
Following an interlocutory procedure at the end of 2017, the case was then sent back to the
lower court, which found that the party having the financial-decision-making capacity for the
processing of personal data of data subjects in Belgium was the Chief Operating Decision Maker of
Facebook Inc., and thus Facebook Inc. was a co-controller, and that the Belgian lower court had
jurisdiction for the three Facebook entities with respect to Belgian data subjects in this case where
resulted in positive net U.S. unilateral transfers (after deduction for amounts paid, mainly to the European Commission
and member state competition authorities in competition law cases) of $ 25.635 billion from 1999-2013, leaving the
United States with seemingly little about which to grumble. Christopher L. Bach, Fines and Penalties in the U.S.
International Transactions Accounts, BEA BRIEFING 55, 57 (July 2013),
https://www.bea.gov/scb/pdf/2013/07%20July/0713_fines_penalties_international_accounts.pdf.
203
Dutch-speaking Court of First Instance, Brussel, Nov. 9, 2015, 15/57/C (Belg.),
https://www.privacycommission.be/sites/privacycommission/files/documents/Judgement%20Belgian%20Privacy%20Co
mmission%20v.%20Facebook%20-%2009-11-2015.pdf. For a discussion of this case and the other proceedings, see
Belgian Privacy Commission, Recommendation no. 03/2017 of 12 April 2017,
https://www.privacycommission.be/sites/privacycommission/files/documents/recommendation_03_2017_0.pdf.
204
Id. For a short discussion of the lower court decision, see Mila Owen, Belgian Court Demands that Facebook Stop
Tracking Non-Members, JOLT DIGEST (Dec. 10, 2015), http://jolt.law.harvard.edu/digest/belgian-court-demands-that-
facebook-stop-tracking-non-members.
205
Hof van beroep HvB Court of Appeal Brussel, 18th Chamber June 29, 2016, 2016/KR/2 (Belg.),
https://www.privacycommission.be/sites/privacycommission/files/documents/arrest%20Facebook.pdf.
Facebook was tracking the behavior in Belgium of the data subjects, and ruled in favor of the
Belgian Privacy Commission and against Facebook.
206
As seen in the Belgian case, the determination of who is or is not a data controller is an
arduous one. Under the GDPR, however, Articles 4(7) and 4(8) would clearly define Facebook as a
data controller because the “purposes and means” of the data processing are determined in the U.S.,
not Ireland. This seems to be specifically aimed at U.S. companies trying to make similar arguments
to Facebook’s. It is very likely that there will be lawsuits regarding this issue after the GDPR
becomes applicable.
Companies might also be tempted to set forth in their terms of use a choice of law provision
making U.S. law applicable to any dealings with a U.S. company’s website, hoping to avoid the
requirements of the GDPR. However, this was anticipated by the GDPR. Even if a company is not
established in the EU, it is expressly subject to the GDPR if it processes information regarding data
subjects in the EU either through the offering of goods or services to them
207
or by monitoring their
behavior (i.e., targeted marketing), to the extent that such behavior occurs in the EU.
208
5. Ongoing requirements/culture change
It should be noted that there is no checklist that a company can go through and then certify
that it has complied with the GDPR, because the requirements are ongoing.
Compliance with the GDPR will require a complete culture change for U.S. companies because the
rights afforded data subjects in the EU are not rights that American data subjects have, nor ones that U.S.
companies have been operating under. The shift in thinking will be from an ownership model to a
leasing model. Essentially, all employees of a business will need to change their outlook from “this
is the company’s data” to the idea that “this data belongs to the data subject and we are just leasing
206
Dutch-speaking Court of First Instance, Brussel, 24th Chamber, Feb. 16, 2018, 2016/153/A (Belg.), (French
translation)
https://www.privacycommission.be/sites/privacycommission/files/documents/jugement_facebook_16022018.pdf.
207
GDPR art. 3(2)(a).
208
GDPR, art. 3(2)(b) and recital 24.
it.” A company can only collect and process a user’s data to the extent explicit consent is given for
its activities, and such consent can be withdrawn at any time. An individual’s rights will generally
trump a company’s rights to an individual’s data. The following section will describe some of the
more important provisions of the GDPR.
B. Important provisions of the GDPR
1. Applicability to controllers and processors
Only controllers had direct legal responsibility under the 95 Directive. Under the GDPR,
both controllers and processors are responsible for compliance. Article 4 defines data controllers and
data processors as follows:
(7) ‘controller’ means the natural or legal person, public authority, agency or other body
which, alone or jointly with others, determines the purposes and means of the processing of
personal data; where the purposes and means of such processing are determined by Union or
Member State law, the controller or the specific criteria for its nomination may be provided
for by Union or Member State law;
(8) ‘processor’ means a natural or legal person, public authority, agency or other body which
processes personal data on behalf of the controller;
For example, if company A sells books to consumers and uses company B to track the orders and
obtain payment information from the consumers, company A is the controller and company B is the
processor.
209
“The GDPR treats the data controller as the principal party for responsibilities such as
collecting consent, managing consent-revoking, enabling right to access, etc. A data subject who
wishes to revoke consent for his or her personal data therefore will contact the data controller to
initiate the request, even if such data lives on servers belonging to the data processor. The data
controller, upon receiving this request, would then proceed to request the data processor remove the
revoked data from their servers.”
210
Although the controller is primarily responsible for compliance,
the processor can be liable under the GDPR for noncompliance.
211
Because of the extraterritorial
209
Data Controllers and Processors, https://www.gdpreu.org/the-regulation/key-concepts/data-controllers-and-
processors/.
210
Id.
211
The 95 Directive only provides for the liability of controllers. See 95 Directive art. 23(1).
jurisdiction of the GDPR, this change to the law does broaden which companies may be found liable
under the GDPR for failing to honor rights given to European users.
2. Right to be forgotten
As described in Section II.A.1(c) above, the right to be forgotten was established in the
Google Spain case. While this right was mentioned in the 95 Directive,
212
the GDPR expands this
right to be consistent with the ruling in Google Spain.
Article 17 reads:
1. The data subject shall have the right to obtain from the controller the erasure of
personal data concerning him or her without undue delay and the controller shall have the
obligation to erase personal data without undue delay where one of the following grounds
applies:
a) the personal data are no longer necessary in relation to the purposes for which
they were collected or otherwise processed;
b) the data subject withdraws consent on which the processing is based according to
point (a) of Article 6(1), or point (a) of Article 9(2), and where there is no other legal
ground for the processing;
c) the data subject objects to the processing pursuant to Article 21(1) and there are no
overriding legitimate grounds for the processing, or the data subject objects to the
processing pursuant to Article 21(2);
d) the personal data have been unlawfully processed;
e) the personal data have to be erased for compliance with a legal obligation in Union
or Member State law to which the controller is subject;
f) the personal data have been collected in relation to the offer of information society
services referred to in Article 8(1).
2. Where the controller has made the personal data public and is obliged pursuant to paragraph
1 to erase the personal data, the controller, taking account of available technology and the
cost of implementation, shall take reasonable steps, including technical measures, to inform
controllers which are processing the personal data that the data subject has requested the
erasure by such controllers of any links to, or copy or replication of, those personal data
(emphasis added).
Although this is not a new requirement, companies will need to have better procedures in place to
comply with European users’ right to be forgotten.
213
Companies will now need to inform data
subjects not only of their ability to correct information about themselves,
214
but also of their ability to have
212
95 Directive art. 12(b).
213
See Section II.A.1(c) herein regarding the Google Spain case.
214
GDPR arts. 5(1)(d) and 16.
information deleted.
215
If a data subject withdraws their consent, and that consent has served as the
legal basis for processing their data, their data must be deleted.
216
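In practice, honoring this right implies an erasure workflow that checks the Article 17(1) grounds and exceptions and then propagates the request downstream under Article 17(2). The following Python sketch outlines one such workflow; the storage and processor interfaces are hypothetical placeholders, not a prescribed design.

```python
from datetime import datetime, timezone

class ErasureHandler:
    """Hedged sketch of an art. 17 workflow; `store` and `processors`
    stand in for whatever database layer and downstream parties a real
    controller would have."""

    def __init__(self, store, processors):
        self.store = store            # e.g., a database wrapper (hypothetical)
        self.processors = processors  # parties processing on the controller's behalf

    def handle_request(self, subject_id: str, ground: str) -> dict:
        """Erase a data subject's records 'without undue delay' when an
        art. 17(1) ground applies, then tell downstream processors
        (art. 17(2)) to erase links to, and copies of, the data."""
        grounds = {"no_longer_necessary", "consent_withdrawn", "objection",
                   "unlawful_processing", "legal_obligation",
                   "child_information_society_services"}
        if ground not in grounds:
            raise ValueError("no art. 17(1) ground stated")
        # Exceptions come first, e.g., a statutory retention period (art. 17(3)).
        if self.store.retention_required(subject_id):
            return {"erased": False, "reason": "legal retention obligation"}
        self.store.delete_subject(subject_id)
        for p in self.processors:
            p.notify_erasure(subject_id)  # propagate the erasure request
        return {"erased": True, "at": datetime.now(timezone.utc).isoformat()}
```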
3. Right to data portability
The right for users to transfer their data to a new controller (e.g., from one budgeting app to another) is
new in the GDPR.
Article 20 reads:
1. The data subject shall have the right to receive the personal data concerning him or her,
which he or she has provided to a controller, in a structured, commonly used and machine-
readable format and have the right to transmit those data to another controller without
hindrance from the controller to which the personal data have been provided, where:
a) the processing is based on consent pursuant to point (a) of Article 6(1) or point (a)
of Article 9(2) or on a contract pursuant to point (b) of Article 6(1); and
b) the processing is carried out by automated means.
2. In exercising his or her right to data portability pursuant to paragraph 1, the data subject shall
have the right to have the personal data transmitted directly from one controller to another,
where technically feasible.
3. The exercise of the right referred to in paragraph 1 of this Article shall be without prejudice
to Article 17. That right shall not apply to processing necessary for the performance of a task
carried out in the public interest or in the exercise of official authority vested in the
controller.
4. The right referred to in paragraph 1 shall not adversely affect the rights and freedoms of
others (emphasis added).
This requirement will permit users to transfer their data or obtain a copy of their data in
machine-readable format.
217
In most cases, the controller will not be permitted to charge for this
service and will need to provide the information within one month of the request.
218
There is no
corresponding right under U.S. law. This new provision may actually help smaller companies take
advantage of data records created by their competitors. Although banks in some EU member states
were previously subject to a similar requirement, the right is new for most sectors; users who were
reluctant to switch to a new social media platform, for example, can now take their data with them,
as sketched below.
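The following Python sketch suggests what a minimal Article 20 export might look like; the JSON format, field names, and deadline handling are our illustrative assumptions, since the regulation requires only a “structured, commonly used and machine-readable format.”

```python
import json
from datetime import datetime, timedelta, timezone

def export_subject_data(subject_record: dict) -> str:
    """Return the data the subject 'has provided to a controller'
    (art. 20(1)) in a machine-readable format -- JSON here, though
    CSV or XML would equally qualify."""
    package = {
        "exported_at": datetime.now(timezone.utc).isoformat(),
        # Art. 20 covers provided data processed by automated means on
        # the basis of consent or contract; purely inferred data is
        # generally understood to fall outside the right.
        "provided_data": subject_record,
    }
    return json.dumps(package, indent=2)

# Per the WP 242 guidance cited above: respond within one month of the
# request, and in most cases free of charge.
deadline = datetime.now(timezone.utc) + timedelta(days=30)
```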
215
GDPR arts. 16 and 17.
216
There are a number of exceptions to this requirement to delete, such as when the information is required to be held
for a certain number of years by law (e.g., banks must hold data for 7 years). GDPR art. 17.
217
GDPR art. 20.
218
See Art. 29 Data Protect. Working Party, Guidelines on the right to “data portability” (Apr. 5, 2017) (WP 242 rev.01),
http://ec.europa.eu/newsroom/article29/item-detail.cfm?item_id=611233.
4. Lawful Basis for Processing
While the requirements for lawful basis echo those in the 95 Directive, the GDPR makes it
more difficult for organizations to process personal data for a new purpose due to its description of
“compatible” purposes.
219
The initial purposes are as follows:
Article 6(1) reads:
Processing shall be lawful only if and to the extent that at least one of the following applies:
1. the data subject has given consent to the processing of his or her personal data for
one or more specific purposes;
2. processing is necessary for the performance of a contract to which the data subject is
party or in order to take steps at the request of the data subject prior to entering into a
contract;
3. processing is necessary for compliance with a legal obligation to which the controller
is subject;
4. processing is necessary in order to protect the vital interests of the data subject or of
another natural person;
5. processing is necessary for the performance of a task carried out in the public
interest or in the exercise of official authority vested in the controller;
6. processing is necessary for the purposes of the legitimate interests pursued by the
controller or by a third party, except where such interests are overridden by the interests
or fundamental rights and freedoms of the data subject which require protection of
personal data, in particular where the data subject is a child (emphasis added).
If specific consent is not given for the processing, the company’s use of the data must fall
within one of the above categories (2 – 6) and organizations will most likely need to explain how the
individual’s privacy rights are outweighed by such use. This is not a requirement under U.S. law and
for that reason may present a stumbling block for U.S. corporations wishing to process EU-generated
information the same way they process U.S.-generated information.
5. Data Protection Officer
As mentioned in III.A. herein, regardless of whether a data protection officer (DPO) is
required for an organization, appointing one will assist in compliance. The DPO would ensure
219
GDPR art.6(4) reads: “Where personal data are to be processed for a new purpose, the controller must consider
whether the new purpose is "compatible" with the original purpose taking into account the following factors:
a. any link between the original purpose and the new purpose;
b. the context in which the data have been collected, including the controller's relationship with the data subjects;
c. the nature of the personal data, in particular, whether Sensitive Personal Data are affected;
d. the possible consequences of the new purpose of processing for data subjects; and
e. the existence of appropriate safeguards (e.g., encryption or pseudonymisation).”
proper consent and privacy by design, conduct privacy impact assessments, respond to user requests, and
serve as the point of contact with local DPAs.
Article 37 reads:
1. The controller and the processor shall designate a data protection officer in any case where:
a) the processing is carried out by a public authority or body, except for courts acting
in their judicial capacity;
b) the core activities of the controller or the processor consist of processing operations
which, by virtue of their nature, their scope and/or their purposes, require regular
and systematic monitoring of data subjects on a large scale; or
c) the core activities of the controller or the processor consist of processing on a large
scale of special categories of data pursuant to Article 9 or personal data relating to
criminal convictions and offences referred to in Article 10.
2. A group of undertakings may appoint a single data protection officer provided that a data
protection officer is easily accessible from each establishment.
3. Where the controller or the processor is a public authority or body, a single data protection
officer may be designated for several such authorities or bodies, taking account of their
organisational structure and size.
4. In cases other than those referred to in paragraph 1, the controller or processor or associations
and other bodies representing categories of controllers or processors may or, where required
by Union or Member State law shall, designate a data protection officer. The data protection
officer may act for such associations and other bodies representing controllers or processors.
5. The data protection officer shall be designated on the basis of professional qualities and, in
particular, expert knowledge of data protection law and practices and the ability to fulfil the
tasks referred to in Article 39.
6. The data protection officer may be a staff member of the controller or processor, or fulfil the
tasks on the basis of a service contract.
7. The controller or the processor shall publish the contact details of the data protection officer
and communicate them to the supervisory authority (emphasis added).
In order to comply with the GDPR, best practices for companies collecting and processing
data in Europe will most likely include appointing a DPO.
220
Although a DPO is only required for
public bodies or when a company’s “core processing activities require regular and systematic
monitoring of individuals on a large scale, or where its core activities consist of the processing of
sensitive data on a large scale,”
221
most tech companies do engage in the large scale monitoring of
220
In addition, Article 27 requires companies without offices in the EU that monitor or process the personal data of users
within the EU to appoint an EU-based representative to be the contact person for the local DPA. “The purpose of this is
simple: It ensures that EU citizens will be able to contact the controllers and processors outside of Europe that hold their
personal data, without having the potentially confusing, difficult and costly efforts required to contact them at their
base.” Tim Bell, Is Article 27 the GDPR’s ‘hidden obligation’?, IAPP (May 3, 2018), https://iapp.org/news/a/is-article-
27-the-gdprs-hidden-obligation/.
221
GDPR art. 37.
individuals. The DPO will need to ensure that European users’ data is sufficiently protected and that
the data controller is in compliance with the GDPR. The DPO will need to be an expert in data
protection law, thus, it is unlikely that current IT professionals will meet this definition. In addition,
the DPO must be independent, so an IT professional or marketing professional within the
corporation would most likely have a conflict of interest. The DPO is responsible for reporting data
breaches to EU authorities within 72 hours of detection of the breach.
222
In addition, if the company
processes data from EU residents in connection with the offer of products or services to them or the
monitoring of their behavior in the EU, and it is not established in the EU, it will also be required to
have a representative located in the EU.
223
This requirement was previously optional in the EU under
most member states' laws, except where it was required of processors in Germany.
224
In addition to
advising the company on all things GDPR, the DPO will be responsible for the massive record-keeping
requirements.
6. Affirmative Consent
One of the bases for lawful processing is consent.
Article 7 reads:
1. Where processing is based on consent, the controller shall be able to demonstrate that
the data subject has consented to processing of his or her personal data.
2. If the data subject’s consent is given in the context of a written declaration which also
concerns other matters, the request for consent shall be presented in a manner which is
clearly distinguishable from the other matters, in an intelligible and easily accessible
form, using clear and plain language. Any part of such a declaration which constitutes
an infringement of this Regulation shall not be binding.
3. The data subject shall have the right to withdraw his or her consent at any time. The
withdrawal of consent shall not affect the lawfulness of processing based on consent
before its withdrawal. Prior to giving consent, the data subject shall be informed
thereof. It shall be as easy to withdraw as to give consent.
4. When assessing whether consent is freely given, utmost account shall be taken of
whether, inter alia, the performance of a contract, including the provision of a service, is
conditional on consent to the processing of personal data that is not necessary for the
performance of that contract (emphasis added).
222
GDPR art. 33. This requirement applies “unless the personal data breach is unlikely to result in a risk to the rights
and freedoms of natural persons.” GDPR art. 33(1).
223
GDPR art. 27.
224
EU General Data Protection Regulation – Key Changes, supra note 189.
The GDPR expands the consent requirements by requiring proof of such consent. The 95 Directive
just required the user to signify agreement.
225
Consent is defined as: “any freely given, specific,
informed and unambiguous indication of the data subject's wishes by which he or she, by a statement
or by a clear affirmative action, signifies agreement to the processing of personal data.” The GDPR
uses this initial language from the 95 Directive, but then adds “Consent must be given by a statement
or a clear affirmative action.”
226
As mentioned above, one “lawful basis” for the processing of data is
consent. In the event a U.S. company cannot provide a contractual basis
227
or legitimate interest
228
for the processing of personal data, it will need to provide proof of consent. The GDPR sets forth
the age of consent at 16.
229
Companies will need to obtain and document
230
the affirmative consent
in plain language
231
of their users to the collection, processing and storing of their personal data and
provide an easy mechanism for the users to withdraw their consent.
232
The GDPR expressly provides that silence, inactivity and pre-checked boxes will not meet
this requirement of affirmative consent.
233
Also, different from U.S. law, the individual retains the
right to revoke the consent at any time.
234
An important issue that this raises is what is required of
U.S. companies who have already collected and processed information of EU data subjects.
235
Will
verifiable consent need to be obtained for data maintained after May 25, 2018? It seems unlikely that
U.S. companies will have adequate records to obtain this consent given that they previously relied on
225
95 Directive art. 2(h), 7(a).
226
GDPR art. 4(11).
227
GDPR art. 6(1)(b).
228
GDPR art. 6(1)(f).
229
Member states are able to set a lower age (not lower than 13 years), but those under the set age will need to comply
with additional requirements similar to COPPA, such as parental consent. See GDPR art. 8(1).
230
GDPR art. 7(1).
231
GDPR art. 7(2) and recital 42.
232
GDPR art. 7(3).
233
GDPR recital 32.
234
GDPR art. 7(3).
235
See, e.g., Todd Ehret, U.S. firms are still unprepared for looming EU data privacy rules, REUTERS (Feb. 13, 2018),
https://www.reuters.com/article/bc-finreg-data-privacy-rules/u-s-firms-are-still-unprepared-for-looming-eu-data-privacy-
rules-idUSKCN1FX2D2 and Eye on Discovery – Five Steps to Take Now to Prepare for the General Data Protection
Regulation, CONSILIO, http://www.consilio.com/resource/eye-discovery-five-steps-take-now-prepare-general-data-
protection-regulation/.
individuals’ opting out of the collection of their information, but it is just as likely that DPAs will require
this. In essence, if the consent obtained prior to the GDPR’s application was “GDPR-compliant”
then new consent would not be necessary. However, if not GDPR-compliant, then new consent that
complies with the GDPR would need to be obtained.
Another significant aspect of the consent requirements is that the data subjects must be
informed how the data will be used at the time of collection.
236
This raises issues regarding the
ability to share and sell the information, as secondary uses may not be known at the time of
collection. It seems that the GDPR’s drafters were very aware of this practice and sought to stop it. Data subjects
must also be informed that they have the right to file a complaint with the company’s DPO.
237
The
notice asking for consent will need to be very detailed in its disclosures.
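Because Article 7(1) obliges the controller to be able to demonstrate consent, compliance in practice implies keeping an auditable consent record. The following Python sketch shows one hypothetical shape such a record might take, with withdrawal made as easy as the grant per Article 7(3); the field names are our own illustrative assumptions.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class ConsentRecord:
    """Evidence that a specific data subject consented to a specific,
    disclosed purpose -- silence, inactivity, and pre-checked boxes do
    not count (recital 32)."""
    subject_id: str
    purpose: str                      # disclosed at collection time
    notice_version: str               # the plain-language notice shown
    given_at: datetime
    withdrawn_at: Optional[datetime] = None

    def withdraw(self) -> None:
        # Withdrawal must be as easy as giving consent (art. 7(3));
        # processing that relied on this consent must stop going forward.
        self.withdrawn_at = datetime.now(timezone.utc)

    @property
    def active(self) -> bool:
        return self.withdrawn_at is None
```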
7. Data Protection by design
The idea of data protection by design and default is that privacy needs to be considered prior
to the time the data is collected and processed in the first place. This is a completely new
requirement. Art. 25 reads:
1. Taking into account the state of the art, the cost of implementation and the nature, scope,
context and purposes of processing as well as the risks of varying likelihood and severity
for rights and freedoms of natural persons posed by the processing, the controller shall,
both at the time of the determination of the means for processing and at the time of the
processing itself, implement appropriate technical and organisational measures, such as
pseudonymisation, which are designed to implement data-protection principles, such as data
minimisation, in an effective manner and to integrate the necessary safeguards into the
processing in order to meet the requirements of this Regulation and protect the rights of data
subjects (emphasis added).
236
GDPR recital 32.
237
Miriam Beezy & Stephanie Lucas, Compliance with the EU’s General Data Protection Regulation and U.S.
Discovery Law, INTA (Nov. 1, 2017), https://www.inta.org/Advocacy/Documents/2017/Article%20-
%20Compliance%20with%20the%20EU_S%20General%20Data%20Protection%20Regulation%20and%20US%20Disc
overy%20Law.pdf.
An example of designing data processing with data protection in mind would be
pseudonymisation.
238
This will present significant challenges for those using machine learning
algorithms that infer details based on patterns discovered in massive amounts of data. As it is not
always possible to know the criteria that the machine has “learned” and is now incorporating, it is
difficult to see how it can be disclosed.
239
Although the idea behind privacy by design is to
incorporate privacy in the developmental stages of data processing, Privacy Impact Assessments can
further guide companies in making changes when reviewing current systems. It is also likely
that reports will need to be maintained on how privacy by design was implemented by the company,
to demonstrate compliance with the GDPR.
240
8. Impact Assessments
The 95 Directive did not require privacy or data protection assessments to be carried out
prior to processing, nor is there any similar requirement under U.S. law.
241
This new requirement in
the GDPR is intended to help organizations identify potential issues with their processing of user
data. Article 35(1-3) reads:
1. Where a type of processing in particular using new technologies, and taking into account the
nature, scope, context and purposes of the processing, is likely to result in a high risk to the
rights and freedoms of natural persons, the controller shall, prior to the processing, carry
out an assessment of the impact of the envisaged processing operations on the protection of
personal data. A single assessment may address a set of similar processing operations that
present similar high risks.
2. The controller shall seek the advice of the data protection officer, where designated, when
carrying out a data protection impact assessment.
3. A data protection impact assessment referred to in paragraph 1 shall in particular be
required in the case of:
a) a systematic and extensive evaluation of personal aspects relating to natural persons
which is based on automated processing, including profiling, and on which
decisions are based that produce legal effects concerning the natural person or
similarly significantly affect the natural person;
238
GDPR art. 25.
239
See VIKTOR MAYER-SCHÖNBERGER & KENNETH CUKIER, supra note 76.
240
For an excellent read on the ability to regulate privacy by design, see Ira Rubinstein, Regulating Privacy by Design,
26 BERKELEY TECH. L. J. 1409 (2011),
https://scholarship.law.berkeley.edu/cgi/viewcontent.cgi?referer=https://duckduckgo.com/&httpsredir=1&article=1917&
context=btlj.
241
Although privacy impact and data protection assessments are recommended by the FTC.
b) processing on a large scale of special categories of data referred to in Article 9(1), or
of personal data relating to criminal convictions and offences referred to in Article
10; or
c) a systematic monitoring of a publicly accessible area on a large scale (emphasis
added).
The idea is that appropriate safeguards can be instituted when deficiencies are discovered.
Rather than relying on individuals to evaluate the risks in sharing their data, some of the burden is
placed on the controller.
242
Specifically, Article 35(7) requires that the PIA
“shall contain at least:
1. a systematic description of the envisaged processing operations and the purposes of the
processing, including, where applicable, the legitimate interest pursued by the controller;
2. an assessment of the necessity and proportionality of the processing operations in relation
to the purposes;
3. an assessment of the risks to the rights and freedoms of data subjects referred to in
paragraph 1; and
4. the measures envisaged to address the risks, including safeguards, security measures and
mechanisms to ensure the protection of personal data and to demonstrate compliance with
this Regulation taking into account the rights and legitimate interests of data subjects and
other persons concerned (emphasis added).
These PIAs must be documented and in accordance with any further requirements established
by the DPA.
243
One of the issues with respect to this requirement is the inherent inability to identify
specific risks when machine learning is involved, as mentioned above. Because the machine may be making
decisions based on factors which are not revealed outside of the “black box,” it is not possible to
anticipate an exact risk (although certainly the potential for discrimination in general should be
anticipated when engaged in any type of profiling).
244
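To make the screening step concrete, the Article 35(3) triggers can be reduced to a simple pre-processing check, as in the following minimal Python sketch; the parameter names are our own shorthand for the three statutory triggers, and such a screen is of course no substitute for the full assessment Article 35(7) describes.

```python
def dpia_required(automated_profiling_with_legal_effects: bool,
                  large_scale_special_categories: bool,
                  large_scale_public_monitoring: bool) -> bool:
    """Screen a proposed processing operation against the three
    art. 35(3) triggers; any one of them mandates a data protection
    impact assessment before processing begins."""
    return (automated_profiling_with_legal_effects
            or large_scale_special_categories
            or large_scale_public_monitoring)

# e.g., ad targeting built on large-scale behavioral profiling:
assert dpia_required(True, False, False)
```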
242
See Claudia Quelle, The Data Protection Impact Assessment, or: How the General Data Protection Regulation May
Still Come to Foster Ethically Responsible Data Processing (Nov. 25, 2015), https://ssrn.com/abstract=2695398. The
author makes the case that the data protection impact assessment bakes in a privacy analysis by requiring the review to
include risks to the rights and freedoms of individuals as opposed to just the risk of a data breach.
243
GDPR art. 35(7).
244
One of the reasons the GDPR may be requiring explanations is because of the potential for discrimination. For a
discussion on how machine learning can result in discriminatory decisions generally, see Danielle Keats Citron, & Frank
Pasquale, The Scored Society: Due Process for Automated Predictions, 89 WASH. L. REV. 1, 5, 14-15 (2014),
http://digital.law.washington.edu/dspace-law/bitstream/handle/1773.1/1318/89WLR0001.pdf?sequence=1; Lior J.
Strahilevitz, Toward a Positive Theory of Privacy Law, 126 HARV. L. REV. 2010, 2021-2022, 2027-2028 (2013);
Nissenbaum (n 15) 208-210; Neil M. Richards & Jonathan H. King, Three Paradoxes of Big Data, 66 STAN. L. REV.
9. Profiling
One provision of the GDPR that is very different from U.S. law is the requirement under
Article 22 that European users have the right to know how their personal information is being
processed when an automated decision is made about them. These automated decisions are known as
profiling. Companies must not only be able to explain how the decisions are being made, but also
must provide a mechanism to have such activities stopped.
245
Article 22 reads:
1. The data subject shall have the right not to be subject to a decision based solely on
automated processing, including profiling, which produces legal effects concerning him or
her or similarly significantly affects him or her.
2. Paragraph 1 shall not apply if the decision:
a. is necessary for entering into, or performance of, a contract between the data
subject and a data controller;
b. is authorised by Union or Member State law to which the controller is subject and
which also lays down suitable measures to safeguard the data subject’s rights and
freedoms and legitimate interests; or
c. is based on the data subject’s explicit consent.
3. In the cases referred to in points (a) and (c) of paragraph 2, the data controller shall
implement suitable measures to safeguard the data subject’s rights and freedoms and
legitimate interests, at least the right to obtain human intervention on the part of the
controller, to express his or her point of view and to contest the decision.
4. Decisions referred to in paragraph 2 shall not be based on special categories of personal data
referred to in Article 9(1), unless point (a) or (g) of Article 9(2) applies and suitable
measures to safeguard the data subject’s rights and freedoms and legitimate interests are in
place (emphasis added).
ONLINE 41, 44 n.98 (2013); VIKTOR MAYER-SCHÖNBERGER & KENNETH CUKIER at 244 n.1; Art. 29 Data Prot.
Working Party, Opinion 03/2013; The White House, Executive Office of the President, Big Data: Seizing
Opportunities, Preserving Values 7, 45-47, 51-53, 59-60, 64-65; FTC, Data Brokers: A Call for Transparency and
Accountability 55-56. For a discussion on how machine learning used by public bodies can result in
discrimination, see Kimberly Houser & Debra Sanders, The Use of Big Data Analytics by the IRS: Efficient Solution or
the End of Privacy as We Know it?, 19 VAND. J. ENT. & TECH. L. 817 (2017); Melissa De Zwart et al., Surveillance, Big
Data and Democracy: Lessons for Australia from the US and UK, 37 U. NEW S. WALES L.J. 713, 718 (2014); FED.
TRADE COMM’N, BIG DATA: A TOOL FOR INCLUSION OR EXCLUSION?: UNDERSTANDING THE ISSUES 28 (2016),
https://www.ftc.gov/system/files/documents/reports/big-data-tool-inclusion-or-exclusion-understanding-issues/160106big-data-rpt.pdf [https://perma.cc/K8YS-WAJ8]; Bennett Capers, Rethinking the
Fourth Amendment: Race, Citizenship, and the Equality Principle, 46 HARV. C.R.-C.L. L. REV. 1, 17, 17 n.120 (2011);
Solon Barocas & Andrew Selbst, Big Data’s Disparate Impact, 104 CAL. L. REV. 671, 688, 720, 726 (2016),
http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2477899 [https://perma.cc/CK5E-PLUH]; Christopher W. Clifton,
Deirdre K. Mulligan & Raghu Ramakrishnan, Data Mining and Privacy: An Overview, in PRIVACY AND TECHNOLOGIES
OF IDENTITY 191, 203 (Katherine J. Strandburg & Daniela Stan Raicu eds., 2006) (explaining that, without access to the
underlying data and logic of the “No Fly” program, an individual’s ability to challenge inclusion on the list is
impaired). See VIKTOR MAYER-SCHÖNBERGER & KENNETH CUKIER, supra note 76; see also Fred H. Cate & Viktor
Mayer-Schönberger, Notice and Consent in a World of Big Data, 3 INT’L DATA PRIVACY L. 67, 67–73 (2013).
245
GDPR art. 22.
This provision expands upon that in the 95 Directive.
246
According to recital 71 of the GDPR, profiling:
consists of any form of automated processing of personal data evaluating the personal aspects
relating to a natural person, in particular to analyse or predict aspects concerning the data
subject’s performance at work, economic situation, health, personal preferences or interests,
reliability or behaviour, location or movements, where it produces legal effects concerning
him or her or similarly significantly affects him or her.
247
According to Veale and Edwards, the right to not be subject to an automated decision has rarely been
invoked under the 95 Directive.
248
However, they point out how the “right to an explanation” may be
problematic in terms of compliance.
249
This is especially difficult as scholars have noted that many
times algorithms are programmed to learn over time.
250
What this means is that the purpose for
which the pattern recognition algorithm is set up changes as the algorithm incorporates massive
amounts of data.
251
If the company is unable to see “inside the black box,” it will not be able to
explain exactly on what criteria a decision was made.
252
As Article 22(4) limits automated decision-
making and profiling based on special categories (such as race and religion), this may severely
restrict the use of machine learning algorithms.
253
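One operational reading of Article 22(3) is that any fully automated decision pipeline must include a path to human intervention. The following Python sketch illustrates that pattern; the model interface and score threshold are hypothetical assumptions, and the sketch does not purport to resolve the “right to an explanation” difficulties discussed above.

```python
def decide_application(features: dict, model, subject_requested_review: bool):
    """Automated decision with an art. 22(3)-style escape hatch: the
    subject can obtain human intervention, express a point of view,
    and contest the decision."""
    score = model.predict(features)   # hypothetical model interface
    decision = "approve" if score > 0.5 else "decline"
    if subject_requested_review or decision == "decline":
        # Route adverse or contested outcomes to a human reviewer rather
        # than letting them rest solely on automated processing.
        return {"decision": "pending_human_review", "model_score": score}
    return {"decision": decision, "model_score": score}
```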
10. Security Requirements
Because of the recent clarification of the definition of personal data, both by the ECJ and in
the GDPR,
254
companies will now need to provide the same level of protection for IP addresses and
GPS information as they do for names and social security numbers. The security requirements are
246
95 Directive art. 15.
247
GDPR recital 71.
248
Michael Veale & Lilian Edwards, Clarity, Surprises, and Further Questions in the Article 29 Working Party Draft
Guidance on Automated Decision-Making and Profiling, 34 COMPUTER L. & SECURITY REV. 398–404 (2018),
https://ssrn.com/abstract=3071679.
249
Id., at 399.
250
Id.
251
Melissa De Zwart et al., supra note 243, at 718.
252
There is always a concern that an algorithm, while not initially set up to use factors such as race or religion, may
result in targeting certain groups based on the associations created as the algorithm learns. See BIG DATA: A TOOL FOR
INCLUSION OR EXCLUSION?, supra note 243, at 28.
253
For a discussion of potential harms under U.S. law, see James C. Cooper, Separation Anxiety, 21 Va. J. Law & Tech.
1 (2017), http://dx.doi.org/10.2139/ssrn.2655794.
254
GDPR art. 4(1).
expanded in that both the processor and the controller must assure the security of the data. While the 95
Directive left it to the controller to determine appropriate security measures, the GDPR is more
prescriptive in its approach. Unlike U.S. law, which defines required security as “reasonable
measures,” the GDPR provides a description of potential measures that can be taken to protect data.
Article 32 reads:
1. Taking into account the state of the art, the costs of implementation and the nature,
scope, context and purposes of processing as well as the risk of varying likelihood and
severity for the rights and freedoms of natural persons, the controller and the
processor shall implement appropriate technical and organisational measures to ensure a
level of security appropriate to the risk, including inter alia as appropriate:
a) the pseudonymisation and encryption of personal data;
b) the ability to ensure the ongoing confidentiality, integrity, availability and
resilience of processing systems and services;
c) the ability to restore the availability and access to personal data in a timely
manner in the event of a physical or technical incident;
d) a process for regularly testing, assessing and evaluating the effectiveness of
technical and organisational measures for ensuring the security of the processing.
2. In assessing the appropriate level of security account shall be taken in particular of the
risks that are presented by processing, in particular from accidental or unlawful
destruction, loss, alteration, unauthorised disclosure of, or access to personal data
transmitted, stored or otherwise processed.
3. Adherence to an approved code of conduct as referred to in Article 40 or an approved
certification mechanism as referred to in Article 42 may be used as an element by which
to demonstrate compliance with the requirements set out in paragraph 1 of this Article.
4. The controller and processor shall take steps to ensure that any natural person acting
under the authority of the controller or the processor who has access to personal data does
not process them except on instructions from the controller, unless he or she is required to
do so by Union or Member State law (emphasis added).
In addition to calling out the possibility of encryption and/or pseudonymisation of personal
data, it seems likely that such security measures will also include “a combination of firewalls, log
recording, data loss prevention, malware detection and similar applications.”
255
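As a concrete illustration of two of these measures, the following is a minimal sketch of our own, not a prescribed implementation: it pseudonymises an IP address with a keyed hash and encrypts a record at rest. It assumes the third-party Python package cryptography; the key handling shown is deliberately simplified and hypothetical.

```python
# Minimal sketch (ours, not a prescribed implementation) of two Article 32
# measures: pseudonymising an IP address with a keyed hash, and encrypting a
# record at rest. Assumes the third-party "cryptography" package; the key
# handling shown is deliberately simplified and hypothetical.
import hashlib
import hmac
from cryptography.fernet import Fernet

HASH_KEY = b"store-this-key-separately-from-the-data"

def pseudonymise_ip(ip: str) -> str:
    # A keyed hash maps the same IP to the same token, preserving analytics,
    # while the token cannot be reversed without the separately held key.
    return hmac.new(HASH_KEY, ip.encode(), hashlib.sha256).hexdigest()

enc_key = Fernet.generate_key()  # encryption key, also stored separately
box = Fernet(enc_key)

record = b"name=Jane Doe;ip=203.0.113.7"
ciphertext = box.encrypt(record)          # what is written to disk
print(pseudonymise_ip("203.0.113.7"))     # pseudonymised identifier
print(box.decrypt(ciphertext))            # recoverable only with the key
```

Pseudonymisation of this kind also matters because, as noted above, IP addresses are themselves personal data under the GDPR.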
Article 25 of the
GDPR requires companies to implement data protection principles such as data minimization and to
ensure that only personal data necessary for each specific purpose is processed. Although no
approved certification mechanism appears to be available at this time, Article 25 permits certification as evidence of
255
EU General Data Protection Regulation – Key Changes, supra note 189.
compliance with Article 25.
256
Furthermore, security standards set forth by relevant EU member
state agencies and by the European Union Agency for Network and Information Security (ENISA)
should be met.
257
ENISA is the cyber security agency for the EU.
258
In addition, several European member states have recently updated their security
requirements. In October 2016, France enacted the Digital Republic Bill, which increased fines for
failing to secure data, expanded data breach notification requirements, and allowed for increased
investigations into companies’ handling of data breaches.
259
In addition, in Europe collective legal
proceedings, similar to U.S. class action suits, can be brought against companies suspected of
failing to secure their data. In Germany, recently enacted legislation permits the awarding of
attorneys’ fees in such actions.
260
Because of this expanded potential liability, companies would be
well-served to conduct frequent, documented audits of their security practices. It is also important that
data controllers ensure that their data processors are compliant as well.
11. Data breach notification requirements
Data breach notification requirements are not a new concept for American companies, but they
are new under the GDPR; the 95 Directive did not require that DPAs be notified of a
breach. Article 33 reads:
1. In the case of a personal data breach, the controller shall without undue delay and, where
feasible, not later than 72 hours after having become aware of it, notify the personal data
breach to the supervisory authority competent in accordance with Article 55, unless the
personal data breach is unlikely to result in a risk to the rights and freedoms of natural
persons. Where the notification to the supervisory authority is not made within 72 hours, it
shall be accompanied by reasons for the delay.
2. The processor shall notify the controller without undue delay after becoming aware of a
personal data breach.
3. The notification referred to in paragraph 1 shall at least:
256
An approved certification mechanism pursuant to Article 42 may be used as an element to demonstrate compliance
with the requirements set out in paragraphs 1 and 2 of that Article. GDPR art. 25.
257
ENISA, PRINCIPLES AND OPPORTUNITIES FOR A RENEWED EU CYBERSECURITY STRATEGY (2017),
https://www.enisa.europa.eu/publications/enisa-position-papers-and-opinions/enisa-input-to-the-css-review-b.
258
About ENISA, ENISA, https://www.enisa.europa.eu/about-enisa (last visited on May 23, 2018).
259
The Digital Republic Bill – Overview, https://www.republique-numerique.fr/pages/in-english.
260
Dechert LLP, Data Protection: Germany Introduced “Class Action”-like Claims; Rest of EU to Follow,
LEXOLOGY (Apr. 14, 2016), https://www.lexology.com/library/detail.aspx?g=adc8d068-102b-4421-b94e-
afab534deec3.
a. describe the nature of the personal data breach including where possible, the
categories and approximate number of data subjects concerned and the categories and
approximate number of personal data records concerned;
b. communicate the name and contact details of the data protection officer or other
contact point where more information can be obtained;
c. describe the likely consequences of the personal data breach;
d. describe the measures taken or proposed to be taken by the controller to address the
personal data breach, including, where appropriate, measures to mitigate its possible
adverse effects.
4. Where, and in so far as, it is not possible to provide the information at the same time, the
information may be provided in phases without undue further delay.
5. The controller shall document any personal data breaches, comprising the facts relating to the
personal data breach, its effects and the remedial action taken. That documentation shall
enable the supervisory authority to verify compliance with this Article (emphasis added).
As indicated earlier, many U.S. states have data breach notification statutes, and companies have
witnessed the severe ramifications for peers that failed to encrypt and secure PII.
261
Although
Germany has had a data breach notification law for a number of years, the GDPR will require all
member states to mandate notification of data security breaches under certain conditions.
262
It is important
to note that almost any breach will support a conclusion that it resulted from the company’s failure to
properly secure its data.
263
Article 33 of the GDPR requires the company encountering a breach to notify the relevant
supervisory authority of the breach not later than 72 hours after discovery unless the breach is
unlikely to “result in a risk to the rights and freedoms of natural persons.” Article 34 similarly
requires notification to the natural persons who are the affected parties as described above when it is
likely to result in a high risk to their rights and freedoms, unless the data was encrypted or the
261
Target’s profit plummeted 46% after its data breach fiasco. See Maggie McGrath, Target Profit Falls 46% On
Credit Card Breach And The Hits Could Keep On Coming, FORBES (Feb. 26, 2014 09:21am),
https://www.forbes.com/sites/maggiemcgrath/2014/02/26/target-profit-falls-46-on-credit-card-breach-and-says-the-hits-
could-keep-on-coming/#3cc1b67f7326.
262
GDPR arts. 33 and 34.
263
Benjamin Wright, Preparing for compliance with the General Data Protection Regulations (GDPR) A Technology
Guide for Security Practitioners, SANS INSTITUTE (2017), https://www.sans.org/reading-
room/whitepapers/analyst/preparing-compliance-general-data-protection-regulation-gdpr-technology-guide-security-
practitioners-37667.
company has taken measures to ensure that the data subjects’ rights are not impacted. In addition,
processors must notify controllers of any breaches.
264
Despite the fact that some U.S. states have very short windows in which to notify users of a
data breach, many U.S. companies take their time determining the extent of a breach and its
ramifications before sending out notifications to users. This practice will not be sufficient under the
GDPR. The 72-hour requirement will be especially problematic for companies wishing to keep a
breach quiet until remediation can be accomplished and the investigation potentially leads to the culprit. Making the
breach public will most likely alert the perpetrator, who can then go silent.
265
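To make the timing concrete, here is a minimal sketch of our own devising, not an official tool, showing how a controller might track the Article 33(1) clock from the moment it becomes aware of a breach; the function name is an assumption.

```python
# Minimal sketch (ours, not an official tool): tracking the Article 33(1)
# 72-hour window from the moment the controller becomes aware of a breach.
from datetime import datetime, timedelta, timezone

NOTIFICATION_WINDOW = timedelta(hours=72)

def notification_deadline(aware_at: datetime) -> datetime:
    """Latest time to notify the supervisory authority, unless the breach is
    unlikely to result in a risk to the rights and freedoms of natural persons."""
    return aware_at + NOTIFICATION_WINDOW

aware = datetime(2018, 5, 26, 9, 0, tzinfo=timezone.utc)
print(notification_deadline(aware))  # 2018-05-29 09:00:00+00:00
```

Notification after this deadline remains possible under Article 33(1), but it must be accompanied by reasons for the delay.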
C. Steps for compliance with the GDPR
There are a number of steps companies will need to
take in order to comply with the GDPR. This section is not meant to serve as a complete explanation of
all 99 articles of the GDPR, but rather as initial guidance to companies seeking to address
compliance issues in advance of an investigation. Although not every company is required to have
one, the appointment of a DPO will go a long way toward ensuring that nothing is overlooked.
266
As
discussed in Section III.B.5 above, a DPO will be responsible for the record-keeping requirements,
which are significant. As a first step, companies must review the data currently maintained,
consolidate all user data records (at least by location), determine whether there is a lawful basis to
keep the data, and confirm that proof of consent, where required, is documented.
There are a number of lawful bases on which companies may collect data, such as a
contractual obligation, but in many cases companies will need to obtain consent from European
264
For a thorough discussion of data breach notification laws in the U.S., EU and Australia, see Angela Daly, The
Introduction of Data Breach Notification Legislation in Australia: A Comparative View, 34 COMPUTER L. & SECURITY
REV. 477 (2018).
265
Some U.S. state data breach laws permit companies to withhold notification at the request of law enforcement
investigating the breach. Most states do not put any timeline on when a notification must occur, but there are several
which do require the notification to be provide within 30-90 days. Hayley Tsukayama, Why it can take so long for
companies to reveal their data breaches, WASH. POST, (Sept. 8, 2017), https://www.washingtonpost.com/news/the-
switch/wp/2017/09/08/why-it-can-take-so-long-for-companies-to-reveal-their-data-breaches/?utm_term=.d27d88036fe2.
266
In addition, if a company does not have an office in the EU but offers goods or services in the EU or monitors behavior
of persons in the EU, it may need to have a representative with a corporate office in the EU (similar to registered agent
requirements for corporations). GDPR art. 27.
users prior to collecting, processing, or storing their personal data and be able to provide
documentation of such consent. Companies will also need to inform users of the purpose of such
activities and of their right to withdraw consent. In addition, the Regulation codifies the right to be
forgotten/right to erasure and the right to data portability. The Regulation will also delineate the
responsibilities of data controllers and data processors. Processors must provide “sufficient
guarantees to implement appropriate technical and organizational measures” to comply with the
GDPR. This includes using appropriate measures to secure the data from possible breach.
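One way to picture the consent documentation described above is as a structured record of which uses a user has agreed to. The sketch below is ours, not the Regulation's; the field names are assumptions for illustration only.

```python
# Minimal sketch (ours, not the Regulation's): a consent record documenting
# the user, each separate purpose consented to, and any withdrawal.
# All field names are assumptions for illustration only.
from dataclasses import dataclass
from datetime import datetime
from typing import List, Optional

@dataclass
class ConsentRecord:
    user_id: str
    purposes: List[str]              # one entry per separate use of the data
    given_at: datetime
    policy_version: str              # which privacy notice the user saw
    withdrawn_at: Optional[datetime] = None

    def is_valid_for(self, purpose: str) -> bool:
        # Consent must cover the specific purpose and not have been withdrawn.
        return self.withdrawn_at is None and purpose in self.purposes

rec = ConsentRecord("u-123", ["newsletter", "analytics"],
                    datetime(2018, 5, 25), "v2.0")
print(rec.is_valid_for("analytics"))     # True
print(rec.is_valid_for("ad-targeting"))  # False: a new consent is needed
```

Keeping the policy version and timestamp alongside each purpose is one way a company could later prove what the user actually agreed to and when.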
Going forward, privacy policies will need to be updated to fully disclose not only who the
DPO and controller are, but also why the information is being collected and how users can exercise
their rights. Privacy policies must be clear, concise and complete. Mechanisms should be developed
that make it as easy for users to exercise their rights as it is to consent to the collection of their data.
Finally, training should be conducted for all employees. This is not merely an IT issue, as the failure of
anyone in the organization to honor the rights of data subjects or to comply with the requirements of
the GDPR can result in significant fines. In addition to the difficulty of complying with the profiling
requirements, the ability to document compliance will be a major issue, as will responding to data
requests from users exercising their rights. For everything discussed in Section III.B, organizations
will need to demonstrate how each requirement was accomplished (proof of impact assessments;
privacy by design – meaning that before introducing a new service or product a written
report must be prepared on how privacy was considered in the design; proof of lawful basis; and consent records, including
consent for each separate use of the data). Individual data records will need to be easily accessible
and available in machine-readable format, as illustrated in the sketch below.
267
267
As an aside, if an organization purchases marketing lists, it will now be required to obtain the consent record along
with the list. It will no longer be sufficient to rely on the representation of the data broker. Finally, where once it was
possible to add a prospect met at a trade show simply by virtue of a discussion and exchange of business cards, verifiable
consent will need to be obtained prior to adding their personal data to a filing system (database).
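The following sketch, again ours rather than a mandated format, shows a user's records exported as JSON, one common structured, machine-readable form that could serve both access and portability requests; the stored data shown is hypothetical.

```python
# Minimal sketch (ours, not a mandated format): exporting one user's records
# as JSON, a structured and machine-readable form suitable for access and
# portability requests. The stored data shown is hypothetical.
import json

user_records = {
    "user_id": "u-123",
    "profile": {"name": "Jane Doe", "email": "jane@example.com"},
    "consents": [{"purpose": "newsletter", "given_at": "2018-05-25T00:00:00Z"}],
    "activity": [{"ts": "2018-05-20T10:00:00Z", "event": "login"}],
}

print(json.dumps(user_records, indent=2))
```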
D. Cross-border Data Transfers
One issue of particular relevance to the U.S. is the ability to transfer data from within the EU
back to the U.S. Similar to the 95 Directive, the GDPR requires that before data can be transferred
outside of the EU, the target country must provide adequate assurances of data protection.
268
Because the U.S. cannot provide such assurances due to its lack of similar privacy and data security
laws, companies will need either to sign on to the Privacy Shield or to use one of the other previously
accepted methods of assuring adequate protection, such as model contract clauses or binding
corporate rules.
269
1. Privacy Shield
In 2016, the EU-U.S. Privacy Shield Framework was announced as a replacement to the Safe
Harbor that U.S. companies had previously operated under.
270
The Privacy Shield program, which is
administered by the International Trade Administration (ITA) within the U.S. Department of
Commerce, enables U.S.-based organizations to self-certify to the Privacy Shield Framework and
thereby benefit from its adequacy determination, allowing the transfer of personal data
from the European Union to the United States. Companies become “certified” as agreeing to comply
with seven primary data security principles, generally categorized as:
1. notice;
2. choice;
3. accountability for onward transfer;
4. security;
5. data integrity and purpose limitation;
6. access; and
268
GDPR art. 44.
269
GDPR arts. 44–50 detail under what conditions data may be transferred outside of the EU. David Lat, 6 Insights
about GDPR Compliance, ABOVE THE LAW (Apr. 4, 2018 12:15 PM),
https://abovethelaw.com/2018/04/6-insights-about-gdpr-compliance/.
270
As discussed in Section II.A.2.(a) above, the ECJ declared the Safe Harbor invalid in the Schrems case.
7. recourse, enforcement and liability.
271
One of the main differences between the Safe Harbor and the Privacy Shield is that
companies are required to strengthen their privacy policies to include more thorough notice
requirements and better controls on further transfers of data. To become certified as complying with
the Privacy Shield, a company must first confirm eligibility, conduct a privacy audit, designate a
privacy contact person in the company, post a privacy policy adopting the provisions of the Privacy
Shield Principles, certify with the U.S. Department of Commerce that it has agreed to the Principles,
and pay the certification fee of $250 to $3,250.
272
Although U.S. companies are not required to join, once they commit, the provisions become
enforceable under U.S. law. All of these principles are designed to ensure adequate protection for
personal information transfers from the European Union to the United States. The U.S. Department
of Commerce, the FTC, and the DOT are given enforcement authority.
273
As of December 2017,
2,613 companies were certified under the Privacy Shield.
274
Although it passed its first annual review
in September 2017, it is likely that modifications will be necessary after the GDPR goes into effect
271
I. OVERVIEW, PRIVACY SHIELD FRAMEWORK, https://www.privacyshield.gov/article?id=OVERVIEW (last visited
on May 25, 2018).
272
How to Join Privacy Shield (part 1), PRIVACY SHIELD FRAMEWORK, https://www.privacyshield.gov/article?id=How-
to-Join-Privacy-Shield-part-1 (last visited on May 25, 2018).
273
Enforcement of Privacy Shield, PRIVACY SHIELD FRAMEWORK,
https://www.privacyshield.gov/article?id=Enforcement-of-Privacy-Shield (last visited on May 25, 2018).
274
Privacy Shield List, PRIVACY SHIELD FRAMEWORK, https://www.privacyshield.gov/list (last visited on May 25, 2018).
on May 25, 2018.
275
There are still concerns that once the GDPR becomes applicable, the Privacy
Shield will need to be reevaluated.
276
The European Commission permits alternatives to the Privacy Shield (which are also available
for destination countries other than the United States with inadequate privacy protections): the first
major one is model contract clauses, the second is binding corporate rules (BCRs), and the third is
explicit consent agreements. We will take them in that order, noting that the first two –
standard contractual clauses and BCRs – are conditioned on the availability of enforceable data subject rights and
effective remedies for data subjects.
277
2. Model Contract Clauses
The European Commission has approved sets of standard contractual clauses that may be
used between a company in the European Union exporting data, and another receiving company in a
third country that does not have an adequate level of protection. The idea is that the contract clauses
will legally bind the latter company to respect the essence of EU data protection law and to provide
data subjects with rights similar to those that they have under EU law, thereby allowing the transfer of
data to the third country.
278
In addition to referencing standard data protection clauses adopted by
the Commission, the GDPR also refers to other adequate safeguards in the form of contract clauses.
275
Some of the recommendations after the first review include:
• More proactive and regular monitoring of companies' compliance with their Privacy Shield obligations by the
U.S. Department of Commerce. The U.S. Department of Commerce should also conduct regular searches for companies
making false claims about their participation in the Privacy Shield.
• More awareness-raising for EU individuals about how to exercise their rights under the Privacy Shield, notably
on how to lodge complaints.
• Closer cooperation between privacy enforcers, i.e., the U.S. Department of Commerce, the Federal Trade
Commission, and the EU Data Protection Authorities (DPAs), notably to develop guidance for companies and enforcers.
• Enshrining the protection for non-Americans offered by Presidential Policy Directive 28 (PPD-28), as part of
the ongoing debate in the U.S. on the reauthorisation and reform of Section 702 of the Foreign Intelligence Surveillance
Act (FISA).
• Appointing as soon as possible a permanent Privacy Shield Ombudsperson, as well as ensuring the empty posts are filled
on the Privacy and Civil Liberties Oversight Board (PCLOB).
Press Release, Eur. Comm’n, EU-U.S. Privacy Shield: First review shows it works but implementation can be improved
(Oct. 18 2017), http://europa.eu/rapid/press-release_IP-17-3966_en.htm.
276
MARTIN A. WEISS & KRISTIN ARCHICK, supra note 37, at 12.
277
GDPR art. 46(1).
278
The European Commission derived its ability to develop standard contractual clauses from 95 Directive art. 26(2).
Similar provisions exist in the GDPR. See GDPR art. 46(2)(c).
These include “contractual clauses between the controller or processor and the controller, processor
or the recipient of the personal data in the third country…,” subject to an authorization from the
competent EU member state supervisory authority.
279
The European Commission has approved the following standard contractual clauses (or
“model contracts”): EU controller to non-EU controller (2001);
280
EU controller to non-EU
controller (2004);
281
EU controller to non-EU processor (2010).
282
It should be noted that, in light of the invalidation of the Safe Harbor in the Schrems decision
discussed above, and the recent challenge of standard contractual clauses in court in Ireland,
283
there may be changes to the standard contractual clauses in the future.
284
3. Binding Corporate Rules (BCRs)
Article 47 of the GDPR expressly permits the use of binding corporate rules for the transfer
of personal data to third countries or international organizations. Binding corporate rules (BCRs) are
“internal data protection and privacy rules set out by multinational companies to facilitate transfers
of personal data,” which establish the procedure for handling processing that involves
intra-company international transfers, in a kind of self-certifying mechanism. They may be approved
279
GDPR art. 46(3)(a).
280
Commission Decision of 15 June 2001 on standard contractual clauses for the transfer of personal data to third
countries, under Directive 95/46/EC (2001/497/EC), 2001 O.J. (L 181) 19 (July 4, 2001), http://eur-lex.europa.eu/legal-
content/EN/TXT/PDF/?uri=CELEX:32001D0497&from=en.
281
Commission Decision of 27 December 2004 amending Decision 2001/497/EC as regards the introduction of an
alternative set of standard contractual clauses for the transfer of personal data to third countries (2004/915/EC), 2004
O.J. (L 385) 74 (Dec. 29, 2004), http://eur-lex.europa.eu/legal-
content/EN/TXT/PDF/?uri=CELEX:32004D0915&from=EN.
282
Commission Decision of 5 February 2010 on standard contractual clauses for the transfer of personal data to
processors established in third countries under Directive 95/46/EC of the European Parliament and of the Council
(2010/87/EU), 2010 O.J. (L 39) 5 (Feb. 12, 2010), http://eur-lex.europa.eu/legal-
content/EN/TXT/PDF/?uri=CELEX:32010D0087&from=EN.
283
The High Court (Commercial) of Ireland in the case of Data Protection Commissioner v. Facebook and Schrems,
2016 No. 4809 P., Oct. 3, 2017, https://dataprotection.ie/documents/judgements/DPCvFBSchrems.pdf., referred the
question of the validity of standard contractual clauses to the ECJ for decision. For a discussion of this see Cedric
Burton et al. (Wilson Sonsini Goodrich & Rosati), European Court of Justice to Rule on Validity of Standard
Contractual Clauses, JD SUPRA (Oct. 4, 2017), https://www.jdsupra.com/legalnews/european-court-of-justice-to-rule-on-
79381/.
284
See, e.g., LOTHAR DETERMANN, DETERMANN’S FIELD GUIDE TO DATA PRIVACY LAW INTERNATIONAL CORPORATE
COMPLIANCE 38 (3d ed. 2017) (advising that companies should be prepared for changes and agree with their contracting
partners to modify contracts when this becomes necessary).
by an individual EU member state DPA or may use a fast-track co-operation procedure for common
opinions.
285
This form of instrument is provided for in the GDPR, as well, in order to allow cross-
border transfers to non-adequate protection countries, such as the U.S.
286
IV. Conclusion
The business model currently used by U.S. tech companies is to provide free access to
services in exchange for a user’s data.
287
This data can include information entered into a website or
platform, searches, browsing history, likes and dislikes, as well as purchases. These companies are
then able to monetize this data by selling it (or access to it) to advertisers.
288
Most companies have
not been able to successfully charge users to use their platforms; thus, given the value of the data they
can provide to advertisers, it seems unlikely that U.S. tech companies will voluntarily change this
model. With respect to their users in the EU, rather than comply with the GDPR these
companies may choose to move to an “agree or quit” model
289
or disallow those located in the EU
from using their platform after May 25, 2018.
290
It was only with the Cambridge Analytica debacle
that users in the U.S. began to understand that they were not merely trading data for access, but
rather trading their privacy and security for services.
While many U.S. tech companies have informed the public that they will comply with
the GDPR,
291
the “data for service” model is unlikely to survive the review of the ECJ if actions are
brought against these companies for violations. It is unlikely that this model would fall within
285
Binding corporate rules, EUR. COMM’N, https://ec.europa.eu/info/law/law-topic/data-protection/data-transfers-
outside-eu/binding-corporate-rules_en (last visited on May 24, 2018).
286
GDPR arts. 46(2)(b) & 47.
287
Sheryl Sandberg, COO of Facebook, has indicated that without targeting advertising, Facebook would need to charge
its users. Alex Johnson & Erik Ortiz, Without data-targeted ads, Facebook would look like a pay service, Sandberg says,
NBC NEWS (Apr. 6, 2018 3:19 PM ET), https://www.nbcnews.com/tech/social-media/users-would-have-pay-opt-out-all-
facebook-ads-sheryl-n863151.
288
G. S. Hans, supra note 4.
289
Id.
290
Hannah Kuchler, US small businesses drop EU customers over new data rule, FINANCIAL TIMES (May 23, 2018),
https://www.ft.com/content/3f079b6c-5ec8-11e8-9334-2218e7146b04.
291
One guidebook put out by the UK DPA describes steps that companies can take now. U.K. INFO. COMMISSIONER’S
OFFICE, PREPARING FOR THE GENERAL DATA PROTECTION REGULATION (GDPR): 12 STEPS TO TAKE NOW (Mar. 14,
2016), https://ico.org.uk/media/1624219/preparing-for-the-gdpr-12-steps.pdf.
“contractual obligation” or “legitimate interests” categories, and thus consent would be required for each and every use,
each and every future use, and each sharing of the data. Users would be able to refuse any of
these uses and sharings, which would cripple the tech companies’ ability to monetize the data. The
maximum fines that can be imposed under the GDPR are significant. There are two levels of
potential fines for noncompliance by companies. Serious breaches can result in fines of up to 4% of
annual global turnover or €20 million (whichever is greater). This could include lack of sufficient
customer consent
292
to process data or violating the core of Privacy by Design concepts. Fines of up
to 2% of global turnover can be assessed, for example, for failing to maintain proper records
(Article 30) or failing to provide data breach notifications in a timely manner.
293
These fines are
applicable both to controllers and processors.
294
On this side of the Atlantic, many have noted that privacy laws in the U.S. need an
overhaul.
295
The main statute regulating privacy is over 30 years old. The Electronic
Communications Privacy Act was written years before the widespread use of the internet and before
social media. Even the U.S. Department of Commerce has indicated that the lack of trust in internet
292
As of the date of this article’s submission, complaints had already been filed in the EU against both Facebook and Google.
Shobhit Seth, Google, Facebook Face $8.8B GDPR Suits on Day One, INVESTOPEDIA (May 29, 2018),
https://www.investopedia.com/news/google-facebook-face-88b-gdpr-suits-day-one/.
293
In assessing fines, DPAs are instructed to consider the following factors in determining the size of a fine: the number
of data subjects involved, the purpose of the processing, the damage suffered by data subjects, and the duration of the
breach. GDPR art. 83. GDPR recital 75 reads: “[W]here the processing may give rise to discrimination, identity theft or
fraud, financial loss, damage to the reputation, loss of confidentiality of personal data protected by professional secrecy,
unauthorised reversal of pseudonymisation, or any other significant economic or social disadvantage; where data
subjects might be deprived of their rights and freedoms or prevented from exercising control over their personal data;
where personal data are processed which reveal racial or ethnic origin, political opinions, religion or philosophical
beliefs, trade union membership, and the processing of genetic data, data concerning health or data concerning sex life or
criminal convictions and offences or related security measures; where personal aspects are evaluated, in particular
analysing or predicting aspects concerning performance at work, economic situation, health, personal preferences or
interests, reliability or behaviour, location or movements, in order to create or use personal profiles; where personal data
of vulnerable natural persons, in particular of children, are processed; or where processing involves a large amount of
personal data and affects a large number of data subjects.”
294
GDPR art. 79.
295
Christina Delgado, Will Congress finally update a data privacy law that’s 31-years old?, WASH. EXAM’R (Sept. 13,
2017 01:01 PM), http://www.washingtonexaminer.com/will-congress-finally-update-a-data-privacy-law-thats-31-years-
old/article/2634276.
privacy in the U.S. is hampering economic activity.
296
Daniel Solove, a privacy expert, repudiates
the myth that the U.S. government is a leader in creating privacy and data protection laws, pointing
out that most of the federal laws were passed between 1970 and 1999.
297
While it is possible that the
Cambridge Analytica fiasco will be the impetus the U.S. government needs to update its data
protection and privacy laws, reluctance remains to move from the self-governance model to one of
strict controls over data use along the lines of the GDPR.
The main reason for the differences in the laws and enforcement actions of the U.S. and EU
with respect to U.S. tech companies is that the EU considers privacy to be an inalienable right. The
U.S. Constitution does not even mention privacy.
298
The Google Spain case also elucidates the
conflict between U.S. and EU ideology, with freedom of speech and the public’s right to know on
the one hand, and the European rights to privacy and to be forgotten on the other.
299
While these
conflicts are easily resolved in the EU, where privacy is paramount, they are not so easily resolved in the U.S.
While the implementation of the GDPR will present some issues for European companies
to adapt to, they have already been operating under the strict privacy laws in effect since the 95
Directive and are in a much better position to comply. The interpretation of the GDPR will provide a
test ground for this new business model and how it impacts the ability of companies to use and
monetize consumer data. Although the GDPR represents a new paradigm for U.S. tech companies in
296
Rafi Goldberg, Lack of Trust in Internet Privacy and Security May Deter Economic and Other Online Activities,
NTIA (May 13, 2016), https://www.ntia.doc.gov/blog/2016/lack-trust-internet-privacy-and-security-may-deter-
economic-and-other-online-activities.
297
Daniel Solove, The U.S. Congress Is Not the Leader in Privacy or Data Security Law, TEACH PRIVACY (Apr. 9,
2017), https://www.teachprivacy.com/us-congress-is-not-leader-privacy-security-law/.
298
In the United States, the right to privacy was later detailed in Brandeis and Warren’s seminal Harvard Law Review
article The Right to Privacy. James H. Barron, Warren and Brandeis, The Right to Privacy, 4 HARV. L. REV. 193 (1890):
Demystifying a Landmark Citation, 13 SUFFOLK U. L. REV. 875 (1979). See also Ben Bratman, Brandeis & Warren's
'The Right to Privacy and the Birth of the Right to Privacy', 69 TENN. L. REV. 623 (2002),
http://ssrn.com/abstract=1334296.
299
As Jeffrey Rosen explains: “In Europe, the intellectual roots of the right to be forgotten can be found in French law,
which recognizes le droit à l’oubli—or the “right of oblivion”—a right that allows a convicted criminal who has served
his time and been rehabilitated to object to the publication of the facts of his conviction and incarceration. In America, by
contrast, publication of someone’s criminal history is protected by the First Amendment, leading Wikipedia to resist the
efforts by two Germans convicted of murdering a famous actor to remove their criminal history from the actor’s
Wikipedia page.” Jeffrey Rosen, The Right to Be Forgotten, 64 STAN. L. REV. ONLINE 88 (2012),
https://review.law.stanford.edu/wp-content/uploads/sites/3/2012/02/64-SLRO-88.pdf (citations omitted).
terms of handling data, if it is successful, lessons learned could be adopted in the U.S. either
voluntarily or by legal requirement.
Although Mark Zuckerberg recently stated that changes to Facebook will provide additional
protections to users worldwide, Facebook has also moved its data storage from the EU back to the
U.S.
300
indicating that it may be preparing to challenge the applicability of the GDPR to its business
practices. Given the European perception that these U.S. tech companies have had an unfair
advantage due to lax American privacy laws and the shock of discovering the U.S. government’s
secret monitoring of data flowing out of the EU, it is likely that DPAs will immediately target these companies
for failing to comply with the GDPR. There is an overriding sense of unfairness
surrounding the ability of U.S. tech companies to monetize the data of those located in the EU, where
local tech companies cannot because of the prohibitive restrictions of European data protection and
privacy laws. Although the GDPR may not be the end of Facebook and Google, their business
models and practices will have to be modified to take the new European legislation into account. It
may, however, be fair to say that the new restrictions and fines will mean the end of Facebook and
Google as they currently operate, at least in the EU.
300
Alex Hern, Facebook moves 1.5bn users out of reach of new European privacy law, THE GUARDIAN (Apr. 19, 2018)
https://www.theguardian.com/technology/2018/apr/19/facebook-moves-15bn-users-out-of-reach-of-new-european-
privacy-law.