How do App Vendors Respond to Subject Access Requests?
A Longitudinal Privacy Study on iOS and Android Apps
Jacob Leon Kröger
kroeger@tu-berlin.de
TU Berlin, Weizenbaum Institute
Berlin, Germany
Jens Lindemann
lindemann@informatik.uni-hamburg.de
University of Hamburg
Hamburg, Germany
Dominik Herrmann
dh.psi@uni-bamberg.de
University of Bamberg
Bamberg, Germany
ABSTRACT
EU data protection laws grant consumers the right to access the personal data that companies hold about them. In a first-of-its-kind longitudinal study, we examine how service providers have complied with subject access requests over four years. In three iterations between 2015 and 2019, we sent subject access requests to vendors of 225 mobile apps popular in Germany. Throughout the iterations, 19 to 26 % of the vendors were unreachable or did not reply at all. Our subject access requests were fulfilled in 15 to 53 % of the cases, with an unexpected decline between the GDPR enforcement date and the end of our study. The remaining responses exhibit a long list of shortcomings, including severe violations of information security and data protection principles. Some responses even contained deceptive and misleading statements (7 to 13 %). Further, 9 % of the apps were discontinued and 27 % of the user accounts vanished during our study, mostly without proper notification about the consequences for our personal data. While we observe improvements for selected aspects over time, the results indicate that subject access request handling will be unsatisfactory as long as vendors accept such requests via email and process them manually.
CCS CONCEPTS
• Security and privacy → Social aspects of security and privacy; Privacy protections; • Social and professional topics → Privacy policies.
KEYWORDS
GDPR, compliance, subject access request, smartphone, mobile app,
privacy
ACM Reference Format:
Jacob Leon Kröger, Jens Lindemann, and Dominik Herrmann. 2020. How do
App Vendors Respond to Subject Access Requests? A Longitudinal Privacy
Study on iOS and Android Apps. In The 15th International Conference on
Availability, Reliability and Security (ARES 2020), August 25–28, 2020, Virtual
Event, Ireland. ACM, New York, NY, USA, 10 pages. https://doi.org/10.1145/3407023.3407057
Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the owner/author(s).
ARES 2020, August 25–28, 2020, Virtual Event, Ireland
© 2020 Copyright held by the owner/author(s).
ACM ISBN 978-1-4503-8833-7/20/08.
https://doi.org/10.1145/3407023.3407057
1 INTRODUCTION
Many mobile apps collect personal data about their users and share it with third parties, such as analytics services and ad networks [3]. Given the increasing number of apps and the vast amount of data collected – often including data with no apparent relevance to the advertised functionality [30] – it is challenging for smartphone users to keep track of the data that app vendors hold about them.
Like the preceding Directive 95/46/EC, the EU General Data Protection Regulation (GDPR [28]), which came into effect on 25 May 2018, affords individuals various rights over their personal data, including the rights to access, rectification, and erasure (Art. 15–17 GDPR). These rights allow smartphone users to demand transparency and regain control over the personal data collected by mobile apps. Numerous empirical investigations have revealed, however, that data subjects face various impediments in exercising their rights, with many data controllers completely refusing to comply with the law [1, 10, 24, 34, 37].
In this paper, we contribute to this line of research by presenting the results of a four-year undercover field study. While most existing work has investigated other types of public and private service providers, we focus on developers and vendors of mobile apps. This industry deserves scrutiny, as smartphones have become the primary device for many users [5].
Contribution. Until now, related studies (cf. Sect. 2) have evaluated data controller behavior based on one-time snapshot data. In contrast, we analyze how the behavior and compliance of a fixed set of app vendors change over time. Our longitudinal study includes observations both before and after the GDPR enforcement date. In three iterations between the years 2015 and 2019, we attempted to exercise the right of access with 225 vendors of mobile apps that were popular in Germany at the beginning of our study. Additionally, our first and our last round of inquiries included a question on third-party data sharing practices. In this paper, we first examine in detail the obstacles we encountered as well as the veracity and completeness of the responses received. Second, we describe the measures that app vendors apply to verify the requester's identity and how they choose to protect the confidentiality of transmitted personal information. Third, we shed light on the issue of dissolution of personal data that results from vendors going out of business, apps being discontinued, and stale user accounts being deleted without prior notice.
Outline. The remainder of this paper is structured as follows. First, we review related work in Sect. 2. In Sect. 3, we describe our data collection process. Then, we present our methodology for interaction with app vendors in Sect. 4. Section 5 summarizes the results of our study, a discussion of which is provided in Sect. 6. A reflection on ethical aspects and limitations of our study follows in Sect. 7 before we conclude the paper in Sect. 8.
2 RELATED WORK
Numerous studies have examined how data controllers react to data subject requests in practice. In these studies, test requests for data access, erasure, and/or portability were either sent to specific types of organizations, such as CCTV operators [34], smartphone app vendors and website owners [10], online tracking companies [37], or to a broad range of public and private sector organizations [1, 2, 18, 19, 24, 42, 43].
In [24], investigations were carried out in ten different EU member states and, in addition to the quantitative evaluation, a detailed case-by-case assessment is provided for individual data controllers. With the exception of [1], [37], and [42], the above-referenced studies were conducted prior to the GDPR coming into force. Also, in contrast to our study, none of the existing publications offers a multi-year longitudinal evaluation over a constant set of data controllers. Since 2010, the French Association of Data Protection Officers (official French name: Association Française des Correspondants à la Protection des Données à Caractère Personnel, AFCDP) has published a yearly report on the right of access, including results from probing 150 to 200 service providers [1]. However, their list of examined data controllers is newly compiled each year, which means that behavioral changes and trends within individual organizations are not observed.
In line with our findings, previous studies report various anomalies, poor practices, and severe compliance issues on the side of the data controllers, resulting in low rates of satisfying responses to data subject requests [1, 10, 19, 24, 34, 37]. Besides a widespread unwillingness or inability to provide the requested data in time [2, 10, 34, 37], researchers have observed the use of inappropriate file formats for the transfer of personal data [42], instances of personal information leakage to impostors [7, 10], issues concerning the language and clarity of interactions [24], and unsafe procedures to authenticate data subjects [4, 25]. In some cases, researchers were not even able to locate the contact details of data controllers, rendering any request submission impossible from the outset [17, 24].
While privacy and security aspects of mobile apps have received a lot of research attention in recent years [3, 15, 21, 30, 31], little is yet known about the behavior of app vendors when it comes to fulfilling data subject requests, especially not with the GDPR being in effect.
3 DATA COLLECTION
Our objective was to obtain a comprehensive sample of apps from
the iOS and Android app stores, considering a diverse set of app
types and vendors. We did not want to focus on the most popular
apps, because those might exhibit abnormal behavior as a result
of being in the spotlight. On the other hand, we did not want to
analyze outdated and niche apps with virtually no users.
To compile a suitable sample of popular mobile apps, we used market research information provided by AppAnnie (https://www.appannie.com). AppAnnie monitors the download counts of apps on the app stores of Google and Apple. Specifically, in August 2015, we obtained AppAnnie's ranking lists containing the most popular apps in Germany. In current mobile operating systems, apps are assigned categories that describe their primary function or subject matter (e.g., Education, Music, or Health). For each of these app store categories, AppAnnie provides a ranked list (of varying length) that contains the most popular apps in the respective category according to the number of downloads within a fixed period. There were 24 categories for Android and 21 categories for iOS apps.

[Figure 1: Popular apps dataset overview; figures refer to number of apps out of 225 apps in total. The sample comprises 105 iOS and 120 Android apps (5 apps per category), with vendors based in Germany (78), other EU member states (44), and non-EU countries (95); for 8 apps, the vendor's country of residence is unknown. iOS categories: Business, Catalogue, Education, Entertainment, Finance, Food, Games, Health, Lifestyle, Medical, Music, Navigation, News, Photo, Productivity, Reference, Social Networking, Sports, Travel, Utility, Weather. Android categories: Books, Business, Comics, Communication, Education, Entertainment, Family, Finance, Games, Health, Lifestyle, Media, Medical, Music, News, Photography, Productivity, Shopping, Social, Sports, Tools, Transportation, Travel, Widgets.]
We randomly sampled apps from each of AppAnnie's category lists, subject to the following conditions. We picked a random app, installed it, and checked whether it offered users the possibility to create a personal account or to enter personal information in some other way. If an app did not meet this requirement, we picked another random app from the respective list. We also skipped an app if our sample already contained another app from the same vendor. We sampled apps until we had collected five apps per category, amounting to a total sample size of 225 apps (120 Android apps, 105 iOS apps). As a result, apps span a wide number of popularity ranks (avg. rank within a category: 134, std. deviation: 111, min. rank: 1, max. rank: 407).
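The sampling procedure amounts to a simple rejection-sampling loop. The following minimal Python sketch illustrates it under stated assumptions: App and category_lists are hypothetical stand-ins for AppAnnie's ranked category lists, and the is_eligible predicate stands in for the manual check of whether an app lets users create an account or enter personal data in some other way.

    import random
    from collections import namedtuple

    App = namedtuple("App", ["name", "vendor"])

    def sample_apps(category_lists, is_eligible, per_category=5, seed=None):
        """Rejection-sample apps per category, mirroring the procedure above."""
        rng = random.Random(seed)
        sample, seen_vendors = [], set()
        for candidates in category_lists.values():
            remaining = list(candidates)
            picked = 0
            while picked < per_category and remaining:
                # Pick a random app from the category's ranked list.
                app = remaining.pop(rng.randrange(len(remaining)))
                if app.vendor in seen_vendors:  # at most one app per vendor
                    continue
                if not is_eligible(app):        # manual check in the study
                    continue
                sample.append(app)
                seen_vendors.add(app.vendor)
                picked += 1
        return sample

    # Example with invented data: two tiny categories, one app sampled each.
    lists = {"Music": [App("M1", "v1"), App("M2", "v2")],
             "News": [App("N1", "v1"), App("N2", "v3")]}
    print(sample_apps(lists, lambda app: True, per_category=1, seed=42))

The sketch simply stops when a category list is exhausted; in the study, all 45 category lists yielded five eligible apps.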
In the following, we characterize the dataset (see also Fig. 1). The largest proportion of the selected apps' vendors is based in Germany (35 %). Vendors located across other EU member states and outside of the EU account for 20 % and 42 %, respectively. For 3 % of the apps in our dataset, we could not find any information on the vendor's residence.
Like the German Federal Data Protection Act [26] and the now superseded EU Directive 95/46/EC [29], the GDPR can also apply to data controllers established in non-EU countries [40]. The GDPR explicitly states that it applies to organizations "not established in the Union, where the processing activities are related to the offering of goods or services, irrespective of whether a payment of the data subject is required, to such data subjects in the Union" (Art. 3 GDPR). While it may be difficult to exercise and enforce data protection rights against data controllers in foreign countries, we include app vendors based outside the EU in our study to compare their behavior with that of domestic vendors (see Sect. 5.5).
4 METHODOLOGY FOR INTERACTION WITH THE APP VENDORS
The apps in our sample were downloaded and installed on a smartphone with the corresponding operating system. Where possible, a new user account was created by signing up via email (76 %), Facebook (10 %), or Google (10 %); preference was given to signup via email if available. The remaining 4 % of the apps were populated with personal data without account registration.
For account creation and all other app interactions, we acted disguised as an ordinary consumer, always using the same identity of a volunteer (male, of legal age, German nationality). The app vendors were not informed in advance that they would be the subject of a study. We will reflect upon the necessity and ethical aspects of this covert approach in Sect. 7. Using the identity of a real person was necessary to overcome identification barriers (see Sect. 5.4).
After installation, we interacted with each app for about ten minutes, entering as much personal data as possible. Subsequently, over a period of four years, we issued three rounds of data subject requests to the app vendors (from now on referred to as R1, R2, and R3). R1 requests were issued in November 2015, several weeks after the apps were installed. R2 requests followed in March 2018, two months before the May 25 GDPR enforcement date. R3 requests were sent in August 2019, more than one year after the GDPR had come into effect.
The subject access requests were written in English or German (depending on the language of the app). We sent them to the vendors' email addresses specified on the respective pages in the app marketplace. If no contact details were provided there, we searched for an email address on the vendor's website or, if it was the only option available on the website, submitted the request via a contact form. Ultimately, we were able to find a point of contact for all apps for R1. We updated our list of vendor email addresses once again before sending out our subject access requests in R2 and R3, respectively.
To account for changing external circumstances and to avoid being recognized as researchers, we deliberately used a different request text for each round of inquiries. While the right of access was the main focus of our study and, therefore, all three rounds contained data access requests, we additionally requested information about data sharing practices in R1 and R3. For R1, we chose a short and informal inquiry text, comprising only seven sentences. The texts for R2 and R3 were more elaborate. They included references to relevant data protection laws (GDPR, EU Directive 95/46/EC, and the German Federal Data Protection Act) as well as a warning that the responsible data protection authorities would be notified in the case of no response (which, in fact, we did not do).
A more formal approach was chosen for R2 and R3 because the introduction of the GDPR and the wide-ranging preparations for the new law aroused increased media attention and public awareness of data subject rights and data protection issues in general [9, 32]. In addition, since 2018, more and more self-help tools and templates for GDPR requests have become available through websites like datarequests.org and gdpr.eu. We, therefore, assumed that ordinary smartphone users were now better equipped and thus more likely to make formal requests with legal references than they were in 2015, the year of R1.
To test how a reminder would affect the response rate, we sent a follow-up email to all vendors who had not replied to our request in R3. In the reminder (sent 85 days after the R3 request), we merely stated that we had not received a response to our previous message. Except where specifically indicated, the responses to this reminder will be ignored in the evaluation section, i.e., vendors who only responded after receiving a reminder will be counted as non-responders in R3.
5 ANALYSIS AND RESULTS
The received responses were subjected to a qualitative content anal-
ysis as proposed by Strauss and Corbin [
35
]. Following an open
coding approach, we built and rened a codebook based on the
data collected in R1. First, one coder went through the received
responses, creating and applying an initial set of codes to all encoun-
tered ndings. An independent second coder subsequently applied
the resulting list of codes to the same set of responses. The initial
coding yielded a Fuzzy Kappa value of 0.819, which indicates a high
degree of agreement between the two coders.
2
Subsequently, the
two coders consolidated the codebook, reassessed all responses, and
resolved all conicts. The emerging code categories and overarch-
ing themes were used to determine the foci of our paper, including
the reachability and responsiveness of app vendors, the provision
of the requested data, statements made on data sharing practices,
occurring technical and communication problems as well as the
security of data transmissions. A single coder applied the consol-
idated codebook to the responses in R2 and R3. The number of
assigned codes per app and iteration varies between 1 and 9.
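To make the agreement computation concrete, the following minimal Python sketch computes a simple per-response Jaccard overlap between two coders' non-exclusive code sets. This is a simplified illustration of multi-label agreement only, not the chance-corrected Fuzzy Kappa statistic by Kirilenko and Stepchenkova [11] that we report above; the codes in the example are invented.

    def mean_jaccard_agreement(coder_a, coder_b):
        """Average per-item Jaccard overlap between two coders' code sets.

        coder_a, coder_b: lists of sets of codes, one set per response.
        """
        assert len(coder_a) == len(coder_b)
        overlaps = []
        for a, b in zip(coder_a, coder_b):
            union = a | b
            overlaps.append(len(a & b) / len(union) if union else 1.0)
        return sum(overlaps) / len(overlaps)

    # Example: two coders labeling three responses with non-exclusive codes.
    a = [{"no_reply"}, {"data_export", "tls_missing"}, {"deceptive"}]
    b = [{"no_reply"}, {"data_export"}, {"deceptive", "impolite"}]
    print(mean_jaccard_agreement(a, b))  # (1.0 + 0.5 + 0.5) / 3 = 0.67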
We have released a sanitized dataset containing our findings for every app [12]. A summary of our results is provided in Table 1, at the end of this section. In the published dataset and in the remainder of this paper, the apps in our sample are referred to using the pseudonyms A1 to A225. Where applicable, the corresponding round of requests is indicated through a subscript; A5R1, for instance, designates the response from the vendor of App 5 to our first request. Section 5.1 provides an overview of our results, followed by an in-depth analysis of the obtained responses in Sects. 5.2–5.7. Unless specifically stated otherwise, percentage values refer to the whole sample of 225 apps. We will introduce capitalized labels to refer to particular subsets of apps.
5.1 Overview of the Received Responses
While the majority of the investigated app vendors reacted to our inquiries in some way (RESP: 78 %R1, 81 %R2, 74 %R3), many did not respond to our requests (22 %R1, 16 %R2, 22 %R3) and some were completely unreachable via email, yielding us only a delivery failure notification (0 %R1, 3 %R2, 4 %R3). The reminder we sent after our last round of requests increased the response rate for R3 slightly from 74 % to 80 %, showing that exercising data protection rights can sometimes require perseverance.
Out of all examined app vendors, only 14 %R1, 44 %R2, and 37 %R3 sent us a response that contained personal data (DATA). For reasons outlined in Sect. 5.2, however, many responses were unintelligible and useless. Overall, we received a proper usable export of personal data from merely 12 % of all vendors in R1, and from a significantly greater but still underwhelming proportion of 38 % in R2 and 28 % in R3. Some responses even contained deceptive or misleading statements (7 %R1, 13 %R2, 11 %R3), which will be elaborated on in Sect. 5.6.
In each iteration of our study, the vast majority of responses arrived within five days (82 %R1, 76 %R2, 83 %R3). Apart from a small number of exceptions, the rest arrived between day 6 and day 15 (14 %R1, 22 %R2, 11 %R3). A minor group of outliers (2 %R1, 0 %R2, 2 %R3) replied after more than 31 days, with one month being the time limit for responding to data protection rights requests as per Art. 12 GDPR. Before GDPR enforcement, neither the EU Directive 95/46/EC nor the German Federal Data Protection Act prescribed such a specific time limit. However, according to legal commentaries by Wolff et al. [41, § 34 Rn. 104-106.1] and Müller-Glöge et al. [22, § 34 Rn. 1], the maximum appropriate response time ranged between two and four weeks. Figure 2 shows the frequency distribution of responses over time.
Out of all received responses (RESP), the proportion that fulfilled our subject access request (OK) by either containing a proper export of personal data or a credible statement that the data is no longer stored was 19 %R1, 65 %R2, and 55 %R3. The others were recorded as incomplete responses. When taking all app vendors in our sample into account, including those who were unreachable or did not respond, the proportion of OK responses amounts to 15 %R1, 53 %R2, and 41 %R3. An overview of our evaluation for all three rounds of subject access requests is provided in Fig. 3.
While we cannot precisely determine their individual influence, it can be assumed that both the introduction of the GDPR and the more formal and threatening tone of our inquiries in R2 and R3 (cf. Sect. 4) had an impact on the vendors' behavior. In [10], data provision inquiries yielded better response rates when they were written more formally and included explicit references to relevant data protection law.
In cases where data was provided (DATA), it was made available either by email (97 %R1, 84 %R2, 79 %R3), by postal mail (0 %R1, 9 %R2, 2 %R3), or through the app itself (3 %R1, 7 %R2, 19 %R3). In most cases, personal data was provided in attachments in various file formats (namely pdf, html, json, csv, jpg, png, docx, and txt) and as plain text in the email body.
The following sections focus on particular aspects of the responses. First, we will consider deficiencies with format and content in Sect. 5.2. The degree to which the app vendors addressed our additional question on data sharing practices is examined in Sect. 5.3. After that, we consider the security of data in transit (Sect. 5.4) before we compare the responses in terms of their sender's residence (Sect. 5.5). Finally, we report on particularly intriguing aspects in Sects. 5.6 and 5.7, namely deceptive responses as well as the dissolution of personal data when vendors go out of business, apps are discontinued, and accounts are deleted without the user's request.
[Figure 2: Frequency distribution of responses over time. For each round (R1, R2, R3), the number of OK and incomplete responses is shown per response-duration bin (0-5, 6-10, 11-15, 16-20, 21-25, 26-31, and >31 days).]
[Figure 3: Evaluation of subject access request responses of all apps (n = 225). Per round, the chart shows the proportions of unreachable vendors (0.00 R1, 0.03 R2, 0.04 R3), vendors who did not reply (0.22 R1, 0.16 R2, 0.22 R3), and responses containing a data transmission (DATA: 0.14 R1, 0.44 R2, 0.37 R3), with the overall evaluation split into OK responses (0.15 R1, 0.53 R2, 0.41 R3) and incomplete responses (0.63 R1, 0.28 R2, 0.33 R3).]
5.2 Insufficient Responses
Failing to fulfill our subject access request, some of the received responses (RESP, n = 175R1, 183R2, 167R3) contained only the labels of collected data, e.g., "birth date", but not our actual data values, e.g., "1977-03-09" (22 %R1, 6 %R2, 5 %R3). The frequency of this phenomenon strongly decreased over the course of our study, with a particularly large drop between R1 and R2.
Even when actual data was transmitted (DATA, n = 32R1, 100R2, 84R3), it was often unintelligible due to serious formatting errors or obscure data labels (9 %R1, 14 %R2, 24 %R3). In some of these cases, the data consisted of a continuous block of alphanumeric characters without headings, spaces, and line breaks (out of ethical considerations, we refrain from publishing screenshots and excerpts of the responses; cf. Sect. 7). According to EU law, personal data should be made available "in a structured, commonly used and machine-readable format" (Art. 20 GDPR). Interestingly, despite the relatively short duration of app usage (10 to 15 minutes per app), we sometimes received data in large and confusing folder structures. The responses of A4R3 and A15R3, for example, contained 17 folders and 27 individual files, respectively, but only a handful of relevant data points. Also, in several cases, data was only transferred as an image file (e.g., a screenshot of an internal database), which made it impossible to extract information via copy and paste (3 %R1, 16 %R2, 4 %R3).
In other cases, technical or communication problems resulted in relevant information and personal data not being transmitted in the first place. These problems include dead download links and empty email attachments (A11R3, A29R3, A33R3, A146R3), unclear or incomprehensible instructions for how to obtain the data (A120R3, A150R3, A151R3, A225R3), references to inaccessible privacy functions inside the app (A157R3), malfunctioning authentication mechanisms (A87R3, A140R2, A178R3, A181R1, A202R3), and cases where the responsibility for privacy-related inquiries was referred back and forth between departments or affiliated companies (A21R1, A25R1, A118R3). In most of these incidents, pointing out the problem to the app vendor did not help. For example, we exchanged 24 emails with A151R3 in an attempt to clarify a technical issue before finally giving up because the responses had become repetitive. These observations are in line with those by Ausloos et al. [2] who found that subject access requests often lead to endless sequences of emails without resulting in the requested transfer of personal data.
Furthermore, some of the investigated app vendors appear to lack essential resources to receive and process subject access requests in a professional manner. For example, from A30R3, we received only a "recipient inbox full" error message. Other vendors explicitly stated that they lack the time, technical means, or personnel to compile and transmit the requested data (A105R2, A172R3), to reply in time (A50R3, A83R3), or to read emails in their inbox (A48R3, A199R3). While the many inadequate responses to our requests indicate widespread unawareness and ignorance about legal duties, some responses revealed a specific deficiency in legal knowledge. The person writing A112R2, for example, did not know that a right of access already existed in EU law before the GDPR. Also, after the commencement of the GDPR in May 2018, some vendors stopped offering a login function (A42R3, A58R3, A93R3) or even discontinued the entire app, stating explicitly their inability to comply with the new law (A62R3).
There were also language barriers. Although our requests were written in the corresponding app's default language (English or German), two vendors replied in Spanish (A108R3, A201R3) and one always replied in Korean (A149). A31R3, A51R3, and A174R3 changed the language of communication from English to German during our email correspondence without prior announcement. Another vendor used the online translation service Google Translate to communicate with us, which led to ambiguities and confusion (A150R3). A206R2 replied in bad English, and the responses A165R1 and A189R1 were utterly incomprehensible. Beyond that, some vendors used technical jargon, such as the term "SDK", short for "software development kit", i.e., a collection of software development tools (A210R3). While jargon does not render a response useless per se, ordinary users may have difficulty understanding it.
Astonishingly, three vendors stated that data exports are only provided for paid subscriptions, not for users of their free version (A89R2, A137R1, A202R2). And, in another grave violation of the most basic data protection principles, A100R2 sent us personal data of another user, including the person's full name, contact details, login credentials, app usage logs, and even location data. In one case, we received the response to our request from a foreign email address with no apparent relation to the app in question (A77R3).
Finally, some responses were impolite or contained careless mistakes. For instance, A161R2 declared – without any explanation – to have "no interest in having [us] as a customer." A193R3 and A224R3 replied without a formal salutation, A202R3 misspelled the name in the salutation, and both A13R3 and A151R3 sent us the same emails multiple times for no apparent reason. In a strange mistake, the support agent writing A89R3 even addressed us with their own name. Although these incidents were not severe obstacles to communication, they do indicate a lack of care and rigor in answering our requests.
5.3 Data Sharing with Third Parties
Along with the request for data access, in R1 and R3 we also requested information on third-party data sharing practices (cf. Sect. 4). Specifically, we asked the app vendors to name any third-party recipients to whom they had disclosed our personal data. Only a minority of the received responses (RESP, n = 175R1, 167R3) addressed this question at all (41 %R1, 40 %R3).
Out of this subset (DSANS, n = 72R1, 66R3), a decreasing proportion stated that personal data is never shared with third parties (63 %R1, 52 %R3). The remaining responses vary widely in terms of the length and detail in which data sharing practices are explained. While some state specifically what data is made available to which partners for what purpose (56 %R1, 37 %R3), others list only a few generic reasons for data sharing (18 %R1, 16 %R3) or potential data recipients without naming the respective purpose (11 %R1, 25 %R3). A few vendors disclose only the categories of data being shared (11 %R1, 6 %R3) or even refrain from providing any specifics beyond mentioning the existence of data sharing (4 %R1, 16 %R3).
Overall, only 36 %R1 and 32 %R3 of the received responses (RESP) adequately addressed our question by either assuring that no data is being shared with third parties or by providing a list of data recipients. And there is still room for doubt, even in these cases. In particular, the proportion of apps that explicitly mention third-party tracking (only 19 %R1 and 14 %R3 of DSANS) is far below what one should expect based on the results of large-scale empirical studies on the topic. Binns et al. [3], for example, examined the prevalence of third-party trackers on 959,000 Android apps and found that 90.4 % included at least one tracker host. Therefore, it is well conceivable that some of the responding app vendors withheld relevant information on their data sharing practices.
5.4 Identity Verification and Security of Transmitted Data
Before responding to a subject access request, data controllers must carefully verify the identity of the requester. In a recent study, Di Martino et al. [7] demonstrated that unauthorized third parties could fairly easily obtain personal data through fake subject access requests sent from a different email address. In our study, most of the app vendors that sent us a copy of personal data (DATA, n = 32R1, 100R2, 84R3) did not ask for prior identity verification (84 %R1, 85 %R2, 76 %R3). Note, however, that we sent our inquiries using the email address that we had used for account registration, which provides a minimal form of authentication by itself.
Only a minority of app vendors in DATA demanded certain pieces of identity-related information. We were asked to provide our birth date, address, and customer number (13 %R1, 2 %R2, 0 %R3) as well as a copy of a utility bill, ID card, or driving license (0 %R1, 5 %R2, 5 %R3). A problem with this type of authentication mechanism is that it often requires those seeking to exercise their rights to provide the data controller with additional personal information. This practice has been recognized as "contrary to the spirit of data protection law" [25]. Implementing a rather unusual method for authentication, A119R2 demanded that we "log into [our] social media account and post for public viewing a list of random characters." Surprisingly, in three cases (A5R2, A8R2, A157R2), personal data was disclosed to us before we had even responded to an authentication request.
In accordance with a GDPR best practice recommendation (from Recital 63 [28]), several app vendors did not directly send data to us. Instead, they provided remote access to the data through a self-service system within the password-protected app (3 %R1, 7 %R2, 19 %R3 of DATA). For this practice, a positive trend was observed over the three iterations of our study, with a particularly sharp increase between R2 and R3.
Some of the responders (RESP, n = 175R1, 183R2, 167R3) failed to use transport-layer encryption via TLS (Transport Layer Security) in at least one of their emails. Transport-layer encryption is a baseline security measure, which has been deemed mandatory for the transmission of personal data via email [36]. For our analysis, we checked whether all relevant Received headers within emails contained signs of TLS being used (indicated by "SMTPS" or the statement of the negotiated cipher suite). Here, again, the worst result was recorded in R1, where 13 % of all responses (RESP) were not encrypted. This figure improved and remained stable over R2 and R3 (both 3 %). A21R2 and A75R1 even transmitted the requested personal data without encryption. Besides, as indicated by email metadata (again, Received headers), many app vendors handle customer support via third-party services, e.g., ticket system software operated by Zendesk, Salesforce, Helpscout, Mandrill, and others. As a result, even vendors that are themselves based in the EU may disclose personal data to third parties in other jurisdictions. In these cases, exercising one's right to access may disperse one's personal data to additional data processors.
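A header check of this kind is easy to script. The following Python sketch is our own illustration (not the exact tooling used in the study): it scans the Received headers of a stored email for common TLS indicators, namely the (E)SMTPS "with" keyword from RFC 3848, a TLS version string, or an explicitly stated cipher suite; the file name response.eml is a placeholder.

    import email
    import re
    from email import policy

    TLS_HINTS = re.compile(
        r"with\s+\S*SMTPSA?\b"  # (E)SMTPS(A) transmission keyword (RFC 3848)
        r"|\bTLSv?[0-9_.]*"     # version strings such as "TLSv1.2" or "TLS1_2"
        r"|\bcipher\b",         # explicitly stated cipher suite
        re.IGNORECASE,
    )

    def received_hops_with_tls(raw_message: bytes):
        """Return (tls_hops, total_hops) for the Received chain of one email."""
        msg = email.message_from_bytes(raw_message, policy=policy.default)
        received = msg.get_all("Received") or []
        tls_hops = sum(1 for hop in received if TLS_HINTS.search(hop))
        return tls_hops, len(received)

    with open("response.eml", "rb") as f:  # placeholder file name
        tls, total = received_hops_with_tls(f.read())
    print(f"{tls} of {total} Received hops show signs of TLS")

Note that such indicators only reveal hop-by-hop transport encryption between mail servers; they say nothing about end-to-end confidentiality of the message content.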
Where data was transferred (DATA), two other security mechanisms we observed were the expiration of download links after a few days or weeks (0 %R1, 2 %R2, 10 %R3) and the protection of email attachments with passwords (6 %R1, 3 %R2, 12 %R3). Except for one case where the corresponding password was communicated via telephone (A225R1), the access keys were sent via email, sometimes in the same email as the password-protected data (A28R3, A211R3). Finally, for security reasons, several vendors chose to send the data by postal mail (0 %R1, 9 %R2, 2 %R3) – sometimes even by registered mail (A16R2, A81R2, A107R2, A225R2).
Our correspondence with the vendors indicates that the employed authentication mechanisms are not the result of rigorous attacker modeling. Given the prevalence of TLS on mail servers, they provide little to no additional security against passive eavesdroppers. More importantly, they cannot protect against an adversary that has already compromised a user's email account. Many pieces of information that have been requested from us for identity verification can be extracted from previously exchanged emails. Moreover, adversaries that control the email account of a user receive data and accompanying passwords when these are sent via email. Such adversaries could also compromise a user's account on the app itself via the commonplace email-based password reset functionality, either to steal data directly from within the app or to change the postal address to which the data is sent.
5.5 Comparison by Vendor’s Residence
In the following, we explore behavioral differences between the app vendors from our sample that are based in Germany (GER, n = 78), in other EU member states (EMS, n = 44), and in the rest of the world (WLD, n = 95). Apps for which the vendor's residence could not be determined (n = 8) are not considered in this analysis.
Across all three iterations of our study, the proportion of vendors responding to our inquiry was always somewhat higher among GER (82 %R1, 88 %R2, 78 %R3) than among EMS (75 %R1, 77 %R2, 77 %R3) and WLD (75 %R1, 77 %R2, 69 %R3). We will refer to these responses as RESPGER (n = 64R1, 69R2, 61R3), RESPEMS (n = 33R1, 34R2, 34R3), and RESPWLD (n = 71R1, 73R2, 66R3). Throughout R1 and R2, the proportion of responses containing an export of personal data was highest among RESPGER (30 %R1, 71 %R2, 61 %R3). RESPEMS caught up towards the end of our study (21 %R1, 62 %R2, 62 %R3), leaving the rate of data transmissions among RESPWLD far behind (7 %R1, 38 %R2, 38 %R3). We name these data-containing responses DATAGER (n = 19R1, 49R2, 37R3), DATAEMS (n = 7R1, 21R2, 21R3), and DATAWLD (n = 5R1, 28R2, 25R3).
Where sufficient for comparison, only the percentage figures from the most recent inquiry (R3) will be shown below. If the previous rounds of requests yielded deviating results, we will explicitly mention them.
In terms of OK responses (a usable export of personal data or a credible statement that no personal data is being stored), the three geographical subsets followed a similar development. The results show a sharp improvement between R1 and R2 followed by deterioration in R3 (GER: 51 %R3, EMS: 45 %R3, WLD: 31 %R3). In all three runs, GER had the highest rate of OK responses; the lowest rate was consistently found among WLD. Contributing to this discrepancy, Germany-based vendors scored better on the structure and intelligibility of transmitted data. Due to a lack of these qualities, 24 %R3 and 33 %R3 of the data exports provided by DATAWLD and DATAEMS, respectively, were useless. Among DATAGER, this figure was relatively low (19 %R3).
And while the geographical subsets had comparable response rates to our question on data sharing practices in R1 (see Sect. 5.3), GER achieved by far the best result in our last round of requests (49 %R3), followed by EMS (25 %R3) and then WLD (17 %R3).
Averaging over all responses received in our three rounds of
inquiries, the vendors residing in Germany took longer to respond
(5.5 days) than the vendors based in other EU member states (3.2
days) and outside of the EU (2.9 days). However, measured against
legally prescribed time constraints (see Sect. 5.1), all of these average
durations are acceptable.
The Germany-based vendors exhibited substantially inferior performance in the area of requester authentication. Compared to the already surprisingly small portion of vendors who conducted some form of identity check among DATAEMS (48 %R3) and DATAWLD (28 %R3), authentication requests among DATAGER were much sparser (8 %R3). In our first inquiry, this figure had been 21 %R1 for DATAGER and 0 %R1 for DATAWLD, but subsequently rose sharply for DATAWLD and, contrary to our expectations, decreased considerably for DATAGER. Before taking the sudden lead in R3, DATAEMS had much weaker authentication results (14 %R1, 10 %R2).
At the end of our study, more apps from GER (10 %) and especially EMS (14 %) had been discontinued than from WLD (7 %). The proportion of disappearing user accounts (see Sect. 5.7) was distributed fairly evenly across the three groups.
5.6 Deceptive and Misleading Responses
Many of the responding app vendors (RESP, n = 175R1, 183R2, 167R3) made misleading or demonstrably false statements. Several claimed that the app in question or our user account no longer existed – although we were still able to install, log in to, and use the app on our test devices (6 %R1, 11 %R2, 6 %R3). Some vendors even contradicted themselves. For example, A34R3 repeatedly picked up specific points from our email request while pretending that the same message had never arrived. A188R3 claimed that no data related to our account had been collected. A213R3 stated that, as a matter of principle, copies of personal data would never be provided for security reasons. Both companies, however, had already sent us a copy of stored personal data a few days before. Moreover, some app vendors falsely claimed that they had already replied to our request earlier (A127R3) or falsely promised to get back to us within a specified time (A2R3, A33R3, A59R2, A102R2, A118R3, A126R3).
Besides the capacity limits mentioned in Sect. 5.2, one reason for these observed inconsistencies and contradictions seems to be a lack of standardized processes for dealing with subject access requests. Our inquiries were sometimes processed and answered simultaneously by multiple different departments or employees (A4R3, A21R2, A37R3, A106R3, A153R3, A188R3, A191R3, A197R3, A213R3). Additionally, in some cases where a data export was provided, we noticed that the export was not complete, i.e., some pieces of personal data stored in the app had been omitted for unknown reasons (A47R2, A158R1, A173R1).
We also suspected that some vendors merely pretended to be poorly reachable when they received subject access requests – while others actually had insufficient resources to process incoming emails. To test this hypothesis, we examined how the vendors that failed to respond to our requests reacted to non-privacy-related inquiries. Using another (different) fake identity, we emailed the vendors who had not replied in R1 and R3, expressing interest in promoting their apps on a personal blog (R1) or YouTube channel (R3). Out of the group of initial non-responders, 31 %R1 and 22 %R3 replied to these dummy requests, many of them within a few hours, proving that their email inboxes were in fact being monitored.
5.7 Discontinued Apps and Accounts
Manual checks revealed that prior to R2, 4 % of the initial 225 apps no longer existed. 9 % were gone prior to R3. Some of the corresponding vendors (GONE, n = 10R2, 21R3) still responded to our requests, informing us that their app had been discontinued (30 %R2, 14 %R3) or providing us with a copy of personal data that was still being stored (10 %R2, 0 %R3). The majority of them, however, did not address our request in their response or remained silent (50 %R2, 57 %R3), and some were no longer reachable via email (10 %R2, 29 %R3).
Even among the vendors of the apps that could still be installed (NOTGONE, n = 225R1, 215R2, 204R3), several responded that none of our submitted personal data is stored on their servers (8 %R1, 24 %R2, 17 %R3). While some of these vendors (NODATA, n = 17R1, 51R2, 34R3) explained that they only process data locally on the user's device without ever having direct access to it (24 %R1, 25 %R2, 24 %R3), most of them simply stated that a matching record no longer existed in their database (76 %R1, 75 %R2, 76 %R3).
Based on manual login attempts, 27 % of all user accounts that we had initially created were no longer accessible at the end of our study, including discontinued apps (GONE). In a few astounding cases, our user account was deleted or deactivated in direct response to our inquiry (A12R3, A93R2, A133R2, A161R2, A209R2) – either with unconvincing excuses or with no justification at all. A12R3, for example, cited "credit card problems" as the reason, although we had never connected a credit card with any of the investigated apps.
All account discontinuations and erasures of personal data were carried out without any request or consent from our side. It is striking that we found no provisions in the corresponding privacy policies or terms of service that would explain the deletions. Also, apart from two vendors who gave notice that our account would be discontinued due to inactivity between R1 and R2 (A84, A106), we received no account deletion notification outside the responses to our requests. Neither a terminated app nor a deleted account necessarily implies, of course, that all of the user's data was, in fact, entirely deleted by the app vendor and potential affiliates. This is well illustrated by a few cases where the vendor transferred our personal data despite having discontinued the app or our user account (A93R2, A128R2, A133R2).
Among the vendors who provided an export of personal data (DATA, n = 32R1, 100R2, 84R3), many included in their response an unsolicited offer to delete our user account and the related data – especially in R2 and R3 (9 %R1, 39 %R2, 37 %R3). It seems that the loss of a single user is often perceived as advantageous over bearing the effort and potential legal trouble associated with responding to data subject requests.
Table 1: Summary of results; figures refer to number of apps out of 225 apps in total.

                                             R1 (2015)  R2 (2018)  R3 (2019)
Request Response (Sect. 5.1)
  Vendor unreachable (delivery failure)              1          6          9
  Vendor did not respond                            49         36         49
  Vendor responded (RESP)                          175        183        167
  Response sufficient (OK)                          33        119         92
  Response contains data (DATA)                     32        100         84
  Response deceptive or misleading                  13         24         19
  Average response time [in days]                  3.7        3.8        4.0
Data Sharing Practices (Sect. 5.3)
  Response addresses data sharing (DSANS)           72        n/a         66
  Response states that data is shared               27        n/a         32
  Response states that data is not shared           45        n/a         34
  Sufficient information on data sharing            63        n/a         54
Requester Authentication (Sect. 5.4)
  Authentication mechanism in place                  5         15         20
  App login required                                 1          7         16
  Identification document required                   0          5          4
  Identity-related information required              4          2          0
  Other authentication mechanism                     0          1          0
Security of Transmitted Data (Sect. 5.4)
  Data file(s) protected with password               2          3         10
  Download link has expiration date                  0          2          8
  Data export via postal mail                        0          9          2
  TLS encryption failure                            23          6          5
Existence of App & Account (Sect. 5.7)
  Discontinued apps (aggregated)                     0         10         21
  Discontinued user accounts (aggregated)            0         49         61
6 DISCUSSION
The results of our four-year undercover study reveal severe obstacles to exercising the right of access in the mobile app space. We found that the documented problems largely persist over time and have not substantially improved since 2014, when Herrmann and Lindemann obtained success rates that are comparable with the ones we obtained in 2019 [10].
In certain areas, we did observe positive trends. Between 2015 and 2019, the investigated vendors seem to have increased their willingness and ability to disclose stored personal information to data subjects upon request. Out of all examined app vendors, the proportion that managed to provide a proper export of personal data rose from 12 %R1 to 28 %R3. Overall, the rate of acceptable responses (OK) to our subject access requests went up from 15 %R1 to 41 %R3 (see Sect. 5.1).
There was also a crucial improvement in the protection of private messages and transmitted personal information. Specifically, among all received replies (RESP, n = 175R1, 183R2, 167R3), we observed a reduction of unencrypted email correspondence from 13 %R1 to 3 %R3. And we noticed a growing tendency among the vendors to not send data exports directly through email or postal mail but to make them available via a portal in the password-protected app (see Sect. 5.4). Among all vendors who sent us a copy of personal data (DATA, n = 32R1, 100R2, 84R3), the prevalence of this practice increased from 3 %R1 to 19 %R3.
Even with these improvements, however, getting access to one's data is still a frustrating endeavor most of the time. While we observed substantial improvements between 2015 (R1) and 2018 (R2), most of the identified problems (e.g., low overall response rate, incomplete responses, deceptive and misleading statements, unintelligible data exports) remained static or even worsened between R2 and our last round of inquiries in 2019 (R3). Besides the impact of the emerging GDPR, the significant improvements between R1 and R2 can probably be attributed to a considerable extent to the less formally written request in R1 (see Sect. 4), reflecting previous findings from [10]. Although formal and informal requests have the same legal validity, our results suggest that successfully exercising data protection rights in practice may require a certain manner of expression and a degree of legal expertise, potentially going beyond the capacity of average smartphone users.
Why has there been no further improvement between R2 and R3? One explanation may be that app vendors have realized that the mostly underfunded and poorly organized supervisory authorities have insufficient resources to enforce the regulation and penalize misconduct [33, 38, 39]. As a result, there is little incentive for vendors to handle subject access requests professionally. The handling of our requests varied significantly, indicating a lack of well-established best practices in the mobile app industry.
We observed numerous alarming cases of disregard for the most basic information security and data protection principles. Among other failures, 76 to 85 % of the data exports (DATA) were provided without any attempt at verifying the requester's identity. Where authentication mechanisms were in place, they were sometimes accidentally circumvented by the app vendors themselves. We also received exports of personal data via non-encrypted channels and were even given access to personal data not belonging to us (see Sect. 5.4).
Furthermore, we observed that many of the investigated apps were discontinued (9 %), with some vendors becoming completely unreachable during our study (4 %). The latter case can create unsettling uncertainty for data subjects, leaving them in doubt about the fate of their data. Given this "personal data dissolution," users can no longer get answers to the essential question, "Who knows what about me?" Answering this question, in turn, is a natural prerequisite for exercising the remaining ARCO rights (access, rectification, cancellation, opposition) in an informed manner, and thus an essential basis for informational self-determination [17]. While the case of unreachable and disappearing data controllers illustrates the problem well and potentially exacerbates it, any non-reaction or incomplete response to a subject access request can, of course, have the same consequence.
Personal data dissolution does not only result from vendors going out of business or apps being discontinued. Another source is user accounts that disappear without prior notice: 27 % of the initial accounts were gone by the end of our study. Of course, blocking or deleting accounts after long periods of user inactivity is a legitimate concern of app providers. The practice also aligns with the GDPR's storage limitation principle [8] and is often in the interest of the data subject by helping to prevent the accumulation of unused "zombie accounts" [23]. However, it seems evident that affected users should be informed in advance, be offered a final copy of their data, and be asked for explicit consent if the data is retained for further use. With few exceptions, these basic rules were not followed by the respective vendors in our study. In the vast majority of cases, we did not even receive an account deletion notification.
Overall, our results show very clearly that current processes for receiving and dealing with data subject requests have plenty of room for improvement. Existing research-based suggestions and recommendations for data controllers need to be compiled into actionable guidelines and distributed in a form that makes them digestible for small- and medium-sized organizations, such as app vendors. This includes, in particular, guidance on how to authenticate data subjects safely [4, 7, 25], how to transfer personal data [42], and how to facilitate the submission of requests [2, 10, 17]. It should be a key objective to replace the error-prone manual processing of data subject requests with automated and standardized interfaces for obtaining personal data and other privacy-related information. To incentivize the rapid and broad-scale adoption of such approaches, industry-specific legal requirements along these lines could be helpful.
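Purely as an illustration of what such a standardized, machine-readable interface could look like (no such standard exists today; the endpoint path, token, and field names below are invented), a subject access export could be served from an authenticated HTTP endpoint that returns data in a structured format, in line with Art. 20 GDPR. The sketch uses only the Python standard library (Python 3.9+):

    import json
    from http.server import BaseHTTPRequestHandler, HTTPServer

    # Hypothetical in-memory store; a real service would authenticate the
    # requester robustly (e.g., via OAuth) before releasing personal data.
    USER_DATA = {"token-123": {"name": "Jane Doe", "signup": "2015-09-01",
                               "shared_with": ["analytics.example"]}}

    class SubjectAccessHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            if self.path != "/privacy/export":
                self.send_error(404)
                return
            token = self.headers.get("Authorization", "").removeprefix("Bearer ")
            record = USER_DATA.get(token)
            if record is None:
                self.send_error(401, "authentication required")
                return
            body = json.dumps(record).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)

    if __name__ == "__main__":
        HTTPServer(("localhost", 8080), SubjectAccessHandler).serve_forever()

A standardized endpoint of this kind would remove the manual email handling that caused most of the problems we observed, while still leaving the choice of authentication mechanism to the vendor.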
Examining a fixed group of data controllers over a sustained period of time has proven to be suitable for this type of investigation and should be used more widely. The basic issues and challenges to exercising data protection rights in practice are now well understood. More longitudinal research, however, is needed to monitor whether progress is being made, to assess the impact of legal measures, and to track and refine emerging best practices.
Future studies and discussions on the privacy behavior of mobile apps should take into account that the stored personal data is not necessarily limited to recorded actions and inputs of the user. Patterns and correlations in collected data may leak additional information in a way that is not easily understood or anticipated by the user [13, 14, 16]. To achieve an adequate level of transparency, it would be necessary for users to be informed about such forms of data acquisition as well.
7 ETHICAL CONSIDERATIONS AND LIMITATIONS
As explained in Sect. 4, we interacted with the app vendors disguised as an ordinary app user. One crucial ethical aspect of undercover field studies is that resources of the studied subjects are consumed without their consent – namely, in our case, the working time of employees of the examined app vendors. However, as in related studies [1, 2, 10, 24, 37, 42], the covert approach was necessary to test the behavior of the vendors under realistic conditions and to prevent a research participation effect [20]. It can be assumed that employees of the investigated companies would have put more care and diligence into answering our inquiries if they had known about the nature of our study. For this reason, we consider undercover field research the only viable approach to investigating problems related to exercising data protection rights in practice.
It is not the purpose of this paper to name and shame individual
companies. Therefore, we only referred to apps from our sample
using pseudonyms. Moreover, we have decided not to inform the
investigated companies about the study we have conducted to pro-
tect the responsible employees from negative consequences. For the
same reason, after careful consideration and given the many inade-
quate responses to our inquiries, we also decided to publish neither
the collected postal and email correspondence nor the received data
records – not even in pseudonymized form. There would always be
a residual risk that a vendor might recognize itself. In the interest
of reproducibility and traceability, we did, however, release our re-
sults dataset, which consists of a sanitized and comprehensive table
containing all findings for every app (represented by the numeric
ID used in this paper) [12].
Our study has several limitations that need to be recognized.
First, we want to stress that we have studied a particular sample of
mobile apps. Our results, therefore, cannot be generalized to the
whole population of apps, nor to all popular apps. Also, as explained
in Sect. 4, we used a different request text for each iteration of our
study, varying in length, formality, and in terms of the laws that
were cited. Thus, the results obtained from R1, R2, and R3 are not
directly comparable.
Furthermore, sending three repeated requests with long pauses between them, after only a few minutes of initial user activity on each app, presumably deviates strongly from normal user behavior. This activity pattern may have aroused curiosity or suspicion among the investigated companies and potentially affected their response to our inquiries. Simulating an active user
on 225 apps over a period of four years was not feasible with our
available resources. On the other hand, this approach allowed us to
measure the extent of personal data dissolution (cf. Sect. 6).
Other surrounding factors, such as the increasing public interest
in data protection issues [9, 32] and several smartphone-related privacy scandals [6, 27], could also have affected compliance
with subject access requests in the mobile app industry. Therefore,
despite the longitudinal nature of our study, we cannot measure
the impact of the newly enacted GDPR in isolation.
8 CONCLUSION
In this longitudinal study, we investigated the obstacles faced by
data subjects in exercising their right to access with mobile app
vendors. Our results indicate positive trends in selected areas. The
overall situation, however, is still as unsatisfactory today – with
GDPR in force – as it was at the beginning of our study in 2015.
Even in the second iteration of our study, in which the reactions to
our subject access request were the most promising, we received an
acceptable response from only 53 % of the examined vendors. In the
first and third iteration, this figure was 15 % and 41 %, respectively.
Response rates to questions we asked about third-party data sharing
practices were similarly disappointing.
Besides a general lack of responsiveness, the observed problems range from malfunctioning download links and authentication mechanisms, through confusing data labels and file structures, to impoliteness, incomprehensible language, and even serious cases
of carelessness and data leakage. It is evident from our results that
there are no well-established and standardized processes for subject
access requests in the mobile app industry. Moreover, we found that
many vendors lack the motivation to respond adequately. Many of
the responses we received were not only completely insufficient,
but also deceptive or misleading. Equally worrisome are cases of
unsolicited dissolution of personal data, for instance, due to the
apparently widespread practice of deleting stale accounts without
prior notice.
With regard to the sensitive personal data that is regularly collected by mobile apps, this deficient and stagnating status quo is hardly tolerable. The situation could be improved by a combination of random compliance checks by authorities and better support for data controllers through industry-specific guidelines and best practices. In particular, there should be mandatory
standard interfaces for providing data exports and other privacy-
related information to data subjects, obviating the need for the
manual processing of GDPR requests.
REFERENCES
[1] Association Française des Correspondants à la protection des Données à caractère Personnel. 2020. Données personnelles - Index AFCDP 2020 du Droit d'accès. https://afcdp.net/index-du-droit-d-acces/
[2] Jef Ausloos and Pierre Dewitte. 2018. Shattering One-Way Mirrors: Data Subject Access Rights in Practice. International Data Privacy Law 8, 1 (2018), 4–28.
[3] Reuben Binns, Ulrik Lyngs, Max Van Kleek, Jun Zhao, Timothy Libert, and Nigel Shadbolt. 2018. Third party tracking in the mobile ecosystem. In Proceedings of the 10th ACM Conference on Web Science. 23–31.
[4] Coline Boniface, Imane Fouad, Nataliia Bielova, Cédric Lauradoux, and Cristiana Santos. 2019. Security Analysis of Subject Access Request Procedures. In Annual Privacy Forum. Springer, 182–209.
[5] Christina Bröhl, Peter Rasche, Janina Jablonski, Sabine Theis, Matthias Wille, and Alexander Mertens. 2018. Desktop PC, tablet PC, or smartphone? An analysis of use preferences in daily activities for different technology generations of a worldwide sample. In International Conference on Human Aspects of IT for the Aged Population. Springer, 3–20.
[6] Catalin Cimpanu. 2019. Another Facebook privacy scandal, this time involving its mobile analytics SDK. ZDNet (Feb. 2019). https://www.zdnet.com/article/another-facebook-privacy-scandal-this-time-involving-its-mobile-analytics-sdk/
[7] Mariano Di Martino, Pieter Robyns, Winnie Weyts, Peter Quax, Wim Lamotte, and Ken Andries. 2019. Personal Information Leakage by Abusing the GDPR "Right of Access". In Proceedings of the Fifteenth USENIX Conference on Usable Privacy and Security (SOUPS'19). USENIX Association, USA, 371–386.
[8] Majid Hatamian. 2020. Engineering Privacy in Smartphone Apps: A Technical Guideline Catalog for App Developers. IEEE Access 8 (2020), 35429–35445.
[9] Alex Hern. 2018. What is GDPR and how will it affect you? The Guardian (May 2018). https://www.theguardian.com/technology/2018/may/21/what-is-gdpr-and-how-will-it-affect-you
[10] Dominik Herrmann and Jens Lindemann. 2016. Obtaining personal data and asking for erasure: do app vendors and website owners honour your privacy rights?. In GI Sicherheit 2016. Gesellschaft für Informatik e.V., Bonn, 149–160. arXiv:1602.01804
[11] Andrei P. Kirilenko and Svetlana Stepchenkova. 2016. Inter-Coder Agreement in One-to-Many Classification: Fuzzy Kappa. PLOS ONE 11, 3 (03 2016), 1–14.
[12] Jacob Leon Kröger, Jens Lindemann, and Dominik Herrmann. 2020. Subject Access Request response data - 105 iOS and 120 Android apps. https://doi.org/10.14279/depositonce-10338
[13] Jacob Leon Kröger, Otto Hans-Martin Lutz, and Florian Müller. 2020. What does your gaze reveal about you? On the privacy implications of eye tracking. In Privacy and Identity Management. Springer, 226–241.
[14] Jacob Leon Kröger, Otto Hans-Martin Lutz, and Philip Raschke. 2020. Privacy Implications of Voice and Speech Analysis – Information Disclosure by Inference. In Privacy and Identity Management. Springer, 242–258.
[15] Jacob Leon Kröger and Philip Raschke. 2019. Is My Phone Listening in? On the Feasibility and Detectability of Mobile Eavesdropping. In Data and Applications Security and Privacy XXXIII. Springer International Publishing, Cham, 102–120.
[16] Jacob Leon Kröger, Philip Raschke, and Towhidur Rahman Bhuiyan. 2019. Privacy implications of accelerometer data: a review of possible inferences. In Proceedings of the 3rd International Conference on Cryptography, Security and Privacy. 81–87.
[17] Xavier Duncan L'Hoiry and Clive Norris. 2015. The honest data protection officer's guide to enable citizens to exercise their subject access rights: lessons from a ten-country European study. International Data Privacy Law 5, 3 (2015), 190–204.
[18] Xavier L'Hoiry and Clive Norris. 2017. Exercising Access Rights in the United Kingdom. In The Unaccountable State of Surveillance. Springer, 359–404.
[19] René Mahieu, Hadi Asghari, and Michel van Eeten. 2018. Collectively exercising the right of access: individual effort, societal effect. Internet Policy Review 7, 3 (2018).
[20] Jim McCambridge, John Witton, and Diana R Elbourne. 2014. Systematic review of the Hawthorne effect: new concepts are needed to study research participation effects. Journal of Clinical Epidemiology 67, 3 (2014), 267–277.
[21] Nurul Momen, Majid Hatamian, and Lothar Fritsch. 2019. Did App Privacy Improve After the GDPR? IEEE Security & Privacy 17, 6 (2019), 10–20.
[22] Rudi Müller-Glöge, Ulrich Preis, and Ingrid Schmidt (Eds.). 2016. Erfurter Kommentar zum Arbeitsrecht (ErfKoArbR) (16 ed.). Beck, München.
[23] David Nield. 2019. How To Clear Out Your Zombie Apps and Online Accounts. Wired (May 2019). https://www.wired.com/story/delete-old-apps-accounts-online/
[24] Clive Norris and Xavier L'Hoiry. 2017. Exercising Citizen Rights Under Surveillance Regimes in Europe – Meta-analysis of a Ten Country Study. In The Unaccountable State of Surveillance. Springer, 405–455.
[25] Chris Norval, Heleen Janssen, Jennifer Cobbe, and Jatinder Singh. 2018. Reclaiming data: Overcoming app identification barriers for exercising data protection rights. In ACM International Joint Conference and International Symposium on Pervasive and Ubiquitous Computing and Wearable Computers. 921–930.
[26] German Federal Office of Justice. 2019. Federal Data Protection Act (BDSG). https://www.gesetze-im-internet.de/englisch_bdsg/englisch_bdsg.html#p0014
[27] Kate O'Flaherty. 2019. Huawei Security Scandal: Everything You Need to Know. Forbes (Feb. 2019). https://www.forbes.com/sites/kateoflahertyuk/2019/02/26/huawei-security-scandal-everything-you-need-to-know/
[28] European Parliament and Council of the European Union. 2016. Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation). https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=uriserv:OJ.L_.2016.119.01.0001.01.ENG
[29] Article 29 Data Protection Working Party. 2010. Opinion 8/2010 on applicable law (0836-02/10/EN). WP 179 (2010).
[30] Anthony Quattrone, Lars Kulik, Egemen Tanin, Kotagiri Ramamohanarao, and Tao Gu. 2015. PrivacyPalisade: Evaluating app permissions and building privacy into smartphones. In 2015 10th International Conference on Information, Communications and Signal Processing (ICICS). IEEE.
[31] Abbas Razaghpanah, Rishab Nithyanand, Narseo Vallina-Rodriguez, Srikanth Sundaresan, Mark Allman, Christian Kreibich, and Phillipa Gill. 2018. Apps, Trackers, Privacy, and Regulators: A Global Study of the Mobile Tracking Ecosystem. In Proceedings 2018 Network and Distributed System Security Symposium. Internet Society, San Diego, CA.
[32] Adam Satariano. 2018. G.D.P.R., a New Privacy Law, Makes Europe World's Leading Tech Watchdog. The New York Times (May 2018). https://www.nytimes.com/2018/05/24/technology/europe-gdpr-privacy.html
[33] Adam Satariano. 2020. Europe's Privacy Law Hasn't Shown Its Teeth, Frustrating Advocates. The New York Times (April 2020). https://www.nytimes.com/2020/04/27/technology/GDPR-privacy-law-europe.html
[34] Keith Spiller. 2016. Experiences of accessing CCTV data: The urban topologies of subject access requests. Urban Studies 53, 13 (2016), 2885–2900.
[35] Anselm Strauss and Juliet Corbin. 1990. Basics of Qualitative Research. Sage Publications.
[36] Jörg Thoma. 2014. Datenschutzbeauftragter mahnt mangelnde Verschlüsselung an. Golem (Sept. 2014). https://www.golem.de/news/bayern-datenschutzbeauftragter-mahnt-mangelnde-verschluesselung-an-1409-109260.html
[37] Tobias Urban, Dennis Tatang, Martin Degeling, Thorsten Holz, and Norbert Pohlmann. 2019. A Study on Subject Data Access in Online Advertising after the GDPR. In Data Privacy Management, Cryptocurrencies and Blockchain Technology. Springer, 61–79.
[38] Siddharth Venkataramakrishnan. 2020. GDPR accused of being toothless because of lack of resources. Financial Times (April 2020). https://www.ft.com/content/a915ae62-034e-4b13-b787-4b0ac2aaff7e
[39] Nicholas Vinocur. 2019. 'We have a huge problem': European tech regulator despairs over lack of enforcement. Politico (Dec. 2019). https://www.politico.com/news/2019/12/27/europe-gdpr-technology-regulation-089605
[40] Paul Voigt and Axel von dem Bussche. 2017. Scope of Application of the GDPR. In The EU General Data Protection Regulation (GDPR). Springer, 9–30.
[41] Heinrich Amadeus Wolff and Stefan Brink (Eds.). 2018. Datenschutzrecht in Bund und Ländern (23 ed.). Beck, München.
[42] Janis Wong and Tristan Henderson. 2019. The right to data portability in practice: exploring the implications of the technologically neutral GDPR. International Data Privacy Law 9, 3 (2019), 173–191.
[43] Nils Zurawski. 2017. Exercising Access Rights in Germany. In The Unaccountable State of Surveillance. Springer, 109–133.