Assessing Privacy Policies of Internet of Things Services

Niklas Paul¹ [0000-0001-8283-4751], Welderufael B. Tesfay¹ [0000-0002-1087-2019], Dennis-Kenji Kipker² [0000-0003-1454-591X], Mattea Stelter² [0000-0002-3627-8990], and Sebastian Pape¹ [0000-0002-0893-7856]

¹ Goethe-University, Frankfurt, Germany
² University of Bremen, Bremen, Germany
Abstract. This paper provides an assessment framework for privacy policies of Internet of Things services which is based on particular GDPR requirements. The objective of the framework is to serve as a supportive tool for users to make informed privacy-related decisions. For example, when buying a new fitness tracker, users could compare different models with respect to privacy friendliness or to particular aspects of the framework, such as whether data is given to a third party. The framework consists of 16 parameters with one to four yes-or-no questions each and allows users to bring in their own weights for the different parameters. We assessed 110 devices, which had 94 different policies. Furthermore, we did a legal assessment for the parameters to deal with the case that a policy makes no statement at all regarding a certain parameter. The results of this comparative study show that most of the examined privacy policies of IoT devices/services are insufficient to address particular GDPR requirements and beyond. We also found correlations between the length of a policy and both its privacy score and its transparency score.
Keywords: Internet of Things, Privacy Policies, General Data Protection Regulation, GDPR, ePrivacy Regulation, ePR
1 Introduction
Privacy is a major but early-stage research topic in the Internet of Things (IoT), where many questions are still inadequately addressed [1]. Studies indicate that "six in ten Internet of Things devices don't properly tell customers how their personal information is being used" [2] and that "nearly all areas (of the Internet of Things) miss applicable mechanisms in privacy" [3]. This collection and processing of personal, sometimes sensitive, information has raised privacy concerns of users. A survey in 2016 revealed that 53% of 797 IT professionals are very concerned about privacy in IoT, so the topic already seems relevant in professional circles [4]. With the increasing complexity of the products users have to deal with, it is likely that such concerns will arise among non-professional users as well.
Thus, regulators require service providers to publish their data processing practices. As such, terms and conditions and privacy policies are used to inform users about the purpose of data collection and processing. However, only a small proportion of users read these documents [5, 6], mainly due to the length of the texts and their difficult legal jargon. Therefore, it is widely accepted to confirm a policy without reading it, even though users in general should read them [7]. As a consequence, users are not aware that a large number of policies elude domestic justice, contain user-unfriendly parts, or state suspect purposes of private data use, e.g. collecting information and using it as "a new source of revenue" by selling it or for advertising purposes [8].
To give a methodological assessment of this problem, in this work we introduce a framework for evaluating privacy policies of Internet of Things (IoT) devices based on General Data Protection Regulation (GDPR) aspects as assessment criteria. The framework gives an overview of the contents of certain policies and further ranks them based on their scores pertinent to these criteria. The objective of the framework is not to provide binding legal guidance, but to serve as a supportive tool for users to make informed privacy-related decisions. For example, when buying a new fitness tracker, users could compare different models with respect to privacy friendliness or to particular aspects of the framework, such as whether data is given to a third party.
The remainder of the paper is structured as follows: Sect. 2 briefly introduces the regulatory background on which our framework is based. After that, related work is presented in Sect. 3, along with how this work differs from it. In Sect. 4 we present our research methodology, and in Sect. 5 the assessment framework is introduced. In Sect. 6 we present the results of a first assessment and statistical analyses. In Sect. 7, we discuss results and limitations of the framework and suggest future work. We conclude in Sect. 8.
2 Background
Internet of Things (IoT) refers to the networked interconnection of everyday objects, which are often equipped with ubiquitous intelligence [9]. Usually users can extend the control of IoT devices by using an application on their phone, tablet or computer. Since IoT services require a certain amount of personal information to determine user behaviour and process electronic data automatically, they are regulated by the General Data Protection Regulation (GDPR) [10] and the ePrivacy Regulation (ePR) [11]. In this section, we give a brief overview of the GDPR and ePR with a focus on how to utilize them as the foundation for the privacy policy assessment framework.
2.1 General Data Protection Regulation
The General Data Protection Regulation, adopted by the European Parliament on 14 April 2016 and becoming effective as of 25 May 2018, will replace the Data Protection Directive (1995/46/EC). The regulation is the result of the EU's objective to harmonize the several data protection provisions existing at European and national level and thereby to strengthen data protection throughout the EU³. Unlike the previous directive, the new regulation does not require transposition into national laws and will be directly applicable in all Member States. Henceforth, national legislation that diverges from the GDPR provisions will be allowed only within various opening clauses contained in the regulation. Since the GDPR "lays down rules relating to the protection of natural persons with regard to the processing of personal data" [10, Article 1 para. 1], it is also addressed to suppliers of IoT products. According to Article 3 of the regulation, the GDPR thereby does not only apply to EU-based producers of IoT devices, but also to all enterprises established outside the EU that offer their products on the European market. Therefore, the provisions of the GDPR can serve as uniform assessment criteria for comparing the level of data protection ensured for IoT devices whose producers are located across the world.
Of particular importance for the evaluation of privacy policies is Article 13
GDPR, which specifies the information to be provided where personal data are
collected from a data subject. These information obligations follow from the
transparency principle laid down in Article 5 GDPR. The mandatory information
includes, inter alia, identity and contact details of the product provider as well as
full details on the purposes of the data processing, the storage period, the various
rights of the data subject under Articles 12-23 GDPR, or, where applicable, the
disclosure of data to a third party and the transfer of data to third countries.
2.2 ePrivacy Regulation
However, the legislative process on the harmonisation of European data protection law is not yet completed. Apart from the GDPR, the ePrivacy Regulation is intended to replace the outdated Privacy and Electronic Communications Directive (2002/58/EC) and to supplement the GDPR as regards the electronic communication sector. Although the ePrivacy Regulation initially had been expected to become effective at the same time as the GDPR on 25 May 2018, it is currently still at the draft stage [11]. While trilogue negotiations between the Parliament, the Commission and the Council are about to take place, the high level of data protection provided in the proposal is strongly criticised by the media and advertising industries⁴. The exact scope of the ePrivacy Regulation and its relation to the GDPR remain controversial, too [13]. Thus, it does not appear appropriate to include the current draft regulation in this assessment framework: the discrepancies that have to be resolved prior to the adoption of a final version are too fundamental. However, in the future, legal requirements for IoT devices will be significantly determined not only by the GDPR, but also by the ePrivacy Regulation: Recital 12 of the proposed regulation explicitly states that the scope of the regulation also covers the transmission of machine-to-machine communications, which is the essential characteristic of the Internet of Things. The regulation's entry into force is not expected before 2019 [14].

³ See, inter alia, Recitals 6, 7, 9, 10 of the GDPR.
⁴ See, for example, the campaign by several industry associations [12].
3 Related Work
Even though information privacy is a concern for users and IoT operators, so far it seems to be addressed inadequately. However, there are still some promising efforts, which we summarize below. Stankovic [1] proposed a new language for privacy policies in IoT to address emerging problems of privacy. Ziegeldorf et al. stated seven categories of privacy threats in the Internet of Things, introducing four new categories of privacy threats especially in the Internet of Things [15]. The threat of life-cycle transition (changes of control spheres, e.g. through selling) is considered in this framework as well.

Smith, Milberg and Burke found five central dimensions of concerns about privacy practices, namely collection of personal information, internal unauthorized secondary use of personal information, external unauthorized secondary use of personal information, and finally errors and improper access [16]. All these previously mentioned dimensions should be addressed in a privacy policy; they are also, to some extent, part of the requirements for the assessment framework and can be considered as its basis.
Previous studies examined the existence of policies rather than assessing their content [17]. Previous work that took the content into account mainly dealt with privacy policies of websites, but not of IoT services and, respectively, the apps to control them [18, 17, 19]. For example, some of them used the Fair Information Practices (FIPs) for the content and the Flesch grade level [20] for assessing the readability, with the result that the examined policies were difficult to read and required a higher education level. The Flesch Score is based on the average length of a sentence and the average number of syllables per word; the higher the score, the easier the text is to read.
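For reference, the standard Flesch Reading Ease formula [20], which the chapter refers to but does not spell out, is:

```latex
% Flesch Reading Ease (Flesch, 1948); higher values mean easier text
\mathrm{FRE} = 206.835
  \;-\; 1.015 \cdot \frac{\#\text{words}}{\#\text{sentences}}
  \;-\; 84.6 \cdot \frac{\#\text{syllables}}{\#\text{words}}
```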
Over time, more mathematical approaches that calculate scores were established, but also rankings based on a crowdsourcing approach [19]. In 2017, the project "Ranking Digital Rights" evaluated a set of companies based on 35 parameters in three groups, namely governance, freedom of expression and privacy [21]. The privacy category was by far the largest, consisting of 18 parameters. It examined a broad variety of characteristics, reaching from simple and easy policy access to the supply of information about potential cyber risks. Notably, they assessed not only one service of each company but a service portfolio. The project "Terms of Service; Didn't Read" uses a less mathematical approach [22]. Based on crowdsourcing, they present summaries and a rating of the terms of 8 services that are assessed by other users on their website. The problem with this and other crowdsourcing solutions is that the scope is highly dependent on participation [23]. To overcome this, the project "Privee" uses a combination of crowdsourcing and automated classification [23]. While most previous work deals with website privacy policies, there are also works assessing privacy aspects of apps [24].
4 Methodology
This section briefly describes how the framework was designed, how the assessed
policies were selected, and how the assessment procedure was.
4.1 Framework Development
The main goal of this work is to create an assessment framework for privacy policies that can assess a large variety of IoT devices. Therefore, applicable parameters are needed. The framework is strongly inspired by the GDPR (cf. Sect. 2), but we also considered the categories of privacy threats from Ziegeldorf et al. [15] and the dimensions of concerns about privacy practices from Smith et al. [16] (cf. Sect. 3). For each of the parameters we identified relevant yes-or-no questions. For all categories, we did a legal assessment to check how we should cope with a missing statement. We explain this in more detail in Sect. 5.1.
We identified two important dimensions for the framework: (i) the content dimension (Privacy Score) and (ii) the transparency dimension (Transparency Score). They differ insofar as the transparency dimension checks whether the policy makes a statement at all, while the content dimension checks what statement the policy makes.
4.2 Policy Selection
To get an overview of the available products on the market, two websites (http://IoTLineup.com and http://IoTList.co) were used. Since many listed devices did not exist anymore, we searched in web shops (e.g. Amazon) for similar products. As the framework is built on the GDPR and the GDPR applies only to services provided to EU citizens, the product must be available on the European market. Criteria defining what products are available in terms of the GDPR can be found in Recital 23 [10] and were checked by searching the manufacturer's website and web shops. We did not assess policies where we could not find the IoT device available on the European market.

Another condition was that the policy needed to be available in English. If no general EU-English policy was available, an English version applicable in Germany was looked for, or otherwise the UK one was chosen. Sometimes, e.g., US policies are slightly different from EU policies. If there were a US and an EU policy available, the EU one was chosen. If some parts of the policy were applicable to specific countries, the descriptions for Germany or otherwise another EU country were preferred. If there was no distinction between EU and non-EU or no declaration of where the policy applies, it was assumed to be a global policy, which is also permitted in the framework.

To find the policies, we first searched the website of the manufacturer, after that we searched for the policy in the Google Play Store, and as a last resort we contacted the manufacturer via e-mail and asked for the according policy.
4.3 Assessment Procedure
The assessment was done manually by reading the policies and applying all parameters to them. The number of words and the Flesch Score were calculated automatically by an online tool [25]; the remaining questions are yes-or-no questions. To record the results of the assessment, a table-workbook with several sheets was created, containing an overview of all policies and one sheet for every assessment. The assessment scorecard is a table with general information (e.g. name, ID, category) in the header and all parameters beneath. For both the Privacy Score and the Transparency Score there are columns where the answer and the corresponding points were saved. We also stored the segment of the privacy policy which was relevant for the scoring, to allow using this data as a training set for a machine learning algorithm later.
5 Assessment Framework for Privacy Policies
The framework consists of 16 parameters, all but the first having up to four yes-or-no questions. As already discussed, parameters are assessed towards a privacy score and a transparency score. The answer to each question is assessed, and the awarded points sum up to a score for the parameter. Every parameter has a separate score. To balance the different number of questions, the score for each parameter is then normalized to be between 0 and 1. For questions that cannot be answered with yes or no (e.g. clicks needed), a table assigned the answers to points within this interval. Since convergence to the privacy-protective condition of a parameter raises the score, the score can be interpreted as "the higher the score, the better the privacy practices". The transparency score can be interpreted analogously.
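To illustrate the normalization, a minimal sketch in Python (our own illustration; the chapter does not prescribe an implementation, and we assume the normalized parameter score is the mean of its question values):

```python
def parameter_score(question_values):
    """Normalize a parameter's question values (each in [0, 1], mostly
    0/1 for yes-or-no questions) to a single score between 0 and 1."""
    return sum(question_values) / len(question_values)

# Example: a parameter with four yes-or-no questions, two of which are
# answered in a privacy-friendly way, scores 0.5.
print(parameter_score([1, 0, 1, 0]))  # 0.5
```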
Agrawal et al. [19] weighted their categories with an importance factor, which is the case on the parameter level in this framework as well. Users can set a weighting factor for each parameter to operationalize their personal preferences. If the user is not able to come up with weights easily, the framework can also be used as the basis for an Analytic Hierarchy Process (AHP) like approach [26]. Hereby, the importance of every parameter is compared pairwise to each other, and the result is a parameter importance ranking, as the sketch below illustrates. However, with an increasing number of parameters, respondents might perceive this approach as exhausting. For the remainder of this work the weighting factor was set to 1.
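For readers unfamiliar with AHP, the sketch below shows one common way to turn a pairwise comparison matrix into parameter weights; this is our own illustration (using the geometric-mean approximation of the principal eigenvector), not part of the original framework:

```python
import numpy as np

def ahp_weights(pairwise):
    """Derive normalized weights from a reciprocal pairwise comparison
    matrix via the geometric-mean approximation of the principal
    eigenvector."""
    m = np.asarray(pairwise, dtype=float)
    row_gm = m.prod(axis=1) ** (1.0 / m.shape[0])  # geometric mean per row
    return row_gm / row_gm.sum()                   # normalize to sum to 1

# Example with three parameters: the first is judged three times as
# important as the second and five times as important as the third.
print(ahp_weights([[1, 3, 5], [1/3, 1, 2], [1/5, 1/2, 1]]))
# approximately [0.65, 0.23, 0.12]
```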
To make it easy for the user to see where a policy is positioned within the range of 100%, letters are assigned to relative scores. We divided the range of possible scores into five quintiles, such that a relative Privacy Policy Score (PPS) or a relative Transparency Score (TS) of more than 80% gets the best ranking ("A"), while scores of 20% and less get the worst ranking ("E").
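A minimal sketch of this quintile-to-letter mapping (our own illustration; the treatment of scores exactly on a quintile boundary is our assumption):

```python
def letter_rank(relative_score):
    """Map a relative score in [0, 1] to the quintile letters A-E,
    where A is best (> 0.8) and E is worst (<= 0.2)."""
    for lower_bound, letter in [(0.8, "A"), (0.6, "B"), (0.4, "C"), (0.2, "D")]:
        if relative_score > lower_bound:
            return letter
    return "E"

print(letter_rank(0.85))  # A
print(letter_rank(0.15))  # E
```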
5.1 Parameters
The 16 parameters of the framework (cf. Tab. 1) cover different categories like accessibility, readability, the right to object, access, erasure and data portability. Whether the policy considers special treatment of children's data and the utilization of special data categories (health, race, sex, ...) is covered as well. There are also separate parameters for the involvement of a third party, notifications of changes or data breaches, and notes on utilization for advertisement. Due to space limitations, we are not able to describe each parameter and its reasoning in detail, but for transparency each related GDPR article is noted in column § of Tab. 1.
5.2 Transparency Score
As shown in Tab. 1, all parameters are considered for the transparency score. Since the score models whether the policy makes a statement, the value of a parameter question is 1 if the policy answers the question (irrespective of how it is answered) and 0 if the question is not answered or is answered contradictorily.
Relative Transparency Score. The transparency score is based on the sum of the 16 parameters that each have a value between 0 and 1. The score for service $i$ is calculated by formula (1), where $T_{i,j} \in \{0, 1\}$ represents the corresponding value of the parameters and $w_j$ is the weighting factor for parameter $j$. With $T^{*}_{j} = 1$ as the best possible score of parameter $j$, we get:

$$\text{Relative } TS_i = \frac{\sum_{j=1}^{n} w_j T_{i,j}}{\sum_{j=1}^{n} w_j T^{*}_{j}} = \frac{\sum_{j=1}^{n} w_j T_{i,j}}{\sum_{j=1}^{n} w_j} \qquad (1)$$
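A minimal sketch of formula (1) in Python (names are ours); formula (2) in Sect. 5.3 has the same structure, with privacy scores and weights $x_j$ instead:

```python
def relative_score(values, weights):
    """Weighted relative score as in formulas (1) and (2): since the best
    possible score of every parameter is 1, the denominator reduces to
    the sum of the weights."""
    assert len(values) == len(weights)
    return sum(w * v for w, v in zip(weights, values)) / sum(weights)

# 16 transparency parameter values, all weights set to 1 as in the paper:
t = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1, 0, 1, 1, 0, 1, 0]
print(relative_score(t, [1] * 16))  # 0.5625
```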
5.3 Privacy Score
The privacy score needs a more differentiated view on the parameters. Some parameters, like the Flesch Reading Ease Score or whether the policy is a multi-device policy, can be assessed for all policies (cf. Tab. 1, sign: ✓). We did not consider the parameters marked with ✗ in Tab. 1, because some of them do not refer to the content of the policy, e.g. how easy it is to find the policy. Others do not necessarily need to be provided, e.g. the GDPR already states when a notification of policy changes needs to be provided. Gluck et al. [27] found contradicting signs: although shorter notices are typically expected to be more effective, removing expected privacy practices from privacy policies sometimes led to less awareness of those practices, without improving awareness of the remaining practices. Thus, we decided not to consider these parameters for the privacy score.

However, there are also parameters which need to be stated (cf. Tab. 1, sign: y), e.g. the right to data portability, where we considered their absence negative for the privacy friendliness. In contrast, there are parameters which are in general not expected, but required if the service provider follows a certain practice (cf. Tab. 1, sign: x), e.g. the transfer of data to third parties. Therefore, if no statement was given, we considered them to be positive for the privacy friendliness.

The parameter marked with • should only apply to devices which are used by children. Since for many devices there is no clear statement of the target audience, we considered it only for toys.
Table 1. The Framework's Parameters with their Questions and how the Parameters are Considered for Transparency (T) and the Privacy Friendliness of the Policy (P).

| # | Parameter Name | Parameter Description | T | P | § |
|---|---|---|---|---|---|
| 1 | Easily Acc. Form | 1) Readability (Flesch Reading Ease Score) | ✓ | ✓ | 12 |
| 2 | Right to Object | 1) Does the policy state a right to object? 2) Is an objection as easy as a consent? | ✓ | y | 6, 7, 13, 21 |
| 3 | Children | 1) Is a binding age limit to use the service stated? 2) Is there a special policy for children? 3) Is there a mechanism to ensure that parents agree with the processing? 4) Does the policy state the procedure if children's data has been processed unintentionally? | ✓ | • | 8 |
| 4 | Processing of Special Categories of Personal Data | 1) Are special personal data categories processed? 2) Is it required contentwise for using the service? 3) Is there an explicit consent? | ✓ | x | 9, 13 |
| 5 | Necessary Information | 1) Are identity and contact details of the controller stated? 2) Is a data protection officer stated? 3) Are the purposes of the processing for which the personal data are intended stated? | ✓ | y | 13 |
| 6 | Period of Storage | 1) Is the storage period stated? 2) Are criteria determining the period stated? | ✓ | y | 13 |
| 7 | Right of Access | 1) Is the right of access stated? 2) Is a fee charged? | ✓ | y | 12, 13, 15 |
| 8 | Right to Erasure | 1) Is the right to erasure stated? 2) Is the time to fulfil the erasure request stated? 3) Period until fulfilment | ✓ | y | 12, 13, 17 |
| 9 | Data Portability | 1) Is the right to data portability mentioned? | ✓ | y | 13, 20 |
| 10 | Third Countries | 1) Is data processed in third countries? 2) Does the policy state these countries? 3) Is data transferred to countries with an adequate level of protection (e.g. EU-U.S. Privacy Shield)? | ✓ | x | 45, 46, 47, 49 |
| 11 | Data Breach Notification | 1) Is a personal notification after a data breach explicitly stated? 2) Period until notification | ✓ | ✗ | 34 |
| 12 | Third Parties | 1) Is a third party involved by design? 2) Does the policy state who the third party is? 3) Does the policy explicitly state the purpose? 4) Is the scope of the transferred data stated? | ✓ | x | 13 |
| 13 | Search for the Policy | 1) Is there a link on the homepage that leads to the policy for the device quickly? 2) How many clicks are needed from the homepage to find the link to the policy? | ✓ | ✗ | 12, 13 |
| 14 | Change Notificat. | 1) Is there a notification after policy changes? | ✓ | ✗ | 13 |
| 15 | Special Device Policy | 1) Is the present policy a multi-policy? 2) Is it clear that the policy is for the IoT product? | ✓ | ✓ | |
| 16 | Lifecycle | 1) Can information stored on the device be deleted? | ✓ | y | |

✓: Used, ✗: Not used, x/y: If not present, rated positive/negative, •: Only for toys
[Figure 1: two histograms showing the frequency of the relative PPS and of the relative TS across the examined policies; both x-axes range from 0.0 to 0.8.]
Fig. 1. Histogram of PPS and TS of Examined Policies
Relative Privacy Policy Score. The value which enables comparisons across different policies is called the relative Privacy Policy Score (relative PPS). The relative PPS for service $i$ is calculated by formula (2), where $j$ is the parameter id, $x_j$ is the weighting factor for parameter $j$, and $P_{i,j}$ is the score of parameter $j$ for service $i$. With $P^{*}_{j} = 1$ as the best possible score of parameter $j$, we get:

$$\text{Relative } PPS_i = \frac{\sum_{j=1}^{n} x_j P_{i,j}}{\sum_{j=1}^{n} x_j P^{*}_{j}} = \frac{\sum_{j=1}^{n} x_j P_{i,j}}{\sum_{j=1}^{n} x_j} \qquad (2)$$
6 Results
A set of 113 IoT devices was created, but while collecting policies we found three products without a policy, which would have been ranked with 0% in both dimensions. For legibility reasons we removed these and ended up with 110 products to assess. They were divided into three umbrella categories, Smart Home, Smart Health and Toys, which are subdivided into groups, e.g. Thermostat, Light, Washer, etc. Some privacy policies covered multiple devices, or they were a privacy policy for all of the company's services. Following the assessment procedure described in Sect. 4.3, privacy policies were assessed and ranked based on their achieved privacy and transparency scores. In the end, we assessed 94 policies: 14 policies covered 30 devices and 80 policies were for a single IoT device. Two devices changed their policy during the assessment period.
6.1 Ranking Results
Table 2 shows the results of the privacy and transparency scores grouped into the respective subgroups. Figure 1 presents histograms of the relative privacy policy score and the relative transparency score.
Table 2. Summary Statistics of Examined Policies

| Area | Subarea | # | PPS A/B/C/D/E | Rel. PPS Mean (%) | STD | TS A/B/C/D/E | Rel. TS Mean (%) | STD |
|---|---|---|---|---|---|---|---|---|
| Smart Home | Coffee Machine | 5 | 0/0/1/4/0 | 31.67 | 8.39 | 0/0/4/1/0 | 47.50 | 10.37 |
| Smart Home | Light | 5 | 0/0/2/3/0 | 35.56 | 8.67 | 0/1/4/0/0 | 53.75 | 6.04 |
| Smart Home | Security | 9 | 0/0/3/5/1 | 32.80 | 11.36 | 0/1/7/1/0 | 48.61 | 9.80 |
| Smart Home | Thermostat | 6 | 0/0/3/3/0 | 36.69 | 11.10 | 0/1/4/1/0 | 50.43 | 11.35 |
| Smart Home | Washer | 5 | 0/1/2/2/0 | 37.91 | 20.83 | 0/1/3/1/0 | 54.17 | 12.68 |
| Smart Home | Others | 28 | 0/0/7/21/0 | 34.71 | 8.95 | 0/5/20/3/0 | 50.52 | 8.99 |
| Smart Home | Total | 58 | 0/1/17/38/2 | 34.70 | 10.50 | 0/9/42/7/0 | 50.55 | 9.37 |
| Health | Fitness Tracker | 7 | 0/0/2/5/0 | 36.11 | 6.39 | 0/1/6/0/0 | 53.72 | 4.91 |
| Health | Scale | 15 | 0/0/1/12/2 | 28.75 | 11.56 | 0/3/6/6/0 | 43.89 | 12.93 |
| Health | Others | 5 | 0/0/1/4/0 | 33.89 | 8.22 | 0/1/4/0/0 | 52.29 | 6.93 |
| Health | Total | 27 | 0/0/4/21/2 | 31.61 | 10.14 | 0/5/16/6/1 | 47.99 | 11.18 |
| Toy | Total | 9 | 0/0/3/6/0 | 34.05 | 12.66 | 0/2/6/1/0 | 50.92 | 13.18 |
| All | Total | 94 | 0/1/24/65/4 | 33.75 | 10.59 | 0/16/64/14/0 | 49.85 | 10.26 |
6.2 Statistics on the Privacy Policies
The results do not appear to resemble a normal distribution. We conducted a Shapiro-Wilk test [28] to confirm or reject this hypothesis. It is a high-quality test for normality that can be applied to relatively small samples. The p-value indicates how likely it is to obtain such results from a normal distribution. With a p-value of 0.1368 for the relative PPS and a p-value of 0.3146 for the relative TS, we assume that the distribution of the privacy scores and the distribution of the transparency scores are not close to a normal distribution.
Due to the results of Gluck et al. [27], we were also interested in the relationship between the length of the privacy policies and their privacy and transparency scores. Since the plots (cf. Fig. 2) show some clusters, we conducted Spearman correlation tests [29]. For the correlation between the number of words in the policy and the privacy score we found a moderate effect size (ρ_PPS ≈ 0.518 with p-value 8.8 · 10⁻⁸). Analogously, for the correlation between the number of words in the policy and the transparency score we found a strong effect size (ρ_TS ≈ 0.723 with p-value 2.2 · 10⁻¹⁶). Both correlations are statistically highly significant and allow us to conclude that there is a relationship between the length of the policy and the privacy and transparency scores, respectively.
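The chapter does not state which statistical software was used; as a sketch, both tests can be reproduced with SciPy given per-policy word counts and scores (the function arguments below are placeholders):

```python
from scipy.stats import shapiro, spearmanr

def analyze(word_counts, rel_pps, rel_ts):
    """Shapiro-Wilk normality tests and Spearman rank correlations as
    described in Sect. 6.2."""
    print("Shapiro-Wilk p (PPS):", shapiro(rel_pps).pvalue)
    print("Shapiro-Wilk p (TS): ", shapiro(rel_ts).pvalue)
    rho_pps, p_pps = spearmanr(word_counts, rel_pps)
    rho_ts, p_ts = spearmanr(word_counts, rel_ts)
    print(f"Spearman length vs. PPS: rho={rho_pps:.3f}, p={p_pps:.2g}")
    print(f"Spearman length vs. TS:  rho={rho_ts:.3f}, p={p_ts:.2g}")
```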
7 Discussion
The ranking of both scores within the quintiles shows that no policy could get an A-rating. This might improve when the GDPR is put in place in May 2018. However, being compliant with the GDPR could also mean informing about certain privacy practices without these practices being more privacy friendly. Difficulties in finding the right policy also raise the question of whether companies use privacy policies to inform the users or just as a legal cover.
[Figure 2: two scatter plots of the number of words (0 to 10,000) against the relative PPS and the relative TS, respectively.]
Fig. 2. Relationship between Length and Relative PPS/TS
The correlation between scores and length should not be misunderstood as a motivation to provide longer policies just because longer policies seem to score better. More likely, the result is due to the fact that longer policies can cover more topics. We expect there to be a certain length at which this effect inverts.
7.1 Limitations and Threats to Validity
Despite all care, the assessment framework cannot replace the detailed analysis of a lawyer. Although the questions are derived from GDPR requirements, they cannot capture every legal nuance. Additionally, it was not possible to test the implementation of the policies: all assessment is based on the written policy, and it is not guaranteed that companies follow their own rules. Future research should crosscheck the contents and the execution of a policy. Labels like TRUSTe, which the FTC approach took into account as a measure of enforcement [18], can be an indicator that policies indeed reflect practices. Nevertheless, even for labels like TRUSTe, there is reason for critique, e.g. regarding their meaningfulness [30].
We only examined English privacy policies. We cannot exclude that the policies' contents differ between the different language versions. According to Article 12 of the GDPR, the policy must be provided "in a concise, transparent, intelligible and easily accessible form, using clear and plain language". The availability of a language other than English is not explicitly mentioned in the GDPR, but the line of argument could be that this supports the requirements.

A weak point of parameter 13 (Search for the Policy) is that the effort to find a policy is not a reliable measure, because it depends on who looks for it. Some companies use the same policy for their products as for their websites, and some companies do not declare the range of application, which makes it difficult to ensure that the present policy is the right one for the IoT product. However, we could statistically show that there was no learning effect when searching for the policy, since the number of steps was not significantly lower for the policies investigated last.
7.2 Future Extension of the Framework
One design goal of this framework was its openness to extensions. New parameters can easily be added; the utilization of a relative score instead of an absolute score makes allowance for this, because it allows a step-wise re-assessment. One can easily think of further requirements for a good privacy policy/practice which are not considered in this framework yet, and future work could create new parameters to operationalize them. We list some of the additional parameters we also considered and assessed, but did not include in the final version of the framework. One is the procedure of data sharing after a corporate merger or bankruptcy: does the parent company have access to personal information after a merger? We did not include this parameter in the final framework, because we could not find a statement on how reliable such a declaration would be if there really were a merger or bankruptcy. Another is a parameter considering the data processing if the user is not the owner, but e.g. a guest in a smart home where microphones listen for commands and thereby also listen to guests who have not given consent [31]: is the scenario of an incidental use considered, and are there mechanisms to protect against it? Since, as of today, this seems to be an unresolved issue, we did not consider this parameter in our framework either. For the same reason, we did not consider interacting systems, where each system has its own privacy policy and there is a chance of inconsistencies arising when systems work together.
8 Conclusion and Future Work
This paper presents an extendable assessment framework for privacy policies
consisting of 16 parameters. We collected 94 privacy policies covering 110 devices.
Users can look up certain topics or compare devices according to their own
preferences.
The results of this comparative study show that most of the examined privacy policies of IoT devices/services are insufficient to address the GDPR requirements and beyond. Many topics are currently not addressed in privacy policies but will need to be covered by May 2018, when the GDPR comes into effect.

Difficulties in finding the right policy raise the question whether the purpose of privacy policies is to inform the users and make them conscious of the data processing, or whether they are just a legal cover; this deserves further research. The transparency dimension tried to operationalize this aspect, but further development and improvement of this dimension is required.
During the analysis it also seemed as though products on the European market have fewer functionalities than US products. Some devices are not even available to EU citizens, perhaps due to the higher requirements of European law. Future work could check this impression. Additionally, there might be differences in the content of the same policies in different languages, and future research should include a comparison.
To make people more aware of the shortcomings of privacy policies, a public ranking website should be designed. Based on the current framework, users could set their privacy preferences and a personalized score could be calculated. Awareness of privacy topics might help to push companies to reform their practices. To avoid manually processing a larger number of policies, an automatic assessment tool could be designed and developed, e.g. based on a machine learning approach. In particular, we aim at extending the framework by using the assessed privacy policies as a corpus and building predictive models using machine learning and natural language techniques. Furthermore, considering semantic features of privacy policies could allow analyzing and benchmarking IoT privacy policies with high accuracy. Such automatic and adaptive models, coupled with usable and informative user interfaces, can be helpful to support users in analyzing and retracing the data processing practices of the IoT services they intend to subscribe to.
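As one possible starting point for such an automated assessment (purely illustrative; the chapter does not specify a model), the stored policy segments and the manual answers could train a per-question text classifier:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical training data: policy segments stored during the manual
# assessment, labeled with the answer to one yes-or-no question
# (here: "Is the right to erasure stated?").
segments = ["We share your data with advertising partners ...",
            "You may request the deletion of your account and data ..."]
labels = [0, 1]

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
clf.fit(segments, labels)
print(clf.predict(["Data subjects can have their data erased on request."]))
```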
Acknowledgments
This research was partly funded by the German Federal Ministry of Education
and Research (BMBF) with grant number: 16KIS0371.
References
1. Stankovic, J.A.: Research Directions for the Internet of Things. IEEE Internet of Things Journal 1(1) (2014) 3–9
2. Information Commissioner's Office: Privacy regulators study finds Internet of Things shortfalls (2016)
3. Mayer, C.P.: Security and Privacy Challenges in the Internet of Things. In: Electronic Communications of the EASST. Volume 17. (2009)
4. DZone: The DZone guide to Internet of Things (2016)
5. Milne, G.R., Culnan, M.J.: Strategies for reducing online privacy risks: Why consumers read (or don't read) online privacy notices. Journal of Interactive Marketing 18(3) (2004) 15–29
6. European Commission: Special Eurobarometer 431: Data Protection Report (2015)
7. Jensen, C., Potts, C., Jensen, C.: Privacy practices of internet users: Self-reports versus observed behavior. International Journal of Human-Computer Studies 63(1-2) (2005) 203–227
8. Casadesus-Masanell, R., Hervas-Drane, A.: Competing with Privacy. Management Science 61(1) (2015) 229–246
9. Xia, F., Yang, L.T., Wang, L., Vinel, A.: Internet of Things. International Journal of Communication Systems 25(9) (2012) 1101–1102
10. European Parliament, Council of the European Union: Regulation (EU) 2016/679 General Data Protection Regulation (GDPR). http://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX:32016R0679 (2016)
11. European Commission: Proposal for a Regulation on Privacy and Electronic Communications (ePrivacy Regulation). http://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX:52017PC0010 (2017)
12. European Interactive Digital Advertising Alliance (EDAA): The e-privacy regulation – good or bad for european consumers? http://www.likeabadmovie.eu/ (2018)
13. Engeler, M., Felber, W.: Draft of the ePrivacy Regulation from the perspective of the regulatory practice. http://rsw.beck.de/rsw/upload/ZD/ZD_Sonderveröffentlichung_Engeleer_Felber_engl..pdf (2017)
14. Pellikan, L.: Bundesregierung: ePrivacy-Verordnung kommt erst 2019. W&V of 22 November 2017, https://www.wuv.de/digital/bundesregierung_eprivacy_verordnung_kommt_erst_2019 (2017)
15. Ziegeldorf, J.H., Morchon, O.G., Wehrle, K.: Privacy in the Internet of Things: threats and challenges. Security and Communication Networks 7(12) (2014) 2728–2742
16. Smith, H.J., Milberg, S.J., Burke, S.J.: Information Privacy: Measuring Individuals' Concerns about Organizational Practices. MIS Quarterly 20(2) (1996) 167
17. Milne, G.R., Culnan, M.J.: Using the content of online privacy notices to inform public policy: A longitudinal analysis of the 1998–2001 U.S. web surveys. The Information Society 18(5) (2002) 345–359
18. Peslak, A.R.: Internet Privacy Policies. Information Resources Management Journal 18(1) (2005) 29–41
19. Agrawal, R., Grosky, W.I., Fotouhi, F.: Ranking Privacy Policy. In: IEEE 23rd International Conference on Data Engineering Workshop (2007) 192–197
20. Flesch, R.: A new readability yardstick. Journal of Applied Psychology 32(3) (1948) 221–233
21. Ranking Digital Rights: 2017 Corporate Accountability Index (2017)
22. Terms of Service; Didn't Read project: Website. https://tosdr.org/ (2017)
23. Zimmeck, S., Bellovin, S.M.: Privee: An architecture for automatically analyzing web privacy policies. In: Proceedings of the 23rd USENIX Security Symposium, August 20–22, 2014. USENIX Association (2014)
24. Zimmeck, S., Wang, Z., Zou, L., Iyengar, R., Liu, B., Schaub, F., Wilson, S., Sadeh, N., Bellovin, S.M., Reidenberg, J.: Automated Analysis of Privacy Requirements for Mobile Apps. In: NDSS'17: Network and Distributed System Security Symposium (2017)
25. WebpageFX: Readability Test Tool. https://www.webpagefx.com/tools/read-able/
26. Saaty, T.L.: What is the analytic hierarchy process? In: Mathematical Models for Decision Support. Springer (1988) 109–121
27. Gluck, J., Schaub, F., Friedman, A., Habib, H., Sadeh, N., Cranor, L.F., Agarwal, Y.: How Short Is Too Short? Implications of Length and Framing on the Effectiveness of Privacy Notices. In: Symposium on Usable Privacy and Security (SOUPS) (2016)
28. D'Agostino, R.B., Stephens, M.A., eds.: Goodness-of-Fit Techniques. 5th printing. Volume 68 of Statistics. Dekker, New York, NY (1986)
29. Hollander, M., Wolfe, D.A.: Nonparametric Statistical Methods (1999)
30. McCarthy, J.: TRUSTe Decides Its Own Fate Today – Slashdot (1999)
31. v. Leitner, F.: Das IoT-Problem. https://ptrace.fefe.de/iot (2017)
All websites have been last accessed on Jan. 15th, 2018.