A study of privacy policy enforcement in
access control models
Md. Moniruzzaman, Md. Sadek Ferdous and Roksana Hossain
Dept. of Computer Science, University of Calgary, Calgary, AB, Canada.
School of Computing, University of Kent, Canterbury, Kent, UK.
Dept of ECE, University of Western Ontario, London, ON, Canada.
mmoniruz@ucalgary.ca, m.s.ferdous@kent.ac.uk, rhossai2@uwo.ca
Abstract
The Internet has gained huge popularity over the last decade. It offers its users reliable, efficient and engaging online services. However, users reveal a great deal of their personal information by using these services. Websites that collect information state their data-handling practices in their privacy policies. However, it is difficult to verify whether these policies are properly enforced in practice. This can lead to unintentional leakage of private information to unauthorized parties and thus increases the chance that private data will be misused. Seeking redress through the legal system is expensive, time consuming and cannot fully compensate for the loss. Therefore, an effective way to protect privacy is to use the privacy policy itself to control data access. In this paper, we review some distinguished research works that address this problem. We also discuss the completeness of the privacy definition used in these works.
Keywords: Privacy, access control, database, policy
language, security.
I. INTRODUCTION
The Internet has revolutionized the way people see, do and perceive things in many aspects of their lives. The impact of this revolution has been regarded as largely positive, and people have embraced it readily. This is reflected in the tremendous growth in the number of Internet users over the last decade. The success of the Internet can be attributed to the gigantic number of websites and the wide range of services they offer. While most of these services are free, users disclose their private information by using them. Some information is required by a website to provide the service, some is collected automatically by monitoring the users' activities, and some is provided voluntarily by a user. Considering the amount and variety of personal information shared, Internet users face a considerable risk of privacy breaches. Though websites publish policies about how they will use the collected data, those policies stay in documents and do not play an active role when the data are actually used. In a bid to improve online privacy practices, the World Wide Web Consortium proposed a protocol called the Platform for Privacy Preferences Project (P3P) [1], [2]. It includes an XML based policy language that can be used by a website to encode its privacy policies. Users can also set up their privacy preferences in their web browsers. When a website is visited, the browser checks whether the site's practices and the user's preferences match. If they do not, the user is warned that the website is not compliant with the desired privacy settings. P3P has been well accepted in the online community. However, it does not solve the privacy problem, as it does not guarantee that a website will keep its promises in practice. Later, Agrawal et al. [3] proposed a theoretical database model, known as the Hippocratic database, that controls data access using the privacy policy. The Hippocratic database has set a new line of research that studies access control in databases using privacy policies.
With the previous discussion in mind, our contribution
in this paper is threefold. First, we present a summary of
the components of privacy that are enforceable by
computer systems in Section II of the paper. Secondly,
we classify the existing privacy preserving access
control models into four categories and review some of
the key works from each category in Section III. We
conclude in Section IV by comparing the reviewed
works against the privacy components described in
Section II. We also discuss possible directions for future work.
II. BACKGROUND CONCEPTS
A. Data provider, collector and user
Individuals who provide their personal information are
called data providers. Examples include customers,
patients, website users etc. Organizations that collect
and store information about individuals are called data
collectors. Data users are the people who use the
collected data. For instance, employees and business
partners of a data collector may use the data.
B. Privacy components
The definition of privacy varies slightly across the literature. Agrawal et al. specify ten principles that are necessary for protecting data privacy [3]. Barker et al. [4] summarized these principles into four components: purpose, visibility, granularity and retention. Obligation is another component that appears in privacy policies and is also used by many privacy preserving models [5], [6]. These five components are briefly summarized below.
Purpose: It is the reason or intention for using data.
Privacy of a data item is breached if it is used for a
purpose that does not exist in the privacy policy.
Visibility: It refers to the categories of data users who
can access data. Barker et al. identify four visibilities–
Data provider, Collector, Third party and World. Data
with visibility Data provider can be accessed by the
provider himself/herself. Data with visibility Collector
can only be accessed by data users inside the collecting
organization, while the visibility Third party lets the collector share data with external organizations. Finally, the
visibility World allows everybody to access data. Data
providers select visibility for their data.
Retention: How long a collector should keep data is
defined by data retention period. It can be in the form of
a date, time period or number of accesses. Data should
be removed from the system once its retention ends.
Granularity: It refers to the levels of detail for a data
item. Consider home address as data that can have these
granularities- full address, only city, only country, etc.
Obligation: It is a task to be performed by data users as
a responsibility for accessing data. Seeking parental
consent to use children’s data is a sample obligation.
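To make these components concrete, the following minimal sketch shows one way a privacy preference attached to a data item could be represented; the field names and example values are hypothetical and not taken from any of the reviewed models.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List, Optional

# Hypothetical representation of one privacy preference attached to a data item.
# Field names and example values are illustrative only.

@dataclass
class PrivacyPreference:
    purposes: List[str]              # purposes the provider allows, e.g. ["billing"]
    visibility: str                  # "provider", "collector", "third_party" or "world"
    granularity: str                 # level of detail released, e.g. "city_only"
    retention_until: Optional[date]  # date after which the item must be deleted
    obligations: List[str] = field(default_factory=list)  # tasks owed by data users

# Example: a customer's home address, usable for billing by the collector only,
# released at city granularity, kept until the end of 2025, with a notification duty.
address_pref = PrivacyPreference(
    purposes=["billing"],
    visibility="collector",
    granularity="city_only",
    retention_until=date(2025, 12, 31),
    obligations=["notify_provider_on_access"],
)

def retention_expired(pref: PrivacyPreference, today: date) -> bool:
    """A data item should be removed once its retention period has ended."""
    return pref.retention_until is not None and today > pref.retention_until

print(retention_expired(address_pref, date(2026, 1, 1)))  # True
```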
III. PRIVACY PRESERVING ACCESS
CONTROL MODELS
We divide the privacy preserving access control models
into four classes. It is important to note that most of the
privacy preserving models use or adopt an existing and
widely accepted access control model called role-based access control (RBAC) [7]. RBAC groups the
users in an organization and assigns roles related to their
job responsibilities. Permissions are then assigned to
roles that give users access to data items. However,
RBAC was not designed with privacy in mind and
therefore, it cannot express or enforce privacy policy.
A. Access control models using privacy
aware databases
The models of this class store privacy policies within a
database. Access to any data item should satisfy the
associated privacy policy first.
A.1 Extended relational database systems
(ERDBMS)
Agrawal et al. [8] propose an extension for relational
database systems using the idea of Hippocratic database.
During data collection, this model uses P3P to collect
the provider’s privacy preference. A policy translator module translates the privacy preferences, expressed in P3P syntax, into a schema called T1. The policy stored in T1
states what data can be used for which purposes and by
which recipients. In this model, data users and recipients
are two different entities. A recipient can be the data
collecting organization itself or any of its partner
organizations. Schema T1 also stores the information
regarding a data provider’s opt out information for a
particular purpose and recipient.
Based on the privacy policy in T1, a special constraint called a restriction is created, which is then tied to a data item. Access to the data item is allowed when the attached constraint is satisfied. The authors provide a language for writing restrictions, shown in Fig. 1. The language expresses that users in the list authors_1, except the users in authors_2, are allowed to access table_T. Using options
like to columns, to rows and to cells, the constraint can
be applied to a column, row or even to a cell of a table.
The purpose and recipient information are drawn from
the privacy policy stored in T1. Finally, the constraint
may specify which action or combination of actions is
allowed on data.
Data users are given privileges using SQL authorization statements (e.g., grant). To be compliant with the extended model that uses restrictions, each grant statement may include additional parameters like purpose and recipient. When a data query is submitted along with the purpose and recipient, an algorithm checks whether there is a restriction for each table referenced in the query and then replaces the table reference with a dynamic view. The dynamic view is created on the fly based on the restriction related to the table. Details of the view-building algorithm are not included here. Finally, the output of the view works as the data source for the query. As the view reveals only the data allowed by the privacy policy, the authors claim that the data provider’s privacy is protected.
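The paper only sketches the view-rewriting algorithm, so the fragment below merely imitates its effect in Python: rows whose stored policy (assumed here to be joined from schema T1) does not permit the submitted purpose-recipient pair are filtered out, as a dynamic view would do. Table contents and field names are invented for illustration.

```python
# Rough imitation of restriction-based filtering. Each row carries the policy
# drawn from schema T1 as (allowed purposes, allowed recipients, opt-outs);
# the data and the field names are invented for this sketch.

customers = [
    {"id": 1, "email": "a@example.com",
     "purposes": {"billing", "promotion"}, "recipients": {"collector"},
     "opt_outs": {("promotion", "collector")}},
    {"id": 2, "email": "b@example.com",
     "purposes": {"billing"}, "recipients": {"collector", "partner"},
     "opt_outs": set()},
]

def dynamic_view(rows, purpose, recipient):
    """Return only the rows whose privacy policy permits this purpose/recipient
    pair and for which the provider has not opted out."""
    return [r for r in rows
            if purpose in r["purposes"]
            and recipient in r["recipients"]
            and (purpose, recipient) not in r["opt_outs"]]

# A query submitted for purpose "promotion" by recipient "collector" sees no rows:
# customer 1 opted out of that pair and customer 2 never allowed promotion.
print(dynamic_view(customers, "promotion", "collector"))  # []
print(dynamic_view(customers, "billing", "collector"))    # both rows
```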
The model allows the data provider to specify the
purpose for which data can be used. This constraint will
prohibit illegitimate use of data since it can only be
accessed for the right purpose. In addition to purpose,
the data provider can specify the data recipient i.e., who
will use the data. For a particular purpose-recipient pair,
a data provider can opt out and then that recipient
cannot use the data for that purpose. This model does
not investigate privacy components like retention,
granularity and obligation.
A.2 View based privacy protection (Microview)
Byun and Bertino [9] present a privacy preserving
model that creates different views of a data item based
on the data provider’s privacy preference. Here, the
concept of view is identical to the privacy component
granularity. The model assumes that, during data collection, a data provider mentions the purpose for using data and the level of privacy for that use. Three levels of privacy are considered: Low (L), Medium (M) and High (H). For a given purpose, the Low level indicates that the provider is not really concerned about using the data for that purpose, the Medium level indicates a moderate concern, and the High level indicates that the provider is very concerned about such usage of the data. For each level of privacy, a different version of a data item is created. A sample data model is presented in Table I.
Create restriction name_of_restriction
    on table_T
    for authors_1 [except authors_2]
    restricting access to (all | (select | update | insert | delete)+)
    (( (to columns list_of_columns) | (to rows [where searching_condition]) | (to cells (list_of_columns [where searching_condition])+) )
        [for purpose list_of_purposes]
        [for recipient list_of_recipients])+

Fig. 1. Language constructs [8]

Table I. Different views of data for privacy preferences (privacy levels L / M / H)

    Customer ID: 1010
    Name:        John Smith (L) / John S. (M) / J.S. (H)
    Address:     6 Essex Street, Toronto, ON (L) / Toronto, ON (M) / ON (H)
    Age:         40 years (L) / 35-45 years (M) / >30 years (H)
    Preferred levels (Name, Address, Age): P_contact = {L, L, H}, P_promotion = {M, L, M}
The table includes the customer’s preferred privacy levels for the purposes P_contact and P_promotion.
The authors of this model suggest adopting any existing
access control model for authorization management.
Regardless of the authorization model, a data user
should submit an access purpose for querying the
database. Query output depends on the preferred privacy
level of the data provider for that access purpose. For
instance, if a query runs on Table I with purpose
P_promotion, it will get the output <1010, John S., <6
Essex Street, Toronto,ON>, 35-45 years>.
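Following the Table I example, a minimal sketch of the micro-view idea is given below; the dictionary layout and the query function are assumptions made for illustration, not the authors' design.

```python
# Each attribute is stored in three versions (L, M, H), and the provider picks a
# preferred privacy level per purpose. Values follow the Table I example; the
# data layout is an assumption of this sketch.

versions = {
    "name":    {"L": "John Smith",                  "M": "John S.",     "H": "J.S."},
    "address": {"L": "6 Essex Street, Toronto, ON", "M": "Toronto, ON", "H": "ON"},
    "age":     {"L": "40 years",                    "M": "35-45 years", "H": ">30 years"},
}

# Preferred privacy level per attribute, for each purpose.
preferences = {
    "P_contact":   {"name": "L", "address": "L", "age": "H"},
    "P_promotion": {"name": "M", "address": "L", "age": "M"},
}

def query(customer_id, access_purpose):
    """Return the view of the record matching the provider's preferred level
    for the submitted access purpose."""
    levels = preferences[access_purpose]
    return (customer_id,) + tuple(versions[attr][levels[attr]]
                                  for attr in ("name", "address", "age"))

print(query(1010, "P_promotion"))
# (1010, 'John S.', '6 Essex Street, Toronto, ON', '35-45 years')
```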
A.3 Purpose based access control for RDBMS
(PBAC)
Byun and Li propose a purpose-based access control
model for relational database systems [11]. They
consider purpose as the only privacy information given
by a data provider. Purpose is stored with data in the
same table and acts as the intended purpose for using
the data.
The authors propose four types of purpose labeling of
data – element based, tuple based, attribute based and
relation based labeling. In the element-based labeling,
each data element contains an intended purpose. In tuple
based labeling, each record in a table has an intended
purpose and access to any data element of that record is
controlled by that purpose. In attribute based labeling,
there is one purpose for each column of a table. In the relation based approach, there is only one intended
purpose for the entire table.
PBAC adopts a role based approach for authorizing data
users where a data user is given a role. Instead of
assigning permissions to a role, purposes are assigned to
a role. Later, users use these purposes to request data.
An access request for a data item is granted if the access purpose matches the intended purpose of the data. In PBAC,
role-purpose assignment is conditional where users of a
role should satisfy the condition to use the purpose. For
example, (r, Admin, UserID=123 AND
time=[9am:5pm]) is a role-purpose assignment that
states any user with the role r can use the purpose
Admin if the user’s ID is 123 and time of data access is
between 9am and 5pm. PBAC organizes purpose into a
tree structure that expresses a partial ordering of
purposes. The model considers both positive and
negative notions of purpose. If a data item has a negative intended purpose, then the data can be accessed for any purpose except the intended one. As described, PBAC considers purpose as the only privacy component.
However, purpose management of this model is
insightful and would set a good direction for future work. Another limitation is that the access control is mostly driven by the privacy policy.
Once a data user is authorized for a purpose, he/she can
access all data having the same intended purpose.
However, an organization may want to limit its users’
privileges to a subset of the data allowed by the privacy
policy.
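The purpose-compliance check at the heart of PBAC can be sketched roughly as follows; the purpose tree, the treatment of negative purposes and the conditional role-purpose assignment below are simplified illustrations, not the authors' formal definitions.

```python
# Hypothetical purpose tree: an access purpose complies with a positive intended
# purpose if it equals that purpose or one of its descendants; a negative
# intended purpose excludes that subtree instead.

purpose_tree = {
    "GeneralPurpose":      ["Admin", "Marketing"],
    "Marketing":           ["DirectMarketing", "ThirdPartyMarketing"],
    "Admin":               [],
    "DirectMarketing":     [],
    "ThirdPartyMarketing": [],
}

def descendants(purpose):
    """All purposes implied by a node of the tree, including itself."""
    result = {purpose}
    for child in purpose_tree.get(purpose, []):
        result |= descendants(child)
    return result

def complies(access_purpose, intended_purpose, negative=False):
    allowed = descendants(intended_purpose)
    return access_purpose not in allowed if negative else access_purpose in allowed

# Conditional role-purpose assignment, following the example
# (r, Admin, UserID=123 AND time=[9am:5pm]).
def may_use_purpose(user_id, hour, role):
    return role == "r" and user_id == 123 and 9 <= hour <= 17

print(complies("DirectMarketing", "Marketing"))                 # True
print(complies("DirectMarketing", "Marketing", negative=True))  # False
print(may_use_purpose(123, 10, "r"))                            # True
```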
B. Access control models based on RBAC
Many privacy preserving models use RBAC. These
models fall into two categories- the ones that extend
RBAC and the ones that use RBAC as an add-on. We
describe a model of the first category here. It modifies
RBAC to express privacy policy.
Ni et al. propose a family of role-based privacy
preserving access control models called P-RBAC [5],
[10], [12]-[13]. The authors first propose a basic model.
The subsequent models are built on it by adding new
features. We can classify these models into these
categories – privilege model, hierarchical model,
condition model and obligation model. We summarize
all the models by starting with the basic model and then
discussing key features of the other models.
In the typical role-based access control model, an access
privilege consists of data and action. In P-RBAC,
privilege has the form (data d, action a, purpose p,
condition c, obligation ob) where purpose, condition
and obligation are the privacy requirements. This type
of privilege expresses a privacy-aware access control
policy stating that in order to perform the action a on
the data d, the intended purpose should be p while
satisfying the condition c and obligation ob. Similar to
the RBAC model, permission is assigned to a role which
in turn is assigned to a data user. A data user should
satisfy the privacy requirements in order to get access.
The condition used in a permission is a Boolean expression that verifies the values of context variables. Context variables store relevant information necessary to validate a privacy-aware permission. Commonly stored information includes parental consent and the data provider's consent. An obligation is usually a function that should be performed before or after accessing data.
Consider a privacy policy of the portal www.toys.com that sells toys: “From time to time, you may receive
periodic mailings, telephone calls or e-mails from "R"
Us Family members with information on products or
services, discounts, special promotions, upcoming
events or other offers from an "R" Us Family member
or its marketing partners. You may opt out of receiving
e-mail communications by clicking the link at the
bottom of the e-mail received” [14].
In short, the policy allows the website’s employees,
termed as “R” Us family members, to access the
customers’ contact information for sending promotional
offers. To express this policy using P-RBAC, let the
role of an employee be MR, data be Email address,
purpose be Promo and OwnerConsent be a context
variable that stores the consent of a data provider. The
role-permission assignment is (MR, <Email address,
Promo, OwnerConsent !=NO, Ø>). This permission
does not require any obligation.
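Using the toys.com permission just shown, the check a P-RBAC enforcement point would perform can be sketched as follows; the action read, the dictionary encoding and the evaluation procedure are assumptions of this sketch rather than the authors' formal semantics.

```python
# Approximate check of a P-RBAC style permission (data, action, purpose,
# condition, obligation). The action "read" and the dictionary layout are
# assumptions of this sketch; the context variable follows the text's example.

permission = {
    "role": "MR",
    "data": "Email address",
    "action": "read",
    "purpose": "Promo",
    "condition": lambda ctx: ctx.get("OwnerConsent") != "NO",
    "obligation": None,            # the toys.com permission carries no obligation
}

def check(request, context, perm=permission):
    """Grant the request if it names the permitted data, action and purpose and
    the condition over the context variables holds; return any obligation owed."""
    if (request["role"], request["data"], request["action"], request["purpose"]) != \
       (perm["role"], perm["data"], perm["action"], perm["purpose"]):
        return False, None
    if not perm["condition"](context):
        return False, None
    return True, perm["obligation"]

req = {"role": "MR", "data": "Email address", "action": "read", "purpose": "Promo"}
print(check(req, {"OwnerConsent": "YES"}))   # (True, None)
print(check(req, {"OwnerConsent": "NO"}))    # (False, None)
```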
In the basic model, condition is written using relational
operators and context variables. In a later work [12], the
authors propose a more expressive language for writing conditions. The language introduces a hierarchical operator that is used to express the relation between two roles. It also supports combining two or more logical expressions using the logical connectors & and |.
In [5], [13], Ni et al. provide a detailed framework for expressing obligation policies in P-RBAC, which consists
of general constraint, user, action, object and temporal
constraint. Here, the general constraint is a Boolean
expression, user is the person performing the obligation,
temporal constraint stores the activation time of
obligation with respect to data access and finally, object
may include users and temporal constraints. Consider an imaginary privacy policy of a website that collects its visitors’ information: “Before using children’s data, we take consent from the respective parents”. If we consider Home address as data, Marketing as purpose and parentalConsent and ownerAge13plus as context variables, the privacy policy can be expressed by the
following permission.
<read, HomeAddress, Marketing,
ownerAge13plus=”Yes” AND parentalConsent=”Yes”,
(parentalConsent=No, self, (takeConsent(), pi), tc)>
The last component of the privilege is the obligation.
The privilege states that to read the home address of the
data providers who are below 13 years of age, a user
should ask (takeConsent()) the respective parents (pi)
for their consent if it is not taken yet
(parentalConsent=No). In the obligation, self denotes that the user exercising the privilege and the user performing the obligation should be the same. The temporal constraint tc typically consists of a start and an end time that say when an obligation should be performed. The constraint might also specify when a user should reattempt the obligation if there is no reply from the target object.
As a privacy preserving model, P-RBAC considers only
purpose and obligation. In P-RBAC, the authorization
rules or privileges are created from the privacy policy.
However, the authors did not present any formal model for the privacy policy except the policy in natural language. This gap leaves it unclear how the policy is consulted when creating new privileges. Another work of
this class is proposed in [6].
C. Access control models based on
transactions/workflows
In traditional access control models like RBAC, there is
no control over the order of data accesses. In workflows and transactions, on the other hand, the set of data accesses needed to complete an objective is predetermined. Data accesses are also ordered with respect to each other. This section presents research works that study privacy enforcement in this type of system.
C.1 Transaction based PBAC (TPBAC)
Yang et al. extend the purpose based access control model (PBAC) in [15]. We have described PBAC in Section A.3. Originally, PBAC is based on relational
database systems. Yang et al. propose a privacy aware
data model that is independent of any database system.
In the data model, data is categorized into several
classes like registration data, contact data, treatment
data, etc. Privacy metadata is associated with each data
class instead of data itself. Privacy metadata includes
purpose and retention.
The authorization model is similar to the original PBAC
where purpose is conditionally assigned to a role. Yang et al. constrain data access through transactions. They define the set of necessary data accesses to fulfill each
purpose and the transactions that would contain those
accesses. For example, <Marketing, Contact data,
T#21> expresses that one of the necessary data accesses
to fulfill the purpose Marketing is to access data of type
Contact data through the transaction T#21. In addition
to purpose, the model includes retention. However, it
does not provide any enforcement framework that
removes data when retention expires.
As discussed before, one limitation of the original
PBAC is that it does not support complex access control
policies. Yang et al. address this problem by specifying
necessary accesses for each purpose. In their model, a
data user is authorized for a purpose. Users can access a data item if the data is listed in the set of necessary data accesses for their assigned purpose. Finally, the assigned
purpose should equal the intended purpose of the data.
Therefore, access control is fine-grained.
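The resulting check can be sketched as follows; apart from the <Marketing, Contact data, T#21> triple quoted above, the entries, the intended-purpose table and the user-purpose assignment are invented for illustration.

```python
# Necessary data accesses per purpose, as (purpose, data class, transaction)
# triples, following the <Marketing, Contact data, T#21> example. The other
# entries and tables are invented for this sketch.

necessary_accesses = {
    ("Marketing", "Contact data", "T#21"),
    ("Treatment", "Treatment data", "T#05"),
}

intended_purpose = {             # privacy metadata attached to each data class
    "Contact data": "Marketing",
    "Treatment data": "Treatment",
}

user_purposes = {"alice": {"Marketing"}}   # purposes conditionally assigned to users

def access_allowed(user, purpose, data_class, transaction):
    """The user must hold the purpose, the access must be one of the necessary
    accesses for that purpose, and the purpose must match the data's intended one."""
    return (purpose in user_purposes.get(user, set())
            and (purpose, data_class, transaction) in necessary_accesses
            and intended_purpose.get(data_class) == purpose)

print(access_allowed("alice", "Marketing", "Contact data", "T#21"))  # True
print(access_allowed("alice", "Marketing", "Contact data", "T#99"))  # False (wrong transaction)
```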
C.2 Purpose management using workflows
(WPM)
In privacy preserving access control, a user has to
submit a purpose for accessing data. Some models (e.g.,
[16]) trust users and accept whatever purpose they
provide while some models [11], [15] authorize users to
certain purposes that they can use. Jafari et al. [17]
propose using workflows to govern data access, where the access purpose is determined by the context of the workflow that issues the access request.
In a hospital, when patients are admitted for treatment, a
workflow starts. Tasks in the workflow require access to
patients’ data. Therefore, an access request issued from the workflow will carry treatment as the access purpose, which comes from the context of the workflow. Since a
workflow can have sub-workflows, a purpose can have
sub-purposes or more specific purposes. For instance,
the purpose treatment can have sub purposes like
examination and prescription.
Once the access purpose is determined, the authors suggest using one of the existing approaches [10], [11] to check whether the privacy policy allows such a purpose for using the data. The main focus of this model is how to
manage purpose at the enterprise level. Therefore, we
assume that the model considers purpose as the only
privacy component.
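A small sketch of the workflow-to-purpose mapping is given below; the workflow names and the purpose hierarchy follow the hospital example but are otherwise invented.

```python
# Invented workflow hierarchy following the hospital example: the access purpose
# is derived from the workflow (or sub-workflow) issuing the request, so data
# users do not state a purpose themselves.

workflow_purpose = {
    "admission_workflow": "treatment",
    "examination_step":   "examination",    # sub-workflow of admission_workflow
    "prescription_step":  "prescription",   # sub-workflow of admission_workflow
}

parent_purpose = {                # sub-purposes refine a more general purpose
    "examination":  "treatment",
    "prescription": "treatment",
}

def access_purpose(workflow):
    """Purpose attached to an access request issued from this workflow."""
    return workflow_purpose[workflow]

def generalizations(purpose):
    """The purpose together with its ancestors, for checking against a policy
    stated at a coarser level (e.g. 'treatment')."""
    chain = [purpose]
    while purpose in parent_purpose:
        purpose = parent_purpose[purpose]
        chain.append(purpose)
    return chain

p = access_purpose("examination_step")
print(p, generalizations(p))   # examination ['examination', 'treatment']
```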
D. Access control models using XML
based policy language
Access control models of this class include an XML
based policy language. One interesting property of this
language is its extensibility to suit the requirements of
new policies.
D.1 eXtensible Access Control Markup
Language (XACML)
XACML defines a general-purpose access control
system [18]. It offers an XML based policy language for
specifying the requirements to access a protected
resource. The language uses a set of attributes that
encode different properties of subject (data user),
resource (data), action and environment. The semantics
of the attributes are defined in an XML schema file.
In XACML, the smallest unit of an access control policy is called a rule, which consists of a target, condition and response. The target, which determines whether a rule applies to an access request, contains subjects, resources and actions. The condition of a rule is a function that checks additional properties of the resource requestor and its environment. After evaluating an access request, a rule outputs one of these decisions: permit, deny, not applicable and indeterminate (which indicates an error generated during evaluation). The decision may include obligations, which
are additional tasks a data user should perform.
The privacy profile of XACML, an extension of the policy language [19], includes two more attributes: resource purpose, which is the intended purpose for using a data item, and action purpose, which is the purpose submitted by a data user as the reason for using the data. An access request is allowed when these two purposes match. The privacy profile also suggests a deny-overrides algorithm for the case when more than one rule applies to an access request and one of them denies it. In addition to the
policy language, XACML includes a policy processing
model that interprets policies in the relevant
application’s context.
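The decision flow described above can be imitated with a short sketch; real XACML policies are written in XML and the privacy profile's purpose check is itself expressed as a rule, so the in-memory rule format, the attribute names other than the two purposes, and the folding of the purpose check into the evaluator are all simplifications assumed here.

```python
# Simplified, non-XML imitation of XACML rule evaluation with the privacy
# profile's purpose attributes and a deny-overrides combining algorithm.
# Rule contents and attribute names other than the two purposes are invented.

PERMIT, DENY = "Permit", "Deny"
NOT_APPLICABLE, INDETERMINATE = "NotApplicable", "Indeterminate"

def evaluate_rule(rule, request):
    # Target: does the rule apply to this subject/resource/action at all?
    if any(request.get(k) != v for k, v in rule["target"].items()):
        return NOT_APPLICABLE
    try:
        # Privacy profile: the action purpose must match the resource purpose;
        # a mismatch leads to denial of the request.
        if request["action_purpose"] != request["resource_purpose"]:
            return DENY
        # A false condition makes the rule not applicable to this request.
        return rule["effect"] if rule["condition"](request) else NOT_APPLICABLE
    except Exception:
        return INDETERMINATE   # error raised while evaluating the rule

def deny_overrides(rules, request):
    decisions = [evaluate_rule(r, request) for r in rules]
    if DENY in decisions:
        return DENY
    if PERMIT in decisions:
        return PERMIT
    return INDETERMINATE if INDETERMINATE in decisions else NOT_APPLICABLE

rules = [
    {"target": {"subject": "analyst", "resource": "contact_data", "action": "read"},
     "condition": lambda req: req["consent"] == "yes",
     "effect": PERMIT},
]

request = {"subject": "analyst", "resource": "contact_data", "action": "read",
           "resource_purpose": "marketing", "action_purpose": "marketing",
           "consent": "yes"}
print(deny_overrides(rules, request))   # Permit
```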
XACML has several limitations. The policy language is complex and verbose; encoding even a simple policy requires many lines of code. It also considers purpose as
the only privacy requirement. Since the policy language
can be extended, privacy components like visibility and
retention can be applied by defining them as attributes
of the user and data respectively. As granularity needs a
functional implementation, it would be challenging to
apply it in XACML.
D.2 Enterprise Privacy Authorization Language
(EPAL)
IBM proposed a language called EPAL [20] to encode
privacy policies. An EPAL policy is more like an access
control rule or permission that can be enforced by an
enforcement engine. Similar to XACML, the language
is based on XML and uses a set of attributes called
vocabularies. The semantics of the vocabularies are
defined for the application domain in a schema file. Key
components of an EPAL policy are target, condition and
obligation. The target consists of user categories, data
categories, purpose and action. EPAL divides users and
data into different categories. Examples of user
categories are doctor, nurse etc. Examples of data
categories include registration data, treatment data, etc.
Action in the policy is the task that is performed on data
(e.g., read, write, update etc). Inclusion of purpose
makes the target privacy aware. The condition tests contextual and privacy-related information (e.g., whether a data provider has opted out of the policy). If applicable to an access request, a rule gives one of three rulings: allow, deny and not applicable. The ruling may also be accompanied by
obligations.
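An EPAL ruling can be pictured with a similar sketch; the user and data categories, the opt-out condition and the obligations below are invented, and the rule matching shown here is a simplification of EPAL's actual precedence-based evaluation.

```python
# Rough imitation of an EPAL ruling: a rule applies when the request matches the
# target (user category, data category, purpose, action); the condition can test
# privacy-related context such as an opt-out; a matching rule yields allow or
# deny together with any obligations. Categories and rules are invented.

rules = [
    {"user_category": "nurse", "data_category": "treatment_data",
     "purpose": "treatment", "action": "read",
     "condition": lambda ctx: not ctx.get("opted_out", False),
     "ruling": "allow",
     "obligations": ["log_access", "delete_after_30_days"]},
]

def evaluate(request, context):
    for rule in rules:
        if all(request[k] == rule[k]
               for k in ("user_category", "data_category", "purpose", "action")):
            if rule["condition"](context):
                return rule["ruling"], rule["obligations"]
    return "not-applicable", []

req = {"user_category": "nurse", "data_category": "treatment_data",
       "purpose": "treatment", "action": "read"}
print(evaluate(req, {"opted_out": False}))  # ('allow', ['log_access', 'delete_after_30_days'])
print(evaluate(req, {"opted_out": True}))   # ('not-applicable', [])
```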
As described, the privacy components in EPAL include
purpose and obligation. One limitation is that EPAL
does not provide any formal framework to express and
enforce obligation policies. The model also does not elaborate the concept of role hierarchy, which is a limitation for encoding enterprise policies.
IV. CONCLUSION
In this paper, we reviewed access control models that
can be used to enforce data privacy. Privacy
components used in different access control models are
listed in Table II. It is interesting to note that no model
uses all the components of our privacy definition.
However, all of them use purpose, since privacy policies tend to focus on it. Ideas like retention and obligation are becoming popular. We believe that many research works in the near future will bring more insight into these ideas. Using different versions/granularities of
data is common in data publishing [21], though it is not
very popular in access control. The reason is the
overhead of creating different data versions on-the-fly.
Data publishing is done offline and therefore, the
overhead does not make any significant difference.
However, instant system response is the expected
performance in access control. So the same overhead
may severely affect the system performance.

Table II. Checklist of privacy components in privacy preserving access control models:

    Purpose:     considered by all reviewed models (ERDBMS [8], Microview [9], PBAC [11], P-RBAC [10], TPBAC [15], WPM [17], XACML [18], EPAL [20]);
    Visibility:  considered only by ERDBMS [8];
    Granularity: considered only by Microview [9];
    Retention:   considered only by TPBAC [15];
    Obligation:  considered only by P-RBAC [10], XACML [18] and EPAL [20].

Further
research work can bring more insight on purpose
interpretation at the enterprise level. With little research
done so far, how a system should verify the purpose
submitted by a data user is still an open question.
Retention is treated in the literature as a date in the
system calendar. In practice, retention can be a date, a duration, a number of data accesses, or an event. Consider a
hospital’s privacy policy stating that patients’ data
should be deleted after they are discharged from the
hospital. In this case, the retention of patients’ data is an
event. Further work is necessary to support all types of retention. Another future direction is the application
of game theory to better understand the privacy
negotiation among data providers, collectors and third
parties. In game theory [22], each player has a numeric
objective and players try to choose an action that
maximizes their objective. The situation of interest in
game theory is the equilibrium when players are
unlikely to change their behaviors. A data provider, data
collector and third party can be cast as players in a game where the provider tries to obtain as many services as possible while exposing minimal data, and the others try to maximize their utility by obtaining more data while providing minimal services. Most privacy preserving models do not consider hierarchies for privacy components. Data, purpose, retention, granularity and visibility can be organized into hierarchies, which would make the models more expressive. Ghazinour et al. [23] have used a lattice structure to organize purpose, retention, granularity and visibility. However, they did not consider any such organization for data. To build a hierarchy, one should
distinguish between types and the entities belonging to those types. For data, the definitions of type and entity are intuitive. For example, demographic information is a type and a customer’s country is an entity of this type. This entity cannot be further divided into subclasses. For
other privacy components like purpose, this distinction
is not clear. One possible future work is to build a
model that organizes all privacy components into
hierarchies by considering their different characteristics.
REFERENCES
[1] W3 Consortium. Platform for privacy preferences
(P3P) project. Available: http://www.w3.org/P3P/.
[2] L. Cranor, “Web Privacy with P3P,” O’Reilly &
Associates, September, 2002.
[3] R. Agrawal et al., “Hippocratic databases,” in Proc.
of 28th VLDB, 2002, pp. 143–154.
[4] K. Barker et al., “A data privacy taxonomy,” in
Proc. of 26th BNCOD, 2009, pp. 42-54.
[5] Q. Ni, E. Bertino & J. Lobo,“An Obligation Model
Bridging Access Control Policies and Privacy Policies,”
in Proc. of ACM SACMAT,2008,pp.133-142.
[6] A. Masoumzadeh & J.B.D. Joshi, “PuRBAC:
Purpose-aware role-based access control,” in On the
Move to Meaningful Internet Systems,Part
II,2008,pp.1104-1121.
[7] R. Sandhu, D. Ferraiolo & R. Kuhn, “The NIST
model for role-based access control: Towards a unified
standard,” in Proc. of 5th ACM Workshop on RBAC,
2000, pp. 47–63.
[8] R. Agrawal et al., “Extending relational database
systems to automatically enforce privacy policies,” in
Proc. of 21st ICDE, 2005.
[9] J.W. Byun & E. Bertino, “Micro-views, or on how
to protect privacy while enhancing data usability:
concepts and challenges,” SIGMOD Record, Vol. 35,
No. 1, 2006, pp. 9-13.
[10] Q. Ni et al., “Privacy-aware role based access
control,” in Proc. of 12th SACMAT, 2007, pp.41-50.
[11] J. W. Byun & N. Li, “Purpose based access control
for privacy protection in relational database systems,”
Int. J. on VLDB, Vol. 17, Issue 4, 2008, pp. 603–619.
[12] Q. Ni et al., “Conditional privacy aware role based
access control,” in Proc. of 12th European Symp. on
Research In Computer Security,2007, pp.72-89.
[13] Q. Ni, E. Bertino & J. Lobo,“Privacy-aware RBAC
- Leveraging RBAC for Privacy,” in IEEE Security &
Privacy Magazine,Vol. 7,No. 4,2009,pp.35-43.
[14] TOYS.COM. Privacy policy. Available:
http://www.toys.com/privacy.html.
[15] N. Yang, H. Barringer & N. Zhang “A purpose-
based access control model,” in Proc. of 3rd Int. Symp.
on Inform. Assurance. & Security,2007,pp. 143–148.
[16] M. Jawad, P.S. Alvaredo, & P. Valduriez, “Design
of PriServ, a privacy service for DHTs,” in Int.
Workshop on Privacy & Anonymity in the Inform.
Soc.,2008,pp.21-26.
[17] M. Jafari, R. Safavi-Naini & N.P. Sheppard
“Enforcing purpose of use via workflows,” in Proc. of
8th ACM Workshop on Privacy in the Elect.
Soc.,2009,pp.113-116.
[18] OASIS. XACML. Available: http://www.oasis-
open.org/committees/tc_home.php?wg_abbrev=xacml
[19] OASIS. XACML’s Privacy profile. Available:
http://www.oasis-
open.org/committees/document.php?document_id=3764
3&wg_abbrev=xacml
[20] P. Ashley et al.,“Enterprise Privacy Authorization
Language (EPAL),” Research Report RZ 3485, IBM
Research, 2003.
[21] L. Sweeney, “K-anonymity: A model for protecting
privacy,” Int. J. on Uncertainty, Fuzziness, &
Knowledge-based Syst., Vol.10,No.5,2002.
[22] R. Gibbons, “Game theory for applied economists,”
Princeton University Press, 1992.
[23] K. Ghazinour, M. Majedi, & K. Barker,"A Lattice-
based Privacy Aware Access Control Model," in Proc.
of IEEE Int. Conf. on Inform. Privacy, Security, Risk &
Trust,2009,pp. 154-159.
One of the main privacy concerns of users when submitting their data to an organization is that their data will be used only for the specified purposes. Although privacy policies can specify the purpose, enforcing such policies remains a challenge. In this paper we propose an approach to enforcing purpose in access control systems that uses workflows. The intuition behind this approach is that purpose of access can be inferred, and hence associated with, the workflow in which the access takes place. We thus propose to encode purposes as properties of workflows used by organizations and show how this can be implemented. The approach is more general than other known approaches to purpose-based enforcement, and can be used to implement them. We argue the advantages of the new approach in terms of accuracy and expressiveness.