access control models
Md. Moniruzzaman, Md. Sadek Ferdous† and Roksana Hossain‡
Dept. of Computer Science, University of Calgary, Calgary, AB, Canada.
†School of Computing, University of Kent, Canterbury, Kent, UK.
‡ Dept of ECE, University of Western Ontario, London, ON, Canada.
email@example.com, †firstname.lastname@example.org, ‡email@example.com
Abstract: The Internet has gained huge popularity over
the last decade. It offers its users reliable, efficient and exciting
online services. However, the users reveal a lot of their
personal information by using these services. Websites
that collect information state their practices with data in
their privacy policies. However, it is difficult to verify
whether the policies are enforced properly in practice.
This can lead to unintentional leakage of private
information to unauthorized parties and thus increase
the chance of private data being misused. Seeking legal
remedies is expensive, time consuming and cannot
compensate the loss completely. Therefore, an
alternative is to enforce privacy policies within the
system itself by controlling data access. In this paper, we review some
distinguished research works that address this problem.
We also discuss the completeness of the privacy
definition used in these works.
Keywords: Privacy, access control, database, policy
I. INTRODUCTION
The Internet has revolutionized the way people see, do
and perceive things in many aspects of their lives. The
impact of this revolution has been largely positive, and
people have embraced it readily. This is indicated by the
tremendous growth rate of Internet users over the last
decade. The success of the Internet can be attributed to
the gigantic number of websites and the wide range of
services they offer. While most of these services are
free, users disclose their private information by using
them. Some information is required by a website to
provide the service, some are collected automatically by
monitoring the users’ activities and some are provided
voluntarily by a user. Considering the amount and
variety of personal information shared, Internet users
face a high risk of privacy breach. Though the
websites publish their policies about how they would
use the collected data, those policies stay in documents
and do not play an active role when data are actually
used. In a bid to improve the online privacy practice,
the World Wide Web Consortium (W3C) proposed a
protocol called the Platform for Privacy Preferences
Project (P3P). It includes an XML based policy language that
can be used by a website to encode their privacy
policies. Users can also set up their privacy preferences
in the internet browsers. When a website is visited, the
browser checks if the practices and preferences match.
If they do not, the user is warned that the website is not
compliant with the desired privacy settings. P3P has
been well accepted in the online community. However,
it does not solve the privacy problem as it does not
guarantee that a website will maintain its promises in
practice. Later, Agrawal et al. proposed a theoretical
database model, known as the Hippocratic database, that
makes the database responsible for the privacy of the
data it manages. The Hippocratic database has set a new
line of research that studies how privacy policies can be
enforced within data management systems.
With the previous discussion in mind, our contribution
in this paper is threefold. First, we present a summary of
the components of privacy that are enforceable by
computer systems in Section II of the paper. Secondly,
we classify the existing privacy preserving access
control models into four categories and review some of
the key works from each category in Section III. We
conclude in Section IV by comparing the reviewed
works against the privacy components described in
Section II. We also discuss possible future works.
II. BACKGROUND CONCEPTS
A. Data provider, collector and user
Individuals who provide their personal information are
called data providers. Examples include customers,
patients, website users etc. Organizations that collect
and store information about individuals are called data
collectors. Data users are the people who use the
collected data. For instance, employees and business
partners of a data collector may use the data.
B. Privacy components
The definition of privacy varies slightly across the
literature. Agrawal et al. specify ten principles that are
necessary for protecting data privacy. Barker et al.
summarized these principles into four components:
purpose, visibility, granularity and retention. Obligation
is another component that exists in privacy policies and
is also used by many privacy preserving models.
These five components are briefly summarized below.
Purpose: It is the reason or intention for using data.
The privacy of a data item is breached if it is used for a
purpose other than the intended one.
Visibility: It refers to the categories of data users who
can access data. Barker et al. identify four visibilities–
Data provider, Collector, Third party and World. Data
with visibility Data provider can be accessed by the
provider himself/herself. Data with visibility Collector
can only be accessed by data users inside the collecting
organization, while visibility Third party lets the
collector share data with external organizations. Finally, the
visibility World allows everybody to access data. Data
providers select visibility for their data.
Retention: How long a collector should keep data is
defined by data retention period. It can be in the form of
a date, time period or number of accesses. Data should
be removed from the system once its retention ends.
Granularity: It refers to the levels of detail for a data
item. Consider home address as data that can have these
granularities- full address, only city, only country, etc.
Obligation: It is a task to be performed by data users as
a responsibility for accessing data. Seeking parental
consent to use children’s data is a sample obligation.
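These five components can be made concrete in code. The following minimal sketch (all names, and the day-based encoding of retention, are illustrative assumptions rather than part of any cited model) bundles the components into one record and shows a visibility check over the four categories identified by Barker et al.:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class PrivacyPreference:
    purpose: str          # reason the data may be used, e.g. "marketing"
    visibility: str       # "provider" | "collector" | "third-party" | "world"
    granularity: str      # level of detail, e.g. "full-address" or "city-only"
    retention_days: int   # how long the collector may keep the data
    obligation: Optional[str] = None  # task owed in return for access

# Wider visibilities include narrower requester categories.
VISIBILITY_RANK = {"provider": 0, "collector": 1, "third-party": 2, "world": 3}

def visible_to(pref: PrivacyPreference, requester_category: str) -> bool:
    """A requester may see the data if its category is within the chosen visibility."""
    return VISIBILITY_RANK[requester_category] <= VISIBILITY_RANK[pref.visibility]

pref = PrivacyPreference("marketing", "collector", "city-only", 365)
print(visible_to(pref, "collector"))    # requester inside the collecting organization
print(visible_to(pref, "third-party"))  # external organization
```

Treating the visibilities as a rank is one simple design choice; Section IV notes that richer hierarchies (e.g., lattices) have also been proposed.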
III. PRIVACY PRESERVING ACCESS CONTROL MODELS
We divide the privacy preserving access control models
into four classes. It is important to note that most of the
privacy preserving models use or adopt an existing and
widely accepted access control model called role-based
access control (RBAC). RBAC groups the
users in an organization and assigns roles related to their
job responsibilities. Permissions are then assigned to
roles that give users access to data items. However,
RBAC was not designed with privacy in mind and
cannot enforce privacy requirements on its own.
A. Access control models using privacy policies
The models of this class store privacy policies within a
database. Access to any data item should satisfy the
stored policies.
Agrawal et al. propose an extension for relational
database systems using the idea of Hippocratic database.
During data collection, this model uses P3P to collect
the provider’s privacy preference. A policy translator
module translates the privacy preferences from P3P
syntax into a schema called T1. Policy stored in T1
states what data can be used for which purposes and by
which recipients. In this model, data users and recipients
are two different entities. A recipient can be the data
collecting organization itself or any of its partner
organizations. Schema T1 also stores the information
regarding a data provider’s opt out information for a
particular purpose and recipient. From the stored policy,
a restriction is created which is then tied to a data item.
Access to the data item is allowed when the attached
constraint is satisfied. The authors provide a language to
write constraints, given in Fig. 1. The language expresses
that users in the list authors_1 except the users in
authors_2 are allowed to access table_T. Using options
like to columns, to rows and to cells, the constraint can
be applied to a column, row or even to a cell of a table.
The purpose and recipient information are drawn from
schema T1. A restriction may also specify which action
or combination of actions is allowed on data.
Data users are given privileges using SQL authorization
statement (e.g., grant). To be compliant with the
extended model that uses restriction, each grant
statement may include additional parameters like
purpose and recipient. When a data query is submitted
along with the purpose and recipient, an algorithm
checks if there is any restriction for each table reference
in the query and it then replaces the table reference by a
dynamic view. Dynamic view is created on-the-fly
based on the restriction related to the table. Details of the
view building algorithm are not included here. Finally,
the output of the view works as the data source for the
query. As the view reveals only the amount of data
allowed by the provider's privacy preference, the
provider's privacy is protected.
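As a rough illustration of this behaviour (not the authors' actual view-building algorithm), a dynamic view can be modeled as a filter that keeps only the rows whose stored policy permits the submitted purpose-recipient pair; the table layout below is an assumption:

```python
# Each row carries the (purpose, recipient) pairs its provider has permitted.
customers = [
    {"id": 1, "name": "Alice", "allowed": {("marketing", "collector")}},
    {"id": 2, "name": "Bob",   "allowed": {("billing", "collector")}},
]

def dynamic_view(table, purpose, recipient):
    """Build the view that stands in for the table reference in the query."""
    return [row for row in table if (purpose, recipient) in row["allowed"]]

# The query's table reference is replaced by this filtered view.
view = dynamic_view(customers, "marketing", "collector")
print([r["name"] for r in view])  # only Alice permitted marketing by the collector
```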
The model allows the data provider to specify the
purpose for which data can be used. This constraint will
prohibit illegitimate use of data since it can only be
accessed for the right purpose. In addition to purpose,
the data provider can specify the data recipient i.e., who
will use the data. For a particular purpose-recipient pair,
a data provider can opt out and then that recipient
cannot use the data for that purpose. This model does
not investigate privacy components like retention,
granularity and obligation.
A.2 View based privacy protection (Microview)
Byun and Bertino present a privacy preserving
model that creates different views of a data item based
on the data provider’s privacy preference. Here, the
concept of view is identical to the privacy component
granularity. The model assumes that during data
collection, a data provider mentions purpose for using
data and the level of privacy for that use. Three levels
of privacy are considered: Low (L), Medium (M) and
High (H). For a given purpose, the Low level
indicates that the provider is not really concerned about
using the data for that purpose, Medium level indicates
a moderate concern of the provider and finally, High
level indicates that the provider is very concerned about
such usage of the data. For each level of privacy, a
different version of a data item is created. A sample data
modeling is presented in Table I.
Create restriction name_of_restriction
for authors_1 [except authors_2]
(((to columns list_of_columns) | (to rows [where search_condition]) | (to cells (list_of_columns [where search_condition])))
[for purpose list_of_purposes]
[for recipient list_of_recipients])+
Fig. 1 Language for writing restrictions
The table includes the customer’s preferred privacy
levels for purpose P_contact and P_promotion.
The authors of this model suggest adopting any existing
access control model for authorization management.
Regardless of the authorization model, a data user
should submit an access purpose for querying the
database. Query output depends on the preferred privacy
level of the data provider for that access purpose. For
instance, if a query runs on Table I with purpose
P_promotion, it will get the output <1010, John S., <6
Essex Street, Toronto, ON>, 35-45 years>.
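The level-to-version lookup can be sketched as follows. The pre-built address versions and the preference table are illustrative stand-ins for Table I, not the authors' actual data:

```python
# One version of the data item is stored per privacy level (L, M, H).
versions = {
    "L": "6 Essex Street, Toronto, ON",  # full detail for low concern
    "M": "Toronto, ON",                  # generalized for medium concern
    "H": "Canada",                       # coarsest version for high concern
}

# The provider's chosen privacy level for each purpose.
chosen_level = {"P_contact": "L", "P_promotion": "M"}

def query(purpose):
    """Return the data version matching the provider's level for this purpose."""
    return versions[chosen_level[purpose]]

print(query("P_promotion"))  # the Medium-level view of the address
```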
A.3 Purpose based access control for RDBMS
Byun and Li propose a purpose-based access control
model (PBAC) for relational database systems. They
consider purpose as the only privacy information given
by a data provider. Purpose is stored with data in the
same table and acts as the intended purpose for using
the data.
The authors propose four types of purpose labeling of
data – element based, tuple based, attribute based and
relation based labeling. In the element-based labeling,
each data element contains an intended purpose. In tuple
based labeling, each record in a table has an intended
purpose and access to any data element of that record is
controlled by that purpose. In attribute based labeling,
there is one purpose for each column of a table. In
relation based approach, there is only one intended
purpose for the entire table.
PBAC adopts a role based approach for authorizing data
users where a data user is given a role. Instead of
assigning permissions to a role, purposes are assigned to
a role. Later, users use these purposes to request data.
An access request to a data item is granted if the access
purpose matches the intended purpose of the data. In PBAC,
role-purpose assignment is conditional where users of a
role should satisfy the condition to use the purpose. For
example, (r, Admin, UserID=123 AND
time=[9am:5pm]) is a role-purpose assignment that
states any user with the role r can use the purpose
Admin if the user’s ID is 123 and time of data access is
between 9am and 5pm. PBAC organizes purposes into a
tree structure that expresses a partial ordering of
purposes. The model considers both positive and
negative notion of purpose. If a data has a negative
intended purpose, then data can be accessed for any
purpose except the intended one. As described, PBAC
considers purpose as the only privacy component.
However, purpose management of this model is
insightful and would set a good direction for future
works. Another limitation of the model is that the access
control is coarse-grained: once a data user is authorized
for a purpose, he/she can access all data having the same
intended purpose. However, an organization may want
to limit its users' privileges to a subset of the data
allowed by the privacy policies.
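The purpose-tree compliance rule described above can be sketched as follows: an access purpose complies with an intended purpose if it equals that purpose or descends from it in the tree, and a negative intended purpose inverts the check. The concrete tree is illustrative:

```python
# child -> parent edges of an assumed purpose tree.
PARENT = {
    "direct-marketing": "marketing",
    "email-marketing": "direct-marketing",
    "billing": "admin",
}

def descends_from(purpose, ancestor):
    """Walk up the tree; a purpose implies itself and all its ancestors."""
    while purpose is not None:
        if purpose == ancestor:
            return True
        purpose = PARENT.get(purpose)
    return False

def allowed(access_purpose, intended, negative=False):
    """Positive purposes permit their subtree; negative purposes prohibit it."""
    ok = descends_from(access_purpose, intended)
    return not ok if negative else ok

print(allowed("email-marketing", "marketing"))                 # implied by the tree
print(allowed("email-marketing", "marketing", negative=True))  # prohibited subtree
```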
B. Access control models based on RBAC
Many privacy preserving models use RBAC. These
models fall into two categories- the ones that extend
RBAC and the ones that use RBAC as an add-on. We
describe a model of the first category here; it modifies
the structure of RBAC permissions.
Ni et al. propose a family of role-based privacy
preserving access control models called P-RBAC.
The authors first propose a basic model.
The subsequent models are built on it by adding new
features. We can classify these models into these
categories – privilege model, hierarchical model,
condition model and obligation model. We summarize
all the models by starting with the basic model and then
discussing key features of the other models.
In the typical role-based access control model, an access
privilege consists of data and action. In P-RBAC,
privilege has the form (data d, action a, purpose p,
condition c, obligation ob) where purpose, condition
and obligation are the privacy requirements. This type
of privilege expresses a privacy-aware access control
policy stating that in order to perform the action a on
the data d, the intended purpose should be p while
satisfying the condition c and obligation ob. Similar to
the RBAC model, permission is assigned to a role which
in turn is assigned to a data user. A data user should
satisfy the privacy requirements in order to get access.
The condition used in a permission is a Boolean expression
that verifies the values of context variables. Context
variables store relevant information necessary to
validate a privacy-aware permission. Commonly stored
information includes parental consent and data
provider’s consent. Obligation is usually a function that
should be performed before or after accessing a data
item. Consider the following privacy policy of a website
that sells toys: “From time to time, you may receive
periodic mailings, telephone calls or e-mails from "R"
Us Family members with information on products or
services, discounts, special promotions, upcoming
events or other offers from an "R" Us Family member
or its marketing partners. You may opt out of receiving
e-mail communications by clicking the link at the
bottom of the e-mail received”.
In short, the policy allows the website’s employees,
termed as “R” Us family members, to access the
customers’ contact information for sending promotional
offers. To express this policy using P-RBAC, let the
[Table I. Different views of data for each privacy level, with columns Address, Age, P_contact and P_promotion.]
role of an employee be MR, data be Email address,
purpose be Promo and OwnerConsent be a context
variable that stores the consent of a data provider. The
role-permission assignment is (MR, <Email address,
Promo, OwnerConsent !=NO, Ø>). This permission
does not require any obligation.
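Evaluating that permission can be sketched as follows. Context variables are modeled as a plain dictionary, the action component is omitted for brevity, and all names are illustrative stand-ins for the P-RBAC structures:

```python
def check(privilege, request, context):
    """Grant access only if data and purpose match and the condition holds."""
    data, purpose, condition, _obligation = privilege
    return (request["data"] == data
            and request["purpose"] == purpose
            and condition(context))

# (Email address, Promo, OwnerConsent != "NO", no obligation)
mr_privilege = ("Email address", "Promo",
                lambda ctx: ctx.get("OwnerConsent") != "NO", None)

req = {"data": "Email address", "purpose": "Promo"}
print(check(mr_privilege, req, {"OwnerConsent": "YES"}))  # consent given
print(check(mr_privilege, req, {"OwnerConsent": "NO"}))   # provider opted out
```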
In the basic model, condition is written using relational
operators and context variables. In a later work, the
authors propose a more expressive language for writing
condition. The language introduces hierarchical
operator that is used to express the relation between two
roles. It also supports combining two or more logical
expressions using the logical connectors & and |.
Ni et al. also provide a detailed framework for
expressing obligation policy in P-RBAC which consists
of general constraint, user, action, object and temporal
constraint. Here, the general constraint is a Boolean
expression, user is the person performing the obligation,
temporal constraint stores the activation time of
obligation with respect to data access and finally, object
may include users and temporal constraints. Consider an
example policy of a website that collects its visitors'
information: “Before using children's data, we take
consent from the respective parents.” If we
consider Home address as data, Marketing as purpose
and parentalConsent and ownerAge13plus as context
variables, the privilege can be written as:
<read, HomeAddress, Marketing,
ownerAge13plus=”Yes” AND parentalConsent=”Yes”,
(parentalConsent=No, self, (takeConsent(), pi), tc)>
The underlined part in the privilege is the obligation.
The privilege states that to read the home address of the
data providers who are below 13 years of age, a user
should ask (takeConsent()) the respective parents (pi)
for their consent if it is not taken yet
(parentalConsent=No). In an obligation, self denotes that
the user accessing the privilege and the user performing
the obligation should be the same. Temporal constraint
tc typically
consists of start and end time that say when an
obligation should be performed. The constraint might
also specify when a user should reattempt to do
obligation if there is no reply from the target object.
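The parental-consent obligation above can be sketched as follows. The general constraint (consent not yet taken) decides whether the obligation fires, and a simple time window stands in for the temporal constraint tc; `pending_obligations` and its encoding are illustrative assumptions:

```python
def pending_obligations(context, access_hour):
    """Return the obligations a data user still owes for this access."""
    obs = []
    if context.get("parentalConsent") == "No":       # general constraint
        if 9 <= access_hour < 17:                    # temporal constraint tc (assumed window)
            obs.append(("takeConsent", "parent"))    # action and target object pi
    return obs

print(pending_obligations({"parentalConsent": "No"}, 10))   # obligation still due
print(pending_obligations({"parentalConsent": "Yes"}, 10))  # already satisfied
```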
As a privacy preserving model, P-RBAC considers only
purpose and obligation. In P-RBAC, the authorization
should comply with the organization's privacy policies.
However, the authors did not present any formal relation
between privacy policies and the permission language.
This gap leaves it unclear how policies are consulted
when creating new privileges. Another role-based work
of this class has also been proposed in the literature.
C. Access control models based on workflows and transactions
In traditional access control models like RBAC, there is
no control on the order of data access. On the other
hand, in workflows and transactions the set of data
accesses needed to complete an objective is predetermined.
Data accesses are also ordered with respect to each
other. This section presents research works that study
privacy enforcement in these types of systems.
C.1 Transaction based PBAC (TPBAC)
Yang et al. extend the purpose based access control
model (PBAC), which we have described in
Section A.3. Originally, PBAC is based on relational
database systems. Yang et al. propose a privacy aware
data model that is independent of any database system.
In the data model, data is categorized into several
classes like registration data, contact data, treatment
data, etc. Privacy metadata is associated with each data
class instead of data itself. Privacy metadata includes
purpose and retention.
The authorization model is similar to the original PBAC
where purpose is conditionally assigned to a role. Yang
et al. constrain data access through transaction. They
define the set of necessary data accesses to fulfill each
purpose and the transactions that would contain those
accesses. For example, <Marketing, Contact data,
T#21> expresses that one of the necessary data accesses
to fulfill the purpose Marketing is to access data of type
Contact data through the transaction T#21. In addition
to purpose, the model includes retention. However, it
does not provide any enforcement framework that
removes data when retention expires.
As discussed before, one limitation of the original
PBAC is that it does not support complex access control
policies. Yang et al. address this problem by specifying
necessary accesses for each purpose. In their model, a
data user is authorized for a purpose. Users can access a
data item if the data is listed in the set of necessary data
access for their assigned purpose. Finally, the assigned
purpose should equal the intended purpose of the data.
Therefore, access control is fine-grained.
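The fine-grained check can be sketched as follows: an access succeeds only if the (purpose, data class, transaction) triple is listed as a necessary access and the purpose equals the data class's intended purpose. The triples mirror the <Marketing, Contact data, T#21> example; the tables themselves are illustrative:

```python
# Necessary accesses defined per purpose, as (purpose, data class, transaction).
NECESSARY = {("Marketing", "Contact data", "T#21")}

# Intended purpose stored in the privacy metadata of each data class.
INTENDED = {"Contact data": "Marketing", "Treatment data": "Treatment"}

def access_ok(purpose, data_class, transaction):
    """Both the necessary-access listing and the purpose match must hold."""
    return ((purpose, data_class, transaction) in NECESSARY
            and INTENDED[data_class] == purpose)

print(access_ok("Marketing", "Contact data", "T#21"))  # listed and matching
print(access_ok("Marketing", "Contact data", "T#99"))  # wrong transaction
```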
C.2 Purpose management using workflows
In privacy preserving access control, a user has to
submit a purpose for accessing data. Some models trust
users and accept whatever purpose they provide, while
other models authorize users for certain purposes that
they can use. Jafari et al.
propose using workflows to govern data access, where
the access purpose is determined by the context of the
workflow that issues the access request.
In a hospital, when patients are admitted for treatment, a
workflow starts. Tasks in the workflow require access to
patients’ data. Therefore, access request issued from the
workflow will submit treatment as the access purpose
which comes from the context of the workflow. Since a
workflow can have sub-workflows, a purpose can have
sub-purposes or more specific purposes. For instance,
the purpose treatment can have sub purposes like
examination and prescription.
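Deriving the access purpose from the workflow context can be sketched as follows: each sub-workflow refines its parent's purpose, so the purpose attached to a request is the chain from the root workflow down to the issuing task. The workflow names are illustrative:

```python
# Sub-workflow -> parent workflow, mirroring the treatment example above.
WORKFLOW_PARENT = {"examination": "treatment", "prescription": "treatment"}

def access_purpose(workflow):
    """Return the purpose chain for a request issued by this workflow,
    most general purpose first."""
    chain = [workflow]
    while chain[-1] in WORKFLOW_PARENT:
        chain.append(WORKFLOW_PARENT[chain[-1]])
    return list(reversed(chain))

print(access_purpose("examination"))  # sub-purpose refines its parent purpose
```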
Once the access purpose is determined, to check whether
the access complies with the data's privacy policy, the
authors suggest using one of the existing approaches.
The main focus of this model is how to manage purpose
at the enterprise level. Therefore, we assume that the
model considers purpose as the only privacy component.
D. Access control models using XML
based policy language
Access control models of this class include an XML
based policy language. One interesting property of this
language is its extensibility to suit the requirements of
an application domain.
D.1 eXtensible Access Control Markup Language (XACML)
XACML defines a general-purpose access control
system. It offers an XML based policy language for
specifying the requirements to access a protected
resource. The language uses a set of attributes that
encode different properties of subject (data user),
resource (data), action and environment. The semantics
of the attributes are defined in an XML schema file.
In XACML, the smallest unit of an access control policy
is called a rule, which consists of a target, a condition
and a response. The target, which determines whether a
rule applies to an access request, contains subjects,
resources and actions.
Condition of a rule is a function that checks additional
properties of a resource requestor and its environment.
After validating an access request, a rule outputs one of
these decisions- permit, deny, not applicable and
indeterminate (indicates an error generated during the
validation). The decision may include obligations which
are additional tasks a data user should perform.
The privacy profile of XACML, an extension of the
policy language, includes two more attributes:
resource purpose that is the intended purpose for using
a data item and action purpose that is the reason for
using data submitted by a data user. An access request is
allowed when these two purposes match. The privacy
profile also suggests a deny override algorithm in the
case when more than one rule apply to an access request
and one of them denies the request. In addition to the
policy language, XACML includes a policy processing
model that interprets policies in the relevant context.
XACML has several limitations. The policy language is
complex and verbose. To encode a simple policy, it
requires many lines of code. It also considers purpose as
the only privacy requirement. Since the policy language
can be extended, privacy components like visibility and
retention can be applied by defining them as attributes
of the user and data respectively. As granularity needs a
functional implementation, it would be challenging to
apply it in XACML.
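The purpose matching and deny-overrides combining described for the privacy profile can be sketched as follows. These are plain Python stand-ins for illustration, not real XACML processing:

```python
def make_rule(resource_purpose, effect):
    """A rule applies only when the action purpose matches the resource purpose."""
    def rule(request):
        if request["action_purpose"] != resource_purpose:
            return "NotApplicable"
        return effect
    return rule

def deny_overrides(rules, request):
    """A single Deny defeats any number of Permits."""
    decisions = [r(request) for r in rules]
    if "Deny" in decisions:
        return "Deny"
    if "Permit" in decisions:
        return "Permit"
    return "NotApplicable"

rules = [make_rule("marketing", "Permit"), make_rule("marketing", "Deny")]
print(deny_overrides(rules, {"action_purpose": "marketing"}))  # Deny wins
print(deny_overrides(rules, {"action_purpose": "billing"}))    # no rule applies
```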
D.2 Enterprise Privacy Authorization Language
IBM proposed a language called EPAL to encode
privacy policies. An EPAL policy is more like an access
control rule or permission that can be enforced by an
enforcement engine. Similar to XACML, the language
is based on XML and uses a set of attributes called
vocabularies. The semantics of the vocabularies are
defined for the application domain in a schema file. Key
components of an EPAL policy are target, condition and
obligation. The target consists of user categories, data
categories, purpose and action. EPAL divides users and
data into different categories. Examples of user
categories are doctor, nurse etc. Examples of data
categories include registration data, treatment data, etc.
Action in the policy is the task that is performed on data
(e.g., read, write, update etc). Inclusion of purpose
makes the target privacy aware. The condition tests
contextual and privacy related information (e.g., whether
a data provider has opted out of the policy). If
applicable to an access request, a rule inference gives
one of the three decisions- allow, deny and not
applicable. The decision may also be accompanied by
obligations.
As described, the privacy components in EPAL include
purpose and obligation. One limitation is that EPAL
does not provide any formal framework to express and
enforce obligation policies. The model also does not
elaborate the concept of role hierarchy, which is a
limitation for encoding enterprise policies.
IV. CONCLUSION AND FUTURE WORK
In this paper, we reviewed access control models that
can be used to enforce data privacy. Privacy
components used in different access control models are
listed in Table II. It is interesting to note that no model
uses all the components of our privacy definition.
However, all of them use purpose, since privacy policies
tend to focus on it. Ideas like retention and obligation
are getting popular. We believe that there will be many
research works in near future bringing more insights to
these ideas. Using different versions/granularities of
data is common in data publishing, though it is not
very popular in access control. The reason is the
overhead of creating different data versions on-the-fly.
Data publishing is done offline and therefore, the
overhead does not make any significant difference.
However, instant system response is the expected
performance in access control. So the same overhead
may severely affect the system performance.

Table II Checklist of privacy components in the reviewed access control models
Component     Ext.RDBMS  Microview  PBAC  P-RBAC  TPBAC  Workflow  XACML  EPAL
Purpose       √          √          √     √       √      √         √      √
Visibility    √          x          x     x       x      x         x      x
Granularity   x          √          x     x       x      x         x      x
Retention     x          x          x     x       √      x         x      x
Obligation    x          x          x     √       x      x         √      √

Further research work can bring more insight on purpose
interpretation at the enterprise level. With little research
done so far, how a system should verify the purpose
submitted by a data user is still an open question.
Retention is treated in the literature as a date in the
system calendar. In practice, retention can be date,
duration, number of data accesses or an event. Consider
a hospital policy stating that patients' data should be
deleted after they are discharged from the
hospital. In this case, the retention of patients’ data is an
event. Further works are necessary to support all types
of retention. Another future work can be the application
of game theory to better understand the privacy
negotiation among data providers, collectors and third
parties. In game theory, each player has a numeric
objective and players try to choose an action that
maximizes their objective. The situation of interest in
game theory is the equilibrium when players are
unlikely to change their behaviors. A data provider, data
collector and third party can be put as players in a game
where the provider will try to get as many services as
possible by exposing minimal data, while the others will
try to maximize their utility by getting more data with
minimum services. Most privacy preserving models do
not consider hierarchy for privacy components. Data,
purpose, retention, granularity and visibility can be
organized into hierarchies, which will make the models
more expressive. Ghazinour et al. have used a lattice
structure to organize purpose, retention, granularity and
visibility. However, they did not consider any such
organization for data. To build a hierarchy, one should
distinguish between types and the entities belonging to
those types. For data, the definitions of type and entities
are intuitive. For example, demographic information is a
type and a customer’s country is an entity of this type.
This entity cannot be further divided into subclasses.
other privacy components like purpose, this distinction
is not clear. One possible future work is to build a
model that organizes all privacy components into
hierarchies by considering their different characteristics.
REFERENCES
[1] W3C. Platform for Privacy Preferences (P3P) Project. Available: http://www.w3.org/P3P/.
[2] L. Cranor, Web Privacy with P3P, O'Reilly & Associates, September 2002.
[3] R. Agrawal et al., "Hippocratic databases," in Proc. of 28th VLDB, 2002, pp. 143-154.
[4] K. Barker et al., "A data privacy taxonomy," in Proc. of 26th BNCOD, 2009, pp. 42-54.
[5] Q. Ni, E. Bertino & J. Lobo, "An obligation model bridging access control policies and privacy policies," in Proc. of ACM SACMAT, 2008, pp. 133-142.
[6] A. Masoumzadeh & J. B. D. Joshi, "PuRBAC: Purpose-aware role-based access control," in On the Move to Meaningful Internet Systems, Part
[7] R. Sandhu, D. Ferraiolo & R. Kuhn, "The NIST model for role-based access control: Towards a unified standard," in Proc. of 5th ACM Workshop on RBAC, 2000, pp. 47-63.
[8] R. Agrawal et al., "Extending relational database systems to automatically enforce privacy policies," in Proc. of 21st ICDE, 2005.
[9] J. W. Byun & E. Bertino, "Micro-views, or on how to protect privacy while enhancing data usability: concepts and challenges," SIGMOD Record, Vol. 35, No. 1, 2006, pp. 9-13.
[10] Q. Ni et al., "Privacy-aware role based access control," in Proc. of 12th SACMAT, 2007, pp. 41-50.
[11] J. W. Byun & N. Li, "Purpose based access control for privacy protection in relational database systems," Int. J. on VLDB, Vol. 17, Issue 4, 2008, pp. 603-619.
[12] Q. Ni et al., "Conditional privacy aware role based access control," in Proc. of 12th European Symp. on Research in Computer Security, 2007, pp. 72-89.
[13] Q. Ni, E. Bertino & J. Lobo, "Privacy-aware RBAC - Leveraging RBAC for privacy," IEEE Security & Privacy Magazine, Vol. 7, No. 4, 2009, pp. 35-43.
[14] N. Yang, H. Barringer & N. Zhang, "A purpose-based access control model," in Proc. of 3rd Int. Symp. on Inform. Assurance & Security, 2007, pp. 143-148.
[15] M. Jawad, P. S. Alvaredo & P. Valduriez, "Design of PriServ, a privacy service for DHTs," in Int. Workshop on Privacy & Anonymity in the Inform.
[16] M. Jafari, R. Safavi-Naini & N. P. Sheppard, "Enforcing purpose of use via workflows," in Proc. of 8th ACM Workshop on Privacy in the Elect.
[17] OASIS. XACML. Available: http://www.oasis-
[18] OASIS. XACML's privacy profile. Available:
[19] P. Ashley et al., "Enterprise Privacy Authorization Language (EPAL)," Research Report RZ 3485, IBM
[20] L. Sweeney, "K-anonymity: A model for protecting privacy," Int. J. on Uncertainty, Fuzziness & Knowledge-based Syst., Vol. 10, No. 5, 2002.
[21] R. Gibbons, Game Theory for Applied Economists, Princeton University Press, 1992.
[22] K. Ghazinour, M. Majedi & K. Barker, "A lattice-based privacy aware access control model," in Proc. of IEEE Int. Conf. on Inform. Privacy, Security, Risk &