Conference Paper

Temporal Blurring: A Privacy Model for OMS Users.

DOI: 10.1007/11527886_56 Conference: User Modeling 2005, 10th International Conference, UM 2005, Edinburgh, Scotland, UK, July 24-29, 2005, Proceedings
Source: DBLP


Stereotypes and clustering are common techniques for building user models from user behavior. However, they carry significant risks: users' actions may be misinterpreted, or users may be associated with undesirable profiles. The problem is even worse when users' actions, beliefs, and comments are stored long term, as in Organizational Memory Systems (OMS), where users' contributions are available to the whole organization. We propose a privacy model based on four privacy roles that allows users to control the disclosure of their personal data and that, when such data is retrieved, blurs it as time passes.
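The retrieval-time blurring described in the abstract could be sketched as follows. This is a minimal illustration in Python, not the paper's implementation: the age thresholds, the three blur levels, and all names are assumptions, and the four privacy roles are not modeled here.

```python
from datetime import datetime, timedelta

# Illustrative thresholds only; the paper does not specify these values.
# Each entry maps a maximum age to a blur level.
BLUR_THRESHOLDS = [
    (timedelta(days=30), 0),    # fresh contribution: full detail
    (timedelta(days=180), 1),   # older: author reduced to initials
    (timedelta.max, 2),         # old: fully anonymized
]

def blur_level(stored_at: datetime, now: datetime) -> int:
    """Return how strongly a contribution should be blurred, given its age."""
    age = now - stored_at
    for threshold, level in BLUR_THRESHOLDS:
        if age < threshold:
            return level
    return BLUR_THRESHOLDS[-1][1]

def blur_contribution(author: str, text: str,
                      stored_at: datetime, now: datetime) -> dict:
    """Apply temporal blurring when a contribution is retrieved from the OMS."""
    level = blur_level(stored_at, now)
    if level == 0:
        shown_author = author
    elif level == 1:
        # Reduce "Jane Doe" to "J.D."
        shown_author = "".join(word[0] + "." for word in author.split())
    else:
        shown_author = "anonymous"
    return {"author": shown_author, "text": text}
```

The key point the sketch captures is that the stored record is never altered: blurring is applied at retrieval time, so the degree of disclosure depends only on how much time has passed.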



Available from: Rosa Alarcón
  • Source
    • "The need for privacy for every user, no matter how computer-illiterate, suggests privacy measures should be automatic. For example, a proposal of an automatic implementation of privacy measures is called Temporal Blurring [1]. This mechanism is a metaphor of the real world, in which people gradually forget information after a meeting. "
    ABSTRACT: The use of past information is becoming critical for organizations. Coordination, decision making and negotiation are some of the collaborative activities that depend on it. However, the need to provide privacy and information retrieval capabilities to all users of an organizational memory system (OMS) requires a good strategy for acquiring and structuring the information. This paper presents a strategy intended to (a) facilitate the feeding of an OMS based on information stored in legacy information systems, (b) ease the information retrieval process, and (c) provide an automatic privacy mechanism for the information stored in the OMS.
    Computer Supported Cooperative Work in Design, 2007. CSCWD 2007. 11th International Conference on; 05/2007
  • Source
    • "This would make all the information in the OMS have the same structure, and it would permit the retrieval of information and the automatic privacy measures to take place. An example implementation of the proposal, called Temporal Blurring, can be found in [3]. "
    ABSTRACT: People are usually concerned with the privacy of their personal information. However, the problem of privacy is also present when the information is directly linked to people, for example in personal opinions, discussions and decisions. Typically, if this information is indefinitely stored in an organizational memory system, then it may cause privacy problems inside the organization. Privacy is needed for users to feel free enough to express themselves honestly, and consequently, for the collaborative software to succeed. This paper presents a model to incorporate automatic privacy measures in an Organizational Memory System.
    Proceedings of the 10th International Conference on CSCW in Design, CSCWD 2006, May 3-5, 2006, Southeast University, Nanjing, China; 01/2006
  • Source
    ABSTRACT: Protection of private data is one of the most challenging issues threatening the success of Pervasive Computing (PC). Regarding the principles of privacy and the vision of pervasive computing, one of the most troublesome problems is how to provide people's control of their data while not distract them to a great extent. A suitable approach for protecting people's private data in PC environments should consider the intention of all influential entities (data determiners), and invisibly gain their consent by applying their desired preferences. In extension of the current approaches which consider a single set of preference rules for a private data item when deciding about its disclosure, we propose dealing with different preferences of all involved determiners in a distributed manner. In this paper we investigate possible determiners of private data and propose a set of required meta-data as well as a multi-determiner protection procedure to protect private data with regard to the preferences of its determiners. Moreover, we propose the DELEGATION behavior as a determiner's response to a request instead of ACCEPT or REJECT. We demonstrate the efficiency of the suggested procedure and present a prototype implementation of a multi-determiner architecture to realize the protection procedure.
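The multi-determiner procedure this abstract outlines could be sketched as below. This is a hypothetical reduction, not the cited paper's architecture: the `Determiner` class, the consent rule (every determiner must accept), and all names are assumptions; only the three response types (ACCEPT, REJECT, DELEGATION) come from the abstract.

```python
from collections import deque
from enum import Enum

class Response(Enum):
    ACCEPT = "ACCEPT"
    REJECT = "REJECT"
    DELEGATION = "DELEGATION"  # hand the decision to another determiner

class Determiner:
    """Illustrative determiner with a fixed policy; the paper's
    preference meta-data model is considerably richer."""
    def __init__(self, name, policy, delegate=None):
        self.name, self.policy, self.delegate = name, policy, delegate

    def decide(self, request):
        return self.policy, self.delegate

def resolve(request, determiners):
    """Disclose a private item only if every involved determiner consents.

    A DELEGATION response queues the named delegate instead of
    answering ACCEPT or REJECT directly; a single REJECT denies the request.
    """
    pending = deque(determiners)
    while pending:
        determiner = pending.popleft()
        response, delegate = determiner.decide(request)
        if response is Response.REJECT:
            return Response.REJECT
        if response is Response.DELEGATION:
            pending.append(delegate)
    return Response.ACCEPT
```

The queue makes the delegation distributed in spirit: a determiner who delegates drops out of the decision, and the delegate's verdict is weighed exactly like any other determiner's.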