Article

Abstract

Visual lifelogging enables a user, the lifelogger, to passively capture images from a first-person perspective and ultimately create a visual diary encoding every possible aspect of her life in unprecedented detail. In recent years, it has gained popularity among different groups of users. However, the possible ubiquitous presence of lifelogging devices, especially in private spheres, has raised serious concerns about personal privacy. In this article, we present a thorough discussion of privacy with respect to visual lifelogging. We readjust the existing definition of lifelogging to reflect different aspects of privacy and introduce a first-ever privacy threat model identifying several threats specific to visual lifelogging. We also show how existing privacy guidelines and approaches are inadequate to mitigate the identified threats. Finally, we outline a set of requirements and guidelines that can be used to mitigate the identified threats while designing and developing a privacy-preserving framework for visual lifelogging.


... In addition to concerns about information overload [18], privacy was frequently raised as a significant issue for the potential widespread adoption of lifelogging [14,26,37,59]. Even without the public dissemination of lifelogging data that we associate with the practice today, privacy was seen as a roadblock due to legal issues and the recording of data beyond oneself (e.g., bystanders or sensitive settings [19,26,68]). The failure of Google Glass, the first significant commercial augmented reality product, was in part due to both concern over privacy law and widespread public discomfort, illustrating the difficulty in navigating both policy and social norms [46,74]. ...
... Automated methods for preserving privacy in-situ for lifelogging were developed quite early on-for example, defensive frameworks defined by physical location such as bedrooms and bathrooms [68], and methods based on facial and movement detection [69]. However, these privacy-related issues become even more pronounced when lifelogging data is shared beyond the user [26,59]. ...
Article
As the process of creating and sharing data about ourselves becomes more prevalent, researchers have access to increasingly rich data about human behavior. Framed as a fictional paper published at some point in the not-so-distant future, this design fiction draws from current inquiry and debate into the ethics of using public data for research, and speculatively extends this conversation into even more robust and more personal data that could exist when we design new technologies in the future. By looking to how the precedents of today might impact the practices of tomorrow, we can consider how we might design policies, ethical guidelines, and technologies that are forward-thinking.
... • Pre-activity, such as proactive opt-in and opt-out designs [32]; preference toward pre-determined consent [17,140], typically enacted through tracked physical artefacts [137] or biometric ID combined with cloud-based preferences [75,138]
• During activity, through interaction: from verbal social indications that direct the AR user's activity [69] to gestural technology-mediated interactions conveying opt-in/out to the headset directly [69], examples of "user empowerment" approaches [44,109]. ...
... Five respondents were concerned about the safety of the user or bystander, P69 (Female, 35-44, United Kingdom): "what if they [the AR user] are blocking awareness of cars and step in front of mine while I am driving!", and noted that additional safety measures would be required, P102 (Male, 18-24, United Kingdom): "there should be regulations in place to avoid hazards from unawareness or disregard for what's transpiring in an individual's environment". Five others were respondents who believed the technology would facilitate predatory behaviour, P93 (Male, 18-24, United Kingdom): "this technology would allow predatory behaviour to be carried out far too easily". ...
Article
Full-text available
Fundamental to Augmented Reality (AR) headsets is their capacity to visually and aurally sense the world around them, necessary to drive the positional tracking that makes rendering 3D spatial content possible. This requisite sensing also opens the door for more advanced AR-driven activities, such as augmented perception, volumetric capture and biometric identification - activities with the potential to expose bystanders to significant privacy risks. Existing Privacy-Enhancing Technologies (PETs) often safeguard against these risks at a low level, e.g., by instituting camera access controls. However, we argue that such PETs are incompatible with the need for always-on sensing given AR headsets' intended everyday use. Through an online survey (N=102), we examine bystanders' awareness of, and concerns regarding, potentially privacy-infringing AR activities; the extent to which bystanders' consent should be sought; and the level of granularity of information necessary to provide awareness of AR activities to bystanders. Our findings suggest that PETs should take into account the AR activity type, and relationship to bystanders, selectively facilitating awareness and consent. In this way, we can ensure bystanders feel their privacy is respected by everyday AR headsets, and avoid unnecessary rejection of these powerful devices by society.
... Doherty et al. [30] summarize lifelogging as the process of automatically recording aspects of one's life in digital form. A specific technique of lifelogging, which has drawn particular attention in research, is visual lifelogging regarding privacy issues [31][32][33][34]. This activity enables users to passively capture images from a first-person perspective via camera and ultimately creates a visual diary encoding every possible life aspect with unprecedented details [32]. ...
Article
Full-text available
Health self-tracking is an ongoing trend as software and hardware evolve, making the collection of personal data not only fun for users but also increasingly interesting for public health research. In a quantitative approach we studied German health self-trackers (N = 919) for differences in their data disclosure behavior by comparing data showing and sharing behavior among peers and their willingness to donate data to research. In addition, we examined user characteristics that may positively influence willingness to make the self-tracked data available to research and propose a framework for structuring research related to self-measurement. Results show that users’ willingness to disclose data as a “donation” more than doubled compared to their “sharing” behavior (willingness to donate = 4.5/10; sharing frequency = 2.09/10). Younger men (up to 34 years), who record their vital signs daily, are less concerned about privacy, regularly donate money, and share their data with third parties because they want to receive feedback, are most likely to donate data to research and are thus a promising target audience for health data donation appeals. The paper adds to qualitative accounts of self-tracking but also engages with discussions around data sharing and privacy.
... More specifically, the public health sector has shown growing interest in using life-logging as a memory prosthesis to aid patients with or without Autobiographical Memory Impairment (AMI): the visual logs are collated and presented as a digital diary/storyboard, a collection of chronologically ordered images, which older adults can retrieve to reminisce about relevant episodic memories [16]. However, these embedded cameras automatically capture various sensitive information (such as bystanders, private situations, and credit card details) and amplify privacy-related implications [15], such as wearer worry and discomfort, and financial and legal consequences. In addition, by longitudinally collating images and applying computer vision techniques, an attacker can extract various information (such as faces and the relationships between people appearing in an image) with high accuracy: an alarming threat that leads wearable camera users to abandon their devices. ...
... In this paper, we consider the privacy threat to the life-logger due to "unforeseen inference" [15] made by the host using the information embedded within the life-log. Several recent works have focused on using computer vision techniques to directly detect/identify sensitive/private objects present in the image, such as credit card details and other personally identifiable information [48,49]. ...
Article
Full-text available
Built-in pervasive cameras have become an integral part of mobile/wearable devices and enabled a wide range of ubiquitous applications with their ability to be "always-on". In particular, life-logging has been identified as a means to enhance the quality of life of older adults by allowing them to reminisce about their own life experiences. However, the sensitive images captured by the cameras threaten individuals' right to have private social lives and raise concerns about privacy and security in the physical world. This threat gets worse when image recognition technologies can link images to people, scenes, and objects, and hence implicitly and unexpectedly reveal more sensitive information such as social connections. In this paper, we first examine life-log images obtained from 54 older adults to extract (a) the artifacts or visual cues, and (b) the context of the image that influences an older life-logger's ability to recall the life events associated with a life-log image. We call these artifacts and contextual cues "stimuli". Using the set of stimuli extracted, we then propose a set of obfuscation strategies that naturally balance the trade-off between reminiscability and privacy (revealing social ties) while selectively obfuscating parts of the images. More specifically, our platform yields a privacy-utility trade-off by sacrificing, on average, a modest 13.4% in reminiscability scores while significantly improving privacy guarantees (around 40% error in cloud estimation).
... Researchers have proposed several solutions to mitigate this issue. These include explicitly obtaining consent to record people in public [15], [16], and wearing special tags on clothing [17] or using gestures [18] that can be recognized by the device, which then automatically stops or removes the recording for those bystanders. Although these approaches may work in theory, they may be quite hard to implement in practice. ...
... Audio/visual cues [16]: e.g., a beep or a flashing red light [34] on the wearable device to indicate that the device is capable of recording. Opt-out markers [17]: physical tags on clothing that can be detected by the recording device to indicate that a person does not wish to be recorded in public. ...
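The opt-out mechanisms surveyed above (audio/visual cues, physical markers, gestures) can be illustrated with a minimal policy sketch: given the set of markers a detector reports for a frame, the device decides whether to keep, redact, or discard the capture. All names here (`Frame`, `decide_capture`, the marker labels) are hypothetical illustrations, not taken from any cited system.

```python
# Hypothetical sketch of a bystander opt-out policy for a wearable camera.
# A marker detector (not shown) reports which opt-out signals appear in a
# frame; this policy maps those detections to a capture decision.

from dataclasses import dataclass, field

@dataclass
class Frame:
    frame_id: int
    # Symbolic detections, e.g. {"opt_out_tag", "gesture_stop"}.
    detected_markers: set = field(default_factory=set)

# Markers that veto capture entirely vs. those that only require redaction.
DISCARD_MARKERS = {"gesture_stop"}   # bystander gestured to stop recording
REDACT_MARKERS = {"opt_out_tag"}     # bystander wears a physical opt-out tag

def decide_capture(frame: Frame) -> str:
    """Return 'discard', 'redact', or 'keep' for a frame."""
    if frame.detected_markers & DISCARD_MARKERS:
        return "discard"
    if frame.detected_markers & REDACT_MARKERS:
        return "redact"  # e.g., blur the tagged bystanders before storing
    return "keep"

if __name__ == "__main__":
    frames = [
        Frame(1),
        Frame(2, {"opt_out_tag"}),
        Frame(3, {"opt_out_tag", "gesture_stop"}),
    ]
    print([decide_capture(f) for f in frames])  # ['keep', 'redact', 'discard']
```

The hard part in practice, as the excerpt notes, is not this policy but reliably detecting the markers at all, which is why such schemes remain difficult to deploy.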
Conference Paper
Full-text available
With continued improvement and innovation, technology has become an integral part of our daily lives. The rapid adoption of technology and its affordability have given rise to the Internet-of-Things (IoT). IoT is an interconnected network of devices that are able to communicate and share information seamlessly. IoT encompasses a gamut of heterogeneous devices ranging from small sensors to large industrial machines. One such domain of IoT that has seen significant growth in recent years is that of wearable devices. While the privacy issues of medical devices have been well researched and documented in the literature, the threats to privacy arising from the use of consumer wearable devices have received very little attention from the research community. This paper presents a survey of the literature to understand the various privacy challenges, mitigation strategies, and future research directions resulting from the widespread adoption of wearable devices.
... Finally, a discussion about personal data is not complete without also addressing privacy and security, and while these topics are particularly relevant to the lifelog domain [19,20], addressing them in this article would require a far more in-depth consideration of their application and usage. For this reason, we consider the LAD model as addressing only the technical facets of lifelog application research; it thus operates on the premise that any target data is collected and stored ethically and securely, and in a location and manner compliant with local and international law. ...
Article
Full-text available
Lifelogging is a form of personal data collection which seeks to capture the totality of one’s experience through intelligent technology and sensors. Yet despite notable advancement in such technologies, there remain persistent challenges to developing interactive systems to analyse the types of large-scale personal collections often generated by lifelogging. In response to this, we present the Lifelog Application Design (LAD) model which is intended to address these challenges and support the design of more novel interactive systems that may target a broader range of application use cases. The model is deliberately structured to remain impartial to the specific personal data, technology platform, or application criterion, to provide maximum utility across the domain. We demonstrate this utility by exploring two case studies and a retrospective analysis of VRLE, a real-world application prototype developed to examine the potential of large-scale personal data retrieval in virtual reality. This work is based on the accumulation of insights garnered from involvement in a number of collaborative lifelogging projects over the past decade. It is our goal to encourage future researchers to utilise the LAD model to support the design and development of their own application prototypes and further solidify the model’s contribution to the domain as a whole.
... However, the continuous video recording of every experiential moment-whether at home, at work, around family, or in public spaces-involves not only those doing the recording, but anyone who happens to be recorded. Egocentric vision has greatly increased the vulnerability of bystanders (Ferdous et al. 2017). Recent research has worked on preserving the visual privacy of third parties who did not give consent: Dimiccoli et al. (2018) analyzed how image degradation might preserve the privacy of persons appearing in an image while activities can still be recognized; Hassan and Sazonov (2020) proposed an image redaction approach for privacy protection by selective content removal using semantic segmentation-based deep learning. ...
Article
Full-text available
Population aging resulting from demographic changes requires some challenging decisions and necessary steps to be taken by different stakeholders to manage current and future demand for assistance and support. The consequences of population aging can be mitigated to some extent by assisting technologies that can support the autonomous living of older individuals and persons in need of care in their private environments as long as possible. A variety of technical solutions are already available on the market, but privacy protection is a serious, often neglected, issue when using such (assisting) technology. Thus, privacy needs to be thoroughly taken under consideration in this context. In a three-year project PAAL (‘Privacy-Aware and Acceptable Lifelogging Services for Older and Frail People’), researchers from different disciplines, such as law, rehabilitation, human-computer interaction, and computer science, investigated the phenomenon of privacy when using assistive lifelogging technologies. In concrete terms, the concept of Privacy by Design was realized using two exemplary lifelogging applications in private and professional environments. A user-centered empirical approach was applied to the lifelogging technologies, investigating the perceptions and attitudes of (older) users with different health-related and biographical profiles. The knowledge gained through the interdisciplinary collaboration can improve the implementation and optimization of assistive applications. In this paper, partners of the PAAL project present insights gained from their cross-national, interdisciplinary work regarding privacy-aware and acceptable lifelogging technologies.
... The big data that encode our digital traces have a huge value from the commercial, social and scientific standpoints ( [ZCW14], [PPSG15], [FSSST2016], [MCG2016], [PTLG16], [SZOT2016], [CL17], [EPRF17], [RWP17]). They can be used by decision makers to provide better services to citizens but, at the same time, if not managed in an appropriate way, may compromise our privacy ( [LZD11], [HDMP15], [RRFC15], [FCJ17]). Therefore, data privacy is a fundamental requirement in designing the Internet of the future ( [RGK11], [BHAJ2016], [CRB2016], [RAL17], [ZLLJ17]), and in particular in the IoP design. ...
Preprint
The cyber-physical convergence, the fast expansion of the Internet at its edge, and tighter interactions between human users and their personal mobile devices push towards a data-centric Internet where the human user becomes more central than ever. We argue that this will profoundly impact the way data should be handled in the Next Generation Internet. It will require a radical change of the Internet data-management paradigm, from the current platform-centric to a human-centric model. In this paper we present a new paradigm for Internet data management that we name the Internet of People (IoP) because it embeds human behavior models in its algorithms. To this end, IoP algorithms exploit quantitative models of humans' individual and social behavior drawn from sociology, anthropology, psychology, economics and physics. IoP is not a replacement of the current Internet networking infrastructure; rather, it exploits legacy Internet services as (reliable) primitives to achieve end-to-end connectivity on a global scale. In this opinion paper, we first discuss the key features of the IoP paradigm along with the underlying research issues and challenges. Then, we present emerging data-management paradigms that anticipate IoP.
... It is also possible for applications to capture sensitive information from the user's environment that the user would prefer to keep private, such as written or printed material, faces of unaware bystanders, and images taken in a bathroom or bedroom. Systems have been proposed to protect environmental information using markers such as near-infrared labels [44] as well as QR codes and other physical markers [28,61,62] to declare to the capturing device what can and cannot be recorded. Schiboni et. ...
Article
Mobile augmented reality systems are becoming increasingly common and powerful, with applications in such domains as healthcare, manufacturing, education, and more. This rise in popularity is thanks in part to the functionalities offered by commercially available vision libraries such as ARCore, Vuforia, and Google’s ML Kit; however, these libraries also give rise to the possibility of a hidden operations threat, that is, the ability of a malicious or incompetent application developer to conduct additional vision operations behind the scenes of an otherwise honest AR application without alerting the end user. In this paper, we present the privacy risks associated with the hidden operations threat, and propose a framework for application development and runtime permissions targeted specifically at preventing the execution of hidden operations. We follow this with a set of experimental results, exploring the feasibility and utility of our system in differentiating between user-expectation-compliant and non-compliant AR applications during runtime testing, for which preliminary results demonstrate accuracy of up to 71%. We conclude with a discussion of open problems in the areas of software testing and privacy standards in mobile AR systems.
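The runtime-permissions idea described in that abstract can be caricatured in a few lines: every vision operation passes through an explicit, user-granted permission gate, so an application cannot silently run, say, face recognition when only plane detection was approved. The class and operation names below are illustrative assumptions, not the paper's actual framework or any real library API.

```python
# Hypothetical sketch of permission-gated vision operations for a mobile AR
# app: operations the user has not explicitly granted raise an error instead
# of silently executing "behind the scenes", and every attempt is logged.

class PermissionDenied(Exception):
    pass

class VisionPermissionGate:
    def __init__(self, granted):
        self.granted = set(granted)  # operations the user approved
        self.audit_log = []          # (operation, allowed) pairs, for transparency

    def run(self, operation, fn, *args):
        allowed = operation in self.granted
        self.audit_log.append((operation, allowed))
        if not allowed:
            raise PermissionDenied(f"operation not granted: {operation}")
        return fn(*args)

if __name__ == "__main__":
    gate = VisionPermissionGate(granted={"plane_detection"})
    gate.run("plane_detection", lambda: "planes found")     # permitted
    try:
        gate.run("face_recognition", lambda: "identities")  # a hidden operation
    except PermissionDenied as err:
        print(err)
    print(gate.audit_log)
```

The audit log is the key design choice: even denied attempts leave a trace, which is what would let runtime testing distinguish expectation-compliant from non-compliant applications.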
... Life-logging devices equip users to take photos automatically as they roam around. Therefore, Alice can simply download the respective life-log containing Bob's image and try to re-identify him using online image search services [60], which would seriously undermine Bob's privacy. Other practical threats might arise if a user can somehow find out about other users' contacts and/or their infection status. ...
Article
Full-text available
Contact tracing has become a vital tool for public health officials to effectively combat the spread of new diseases, such as the novel coronavirus disease COVID-19. Contact tracing is not new to epidemiologists; rather, it has traditionally used manual or semi-manual approaches that are incredibly time-consuming, costly and inefficient. It mostly relies on human memory, and scalability is a significant challenge in tackling pandemics. The unprecedented health and socio-economic impacts have led researchers and practitioners around the world to search for technology-based approaches that provide scalable and timely answers. Smartphones and associated digital technologies have the potential to provide a better approach due to their high level of penetration, coupled with mobility. While data-driven solutions are extremely powerful, the fear among citizens is that information like location or proximity, associated with other personal data, can be weaponised by states to enforce surveillance. The low adoption rate of such apps, due to a lack of trust, has called their efficacy into question and demands that researchers find innovative solutions for building digital trust while appropriately balancing privacy and accuracy of data. In this paper, we critically review such protocols and apps to identify the strengths and weaknesses of each approach. Finally, we set out our recommendations to make future contact tracing mechanisms more universally interoperable and privacy-preserving.
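Many of the decentralized protocols reviewed in work like the above share one core mechanism: devices broadcast short-lived pseudonymous tokens derived from a daily secret, so proximity can later be checked without any central identity or location database. Below is a minimal sketch of that idea, loosely in the spirit of DP-3T-style designs; the key schedule, epoch length, and token size are illustrative assumptions, not any specific protocol.

```python
# Sketch of rotating ephemeral IDs for privacy-preserving contact tracing.
# A device derives per-epoch tokens from a daily secret key. If the owner
# tests positive, publishing only the daily key lets other devices check
# their locally stored observations for matches.

import hmac
import hashlib
import secrets

EPOCHS_PER_DAY = 96  # e.g., one token per 15-minute epoch (assumption)

def daily_key() -> bytes:
    return secrets.token_bytes(32)

def ephemeral_id(day_key: bytes, epoch: int) -> bytes:
    # HMAC ties each short-lived broadcast token to the secret daily key,
    # so tokens are unlinkable without that key.
    msg = epoch.to_bytes(4, "big")
    return hmac.new(day_key, msg, hashlib.sha256).digest()[:16]

def match(published_key: bytes, observed_tokens: set) -> bool:
    """Did this device observe any token derivable from a published key?"""
    return any(
        ephemeral_id(published_key, e) in observed_tokens
        for e in range(EPOCHS_PER_DAY)
    )

if __name__ == "__main__":
    alice_key = daily_key()
    # Bob's phone logs only the tokens it hears nearby, nothing else.
    heard = {ephemeral_id(alice_key, 42)}
    print(match(alice_key, heard))  # True: a contact occurred
    print(match(daily_key(), heard))  # False: an unrelated key
```

The privacy property the surveyed protocols argue over is visible even in this toy: the matching happens on Bob's device, and the server only ever sees keys of users who chose to report a positive test.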
... User friendliness and the interface design presented to customers are based on the website design developed by the interested stakeholders involved in a business (Mantovani et al., 2013). The definition of privacy has become a highly debated issue since research regarding the social, legal, psychological, political and technical connotations surrounding this concept has emerged (Ferdous, Chowdhury and Jose, 2017). ...
Article
Full-text available
This study examines the relationship between website quality, selling systems, and sellers' attitudes with regard to buyers' intention, with a specific focus on buyers and sellers in China. China's fast economic development has established the country as a worldwide financial power: it grew by 6.7% in 2016, down from the 6.9% growth of the previous year. This rebalancing of economic power will set China apart as the world's second-greatest merchant in the field of merchandise and business administration, and it is a common assessment that there is a major flow-on effect towards monetary performance in China. Using a non-probability sampling method, 307 seller responses were collected. The study relates the dimensions of the independent variables 'Website Quality' and 'Selling System' and the dependent variable 'Sellers' Attitude' to attitudes towards buyers' intention in China. The findings showed that among the four components chosen for this examination, the most effective component for online buyers is Website Design/Features, followed by support, and lastly by security. The results of this study demonstrate that these components are effective in eliciting responses among online clients.
... In contrast to these perceived benefits of using assisting lifelogging applications, barriers and concerns are also associated with the use of such devices and systems. Because lifelogging technologies collect personal health data and closely intervene in people's everyday lives, concerns regarding privacy and data security, as well as the fear of losing control, represent central barriers to using lifelogging applications [2,[20][21][22]. In this connection, the fear of unauthorized disclosure of personal data to third parties was also considered a relevant disadvantage of using lifelogging technology. ...
Chapter
In view of the consequences resulting from demographic change, using assisting lifelogging technologies in domestic environments represents one potential approach to support the elderly and people in need of care to stay longer within their own homes. Yet, the handling of personal data poses a considerable challenge to perceptions of privacy and data security, and therefore to accepted use in this regard. The present study focuses on aspects of data management in the context of two different lifelogging applications, considering a legal and a human-centered perspective. In a two-step empirical process, consisting of qualitative interviews and an online survey, these aspects were explored and evaluated by a representative German sample of adult participants (N = 209). Findings show positive attitudes towards using lifelogging, but also high requirements on privacy and data security as well as anonymization of the data. In addition, the study allows deep insights into the preferred duration and location of data storage, and permissions for third parties to access personal information. Knowledge of preferences and requirements in the area of data management from the legal and human-centered perspectives is crucial for lifelogging and must be considered in applications that support people in their daily living at home. Outcomes of the present study contribute considerably to the understanding of an optimal infrastructure for accepted and willingly utilized lifelogging applications.
... - To lower barriers-to-participation by including sufficient metadata, such as the visual annotations of visual content.
- To apply the principles of privacy-by-design [2] when creating the test collection, because personal sensor data (especially camera or audio data) carries privacy concerns [8], [14], [19].
- To include realistic topics representing real-world information needs of varying degrees of difficulty for various sub-tasks. ...
Conference Paper
Full-text available
Lifelog-3 was the third instance of the lifelog task at NTCIR. At NTCIR-14, the Lifelog-3 task explored three different lifelog data access related challenges, the search challenge, the annotation challenge and the insights challenge. In this paper we review the activities of participating teams who took part in the challenges and we suggest next steps for the community.
... Chowdhury et al. found that whether lifelogging imagery is suitable for sharing is (in addition to content, scenario, and location) mainly determined by its sensitivity [Chowdhury et al. 2016]. Ferdous et al. proposed a set of guidelines that, among others, include semi-automatic procedures to determine the sensitivity of captured images according to user-provided preferences [Ferdous et al. 2017]. All highlight the privacy sensitivity of first-person recordings and the importance of protecting user and bystander privacy. ...
Conference Paper
Eyewear devices, such as augmented reality displays, increasingly integrate eye tracking, but the first-person camera required to map a user's gaze to the visual scene can pose a significant threat to user and bystander privacy. We present PrivacEye, a method to detect privacy-sensitive everyday situations and automatically enable and disable the eye tracker's first-person camera using a mechanical shutter. To close the shutter in privacy-sensitive situations, the method uses a deep representation of the first-person video combined with rich features that encode users' eye movements. To open the shutter without visual input, PrivacEye detects changes in users' eye movements alone to gauge changes in the "privacy level" of the current situation. We evaluate our method on a first-person video dataset recorded in daily life situations of 17 participants, annotated by themselves for privacy sensitivity, and show that our method is effective in preserving privacy in this challenging setting.
... There is an urgent need to understand the implications of privacy, in the form of external and internal threats, in such everywhere technology. Different professionals and researchers have explored different ways of understanding these privacy implications, proposing, designing and developing different frameworks to mitigate specific threats [73]. In the policy, decision making can also be influenced by different kinds of biases such as time constraints, time inconsistency, immediate gratification and optimistic bias. ...
Conference Paper
Full-text available
Privacy is the capability of individuals or groups to partially or fully isolate information about themselves from the eyes of others and thereby express themselves selectively. What is considered private therefore differs in content and boundaries across cultures and societies, although common themes are shared. On the Internet, privacy is a broad term covering a diverse set of antecedents, factors, predictors and mechanisms for securing and protecting confidential, sensitive or secret data in the communication channel according to selected preferences. This study explores the concept of privacy over recent decades by collecting all empirical evidence that matches the eligibility criteria, in order to deliver a meticulous summary in response to a research question concerning the drivers of privacy protection.
... Many of these changes are driven by technological advancements. This work is extended in [59]. ...
Book
Full-text available
In this book we propose the following algorithms and applications for image processing: 1) The increasing use of Web development and transmission network tools in recent decades has led to huge 2D databases that still have to be correctly indexed. The context of this book is Content-Based Image Retrieval (CBIR) and its application in lifelogging; 2) In the context of frequency estimation, a new 3D high-resolution spectral analysis method is presented using second-order and higher-order statistics. Applying frequency analysis in the field of CBIR allows the development of a new descriptor vector which characterizes the image. Filtering seismic images by their frequency content is a new approach that improves image quality; 3) We consider the class of subgradient methods for minimizing a non-smooth convex function regularized by the discretized L1 norm, models arising in image processing. In the last chapter we propose a fast control subgradient algorithm for image regularization.
... The use of personal, body-worn cameras is often problematic and controversial in social situations, as it may cause discomfort and social tension. Recent work discusses bystander privacy that might be compromised by body-worn cameras [10,12,17,27]. In particular, wearable cameras create a different experience for bystanders than camera phones or CCTV cameras because they are considered subtle personal devices that can enable covert recording without consent [12,55]. ...
Conference Paper
Full-text available
Privacy notices aim to make users aware of personal data gathered and processed by a system. Body-worn cameras currently lack suitable design strategies for privacy notices that announce themselves and their actions to secondary and incidental users, such as bystanders, when they are being used in public. Hypothesizing that the commonly used status LED is not optimal for this use case, as it is not sufficiently understandable, noticeable, secure and trustworthy, we explore design requirements of privacy notices for body-worn cameras. Following a two-step approach, we contribute incentives for design alternatives to status LEDs: starting from 8 design sessions with experts, we discuss 8 physical design artifacts, as well as design strategies and key motives. Finally, we derive design recommendations for the proposed solutions, which we back with an evaluation with 12 UX & HCI experts.
... Many of these changes are driven by technological advancements. This work is extended in Ref. [20]. ...
Article
Full-text available
Today, we witness the appearance of many lifelogging cameras that are able to capture the life of the person wearing the camera and which produce a large number of images every day. Automatically characterizing the experience and extracting patterns of behavior of individuals from this huge collection of unlabeled and unstructured egocentric data present major challenges and require novel and efficient algorithmic solutions. The main goal of this work is to propose a new method to automatically assess day similarity from the lifelogging images of a person. We propose a technique to measure the similarity between images based on Swain's distance and generalize it to detect the similarity between daily visual data. To this purpose, we apply dynamic time warping (DTW) combined with Swain's distance for final day similarity estimation. For validation, we apply our technique to the Egocentric Dataset of the University of Barcelona (EDUB) of 4912 daily images acquired by four persons, with preliminary encouraging results. Methods: The search strategy was designed for high sensitivity over precision, to ensure that no relevant studies were lost. We performed a systematic review of the literature using academic databases (ACM, Scopus, etc.), focusing on the themes of day similarity, automatic assessment of day similarity, day similarity on EDUB, and day similarity using visual lifelogs. The study included randomized controlled trials, cohort studies, and case-control studies published between 2006 and 2017.
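The day-similarity method described in the abstract combines a per-image histogram distance with dynamic time warping over the two image sequences. The sketch below illustrates that pipeline; it is a minimal reconstruction, not the authors' implementation, and the normalization choice in `swain_distance` and both function names are assumptions.

```python
import numpy as np

def swain_distance(h1, h2):
    """Histogram intersection distance in the spirit of Swain & Ballard:
    1 minus the intersection of two color histograms, normalized by the
    mass of the second histogram (normalization choice is an assumption)."""
    h1 = np.asarray(h1, dtype=float)
    h2 = np.asarray(h2, dtype=float)
    return 1.0 - np.minimum(h1, h2).sum() / h2.sum()

def dtw_day_similarity(day_a, day_b, dist=swain_distance):
    """Align two days (each a sequence of per-image histograms) with
    dynamic time warping and return the cumulative alignment cost;
    lower cost means more similar days."""
    n, m = len(day_a), len(day_b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            c = dist(day_a[i - 1], day_b[j - 1])
            D[i, j] = c + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]
```

Two identical days align along the diagonal at zero cost, while days whose image histograms diverge accumulate a positive alignment cost.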
... Chowdhury et al. [10] found that whether lifelogging imagery is suitable for sharing is (in addition to content, scenario, and location) mainly determined by its sensitivity. Ferdous et al. proposed a set of guidelines that, amongst others, include semi-automatic procedures to determine the sensitivity of captured images according to userprovided preferences [14]. All of these works underline the highly privacy-sensitive nature of head-mounted displays, and first-person cameras in particular, as well as the importance of active measures to protect the privacy of users and bystanders. ...
Article
Full-text available
As first-person cameras in head-mounted displays become increasingly prevalent, so does the problem of infringing user and bystander privacy. To address this challenge, we present PrivacEye, a proof-of-concept system that detects privacy-sensitive everyday situations and automatically enables and disables the first-person camera using a mechanical shutter. To close the shutter, PrivacEye detects sensitive situations from first-person camera videos using an end-to-end deep-learning model. To open the shutter without visual input, PrivacEye uses a separate, smaller eye camera to detect changes in users' eye movements to gauge changes in the "privacy level" of the current situation. We evaluate PrivacEye on a dataset of first-person videos recorded in the daily life of 17 participants that they annotated with privacy sensitivity levels. We discuss the strengths and weaknesses of our proof-of-concept system based on a quantitative technical evaluation as well as qualitative insights from semi-structured interviews.
... These concerns are justified; the continuous recording of every experiential moment, whether at home, at work, around family, or in public spaces, implicates not only those doing the recording but anyone who happens to be photographed or video-recorded. In particular, this technique has greatly enhanced the vulnerability of bystanders [14]. Egocentric photos might catch bystanders in embarrassing situations, undesirable poses, or reveal information they would rather not have on record. ...
Article
Full-text available
Recent advances in wearable camera technology and computer vision algorithms have greatly enhanced the automatic capture and recognition of human activities in real-world settings. While the appeal and utility of wearable camera devices for human-behavior understanding is indisputable, privacy concerns have limited the broader adoption of this method. To mitigate this problem, we propose a deep learning-based approach that recognizes everyday activities in egocentric photos that have been intentionally degraded in quality to preserve the privacy of bystanders. An evaluation on 2 annotated datasets collected in the field with a combined total of 84,078 egocentric photos showed activity recognition performance with accuracy between 79% and 88% across 17 and 21 activity classes when the images were subjected to blurring (mean filter k=20). To confirm that image degradation does indeed raise the perception of bystander privacy, we conducted a crowdsourced validation study with 640 participants; it showed a statistically significant positive relationship between the amount of image degradation and participants' willingness to be captured by wearable cameras. This work contributes to the field of privacy-sensitive activity recognition with egocentric photos by highlighting the trade-off between perceived bystander privacy protection and activity recognition performance.
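The degradation step the abstract refers to, a mean (box) filter of size k, can be sketched as below. This is an illustrative grayscale-only implementation, not the paper's code; a real pipeline would more likely use an optimized routine such as OpenCV's cv2.blur.

```python
import numpy as np

def mean_blur(image, k=20):
    """Degrade a grayscale image with a k x k mean (box) filter,
    the kind of obfuscation applied before activity recognition
    to reduce what bystanders' faces and surroundings reveal."""
    img = np.asarray(image, dtype=float)
    pad = k // 2
    # Replicate edge pixels so the output keeps the input's shape.
    padded = np.pad(img, ((pad, pad), (pad, pad)), mode="edge")
    out = np.empty_like(img)
    h, w = img.shape
    for y in range(h):
        for x in range(w):
            out[y, x] = padded[y:y + k, x:x + k].mean()
    return out
```

The larger k is, the stronger the blur, which is the knob the study varies when trading off bystander privacy against recognition accuracy.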
Preprint
Full-text available
Contact tracing has become a vital tool for public health officials to effectively combat the spread of new diseases, such as the novel coronavirus disease COVID-19. Contact tracing is not new to epidemiologists; rather, it has relied on manual or semi-manual approaches that are incredibly time-consuming, costly and inefficient. It mostly relies on human memory, while scalability is a significant challenge in tackling pandemics. The unprecedented health and socio-economic impacts led researchers and practitioners around the world to search for technology-based approaches that provide scalable and timely answers. Smartphones and associated digital technologies have the potential to provide a better approach due to their high level of penetration, coupled with mobility. While data-driven solutions are extremely powerful, the fear among citizens is that information like location or proximity, associated with other personal data, can be weaponised by states to enforce surveillance. The low adoption rate of such apps due to the lack of trust has questioned their efficacy and demanded that researchers find innovative solutions for building digital trust and appropriately balancing the privacy and accuracy of data. In this paper, we have critically reviewed such protocols and apps to identify the strengths and weaknesses of each approach. Finally, we have set out our recommendations to make future contact tracing mechanisms more universally interoperable and privacy-preserving.
Chapter
Personal lifelogging builds upon the pervasive and continuous acquisition of sensor measurements and signals in time, and this may expose the subject, and eventually bystanders, to privacy violations. While the issue is easy to understand for image and video data, the risks associated with the use of wearable accelerometers are less clear and may be underestimated. This work addresses the problem of understanding whether acceleration measurements collected from the wrist, by subjects performing different types of Activities of Daily Living (ADLs), may release personal details, for example about their gender or age. A positive outcome would motivate the need for de-identification algorithms to be applied to acceleration signals, embedded into wearable devices, in order to limit the unintentional release of personal details and ensure the necessary privacy by design and by default requirements.
Chapter
A shift to higher proportions of older people and people in need of care requires new solutions and technologies with the potential to assist people in their everyday activities and to support them in being as independent and self-determined as possible. Lifelogging technologies have this potential by the collection, storage, and evaluation of personal data. Despite their potential, the users’ acceptance of such technologies is of great importance, in particular with regard to the technology’s handling of data security and privacy. For this reason, a quantitative study was carried out using an online questionnaire (N = 182), investigating two different application contexts of lifelogging technologies: a preventive context (frailty monitoring) and an assisting context related to patients suffering from dementia. Based on a preceding qualitative study, data access, purpose of data processing, duration as well as location of data storage were chosen as factors which were investigated, applying a conjoint analysis approach. The results revealed that the purpose of data processing and data access were the most decisive factors when users decide about the data management of lifelogging technologies and comparing the two contexts, contradicting decision patterns were found in particular for data access. Beyond these insights, user group specific decision patterns were identified for each of the application contexts. This study provides relevant insights into the users’ perspectives and requirements with regard to data management of lifelogging technologies, which should be taken into account for technology development and communication.
Conference Paper
Full-text available
Automatically and passively taking pictures (using lifelogging devices such as wearable cameras) of people who don’t know they’re having their picture taken raises a number of privacy concerns (from a bystander’s perspective). We conducted a study focusing on bystanders’ concerns about the presence of augmented reality wearable devices in two contexts (one formal and one informal). The results suggest the need to embed privacy enhancing techniques into the design of lifelogging applications, which are likely to depend upon an array of factors, including but not limited to the context of use, scenario (and surroundings), and content.
Conference Paper
Full-text available
Lifelogging devices, which seamlessly gather various data about a user as they go about their daily life, have resulted in users amassing large collections of noisy photographs (e.g. visual duplicates, image blur), which are difficult to navigate, especially if they want to review their day in photographs. Social media websites, such as Facebook, have faced a similar information overload problem for which a number of summarization methods have been proposed (e.g. news story clustering, comment ranking etc.). In particular, Facebook's Year in Review received much user interest where the objective for the model was to identify key moments in a user's year, offering an automatic visual summary based on their uploaded content. In this paper, we follow this notion by automatically creating a review of a user's day using lifelogging images. Specifically, we address the quality issues faced by the photographs taken on lifelogging devices and attempt to create visual summaries by promoting visual and temporal-spatial diversity in the top ranks. Conducting two crowdsourced evaluations based on 9k images, we show the merits of combining time, location and visual appearance for summarization purposes.
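The summarization idea above, promoting visual and temporal-spatial diversity in the top ranks, can be illustrated with a generic greedy, MMR-style selection. This is a sketch of the general technique, not the paper's exact model; the feature encoding and the weighting parameter `lam` are assumptions.

```python
import numpy as np

def diversified_summary(features, scores, k=5, lam=0.7):
    """Greedily pick k images, trading off each image's relevance
    score against its cosine similarity to images already selected,
    so the summary stays visually and temporally diverse.
    `features` is an (n, d) array of per-image descriptors (e.g.
    visual + time + GPS), `scores` an (n,) array of quality scores."""
    F = np.asarray(features, dtype=float)
    F = F / (np.linalg.norm(F, axis=1, keepdims=True) + 1e-12)
    sim = F @ F.T  # pairwise cosine similarity
    selected = [int(np.argmax(scores))]
    while len(selected) < min(k, len(scores)):
        rest = [i for i in range(len(scores)) if i not in selected]
        mmr = [lam * scores[i] - (1 - lam) * sim[i, selected].max()
               for i in rest]
        selected.append(rest[int(np.argmax(mmr))])
    return selected
```

With `lam` close to 1 the summary follows raw relevance; lowering it penalizes near-duplicates (e.g. the visual duplicates lifelogging cameras produce) more aggressively.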
Article
Full-text available
Pervasive computing is beginning to offer the potential to rethink and redefine how technology can support human memory augmentation. For example, the emergence of widespread pervasive sensing, personal recording technologies, and systems for the quantified self are creating an environment in which it's possible to capture fine-grained traces of many aspects of human activity. Contemporary psychology theories suggest that these traces can then be used to manipulate our ability to recall, both to reinforce and to attenuate human memories. Here, the authors consider the privacy and security implications of using pervasive computing to augment human memory. They describe a number of scenarios, outline the key architectural building blocks, and identify entirely new types of security and privacy threats, namely those related to data security (experience provenance), data management (establishing new paradigms for digital memory ownership), data integrity (memory attenuation and recall-induced forgetting), and bystander privacy. Together, these threats present compelling research challenges for the pervasive computing research community. This article is part of a special issue on privacy and security.
Conference Paper
Full-text available
There exist disparate sets of definitions with different semantics on different topics of Identity Management which often lead to misunderstanding. A few efforts can be found compiling several related vocabularies into a single place to build up a set of definitions based on a common semantic. However, these efforts are not comprehensive and are only textual in nature. In essence, a mathematical model of identity and identity management covering all its aspects is still missing. In this paper we build up a mathematical model of different core topics covering a wide range of vocabularies related to Identity Management. At first we build up a mathematical model of Digital Identity. Then we use the model to analyse different aspects of Identity Management. Finally, we discuss three applications to illustrate the applicability of our approach. Being based on mathematical foundations, the approach can be used to build up a solid understanding on different topics of Identity Management.
Article
Full-text available
The authors give a brief overview of consumer lifelogging devices and the implications of such technologies for society.
Conference Paper
Full-text available
In this paper, we present a comparative analysis of a few popular Identity Management Systems against a set of requirements. Identity Management and Identity Management Systems have gained significant attention in recent years with the proliferation of different web-enabled and e-commerce services, leading to extensive research in the field in the form of several projects producing many standards, prototypes and application models in both academia and industry. We have collected and compiled requirements from different sources to profile an extensive set of requirements for a Privacy-Enhancing Identity Management System and presented them in the form of a taxonomy. We have then compared several Identity Management Systems against those requirements and presented the comparison in a concise way to help readers see instantly which systems satisfy which requirements, and thus choose the correct one for their own scenarios.
Article
Full-text available
A factor analytic study proceeded in two phases to determine types of privacy. In Phase I 96 items were collected, administered to 166 people, and factor analyzed. For Phase II items were retained, revised, added, or deleted to form a condensed pool of 30 items of greater factorial purity. This revised questionnaire was given to 188 subjects and then factor analyzed. Six independent factors of privacy were obtained, and factor scales to measure them were developed consisting of five factor pure items per factor. The privacy factors identified were: Reserve, Isolation, Solitary, Intimacy with Family, Intimacy with Friends, and Anonymity.
Article
Full-text available
The growth of information acquisition, storage and retrieval capacity has led to the development of the practice of lifelogging, the undiscriminating collection of information concerning one’s life and behaviour. There are potential problems in this practice, but equally it could be empowering for the individual, and provide a new locus for the construction of an online identity. In this paper we look at the technological possibilities and constraints for lifelogging tools, and set out some of the most important privacy, identity and empowerment-related issues. We argue that some of the privacy concerns are overblown, and that much research and commentary on lifelogging has made the unrealistic assumption that the information gathered is for private use, whereas, in a more socially-networked online world, much of it will have public functions and will be voluntarily released into the public domain.
Conference Paper
Full-text available
In this paper, we present a study of responses to the idea of being recorded by a ubicomp recording technology called SenseCam. This study focused on real-life situations in two North American and two European locations. We present the findings of this study and their implications, specifically how those who might be recorded perceive and react to SenseCam. We describe what system parameters, social processes, and policies are required to meet the needs of both the primary users and these secondary stakeholders and how being situated within a particular locale can influence responses. Our results indicate that people would tolerate potential incursions from SenseCam for particular purposes. Furthermore, they would typically prefer to be informed about and to consent to recording as well as to grant permission before any data is shared. These preferences, however, are unlikely to instigate a request for deletion or other action on their part. These results inform future design of recording technologies like SenseCam and provide a broader understanding of how ubicomp technologies might be taken up across different cultural and political regions.
Conference Paper
Full-text available
This paper presents an exploration and analysis of attitudes towards everyday tracking and recording technologies (e.g., credit cards, store loyalty cards, store video cameras). Interview participants reported being highly concerned with information privacy. At the same time, however, they also reported being significantly less concerned regarding the use of everyday technologies that have the capabilities to collect, process, and disseminate personal information. We present results from this study that both identify and begin to explain this discrepancy.
Conference Paper
Full-text available
More and more personal devices such as mobile phones and multimedia players use embedded sensing. This means that people are wearing and carrying devices capable of sensing details about them such as their activity, location, and environment. In this paper, we explore privacy concerns about such personal sensing through interviews with 24 participants who took part in a three month study that used personal sensing to detect their physical activities. Our results show that concerns often depended on what was being recorded, the context in which participants worked and lived and thus would be sensed, and the value they perceived would be provided. We suggest ways in which personal sensing can be made more privacy-sensitive to address these concerns.
Article
Full-text available
Rather than try to capture everything, system design should focus on the psychological basis of human memory.
Article
Full-text available
In the last couple of years, several European countries have started projects which intend to provide their citizens with electronic identity cards, driven by the European Directive on Electronic Signatures. One can expect that within a few years, these smart cards will be used in a wide variety of applications. In this paper, we describe the common threats that can be identified when using security tokens such as smart cards in web applications. We illustrate each of these threats with a few attack scenarios. This paper is part of a series of papers, written by several academic teams. Each paper focuses on one particular technological building block for web applications.
Article
Full-text available
In this paper we examine the potential of pervasive computing to create widespread sousveillance, which will complement surveillance, through the development of life-logs—sociospatial archives that document every action, every event, every conversation, and every material expression of an individual’s life. Reflecting on emerging technologies, life-log projects, and artistic critiques of sousveillance, we explore the potential social, political, and ethical implications of machines that never forget. We suggest, given that life-logs have the potential to convert exterior generated oligopticons to an interior panopticon, that an ethics of forgetting needs to be developed and built into the development of life-logging technologies. Rather than seeing forgetting as a weakness or a fallibility, we argue that it is an emancipatory process that will free pervasive computing from burdensome and pernicious disciplinary effects.
Article
Full-text available
Several countries have generated principles to protect individuals from the potential invasion of privacy that data collection and retrieval poses. The Organization for Economic Cooperation and Development (OECD) has provided probably the best known set of guidelines. A number of countries have adopted these guidelines as statutory law, in whole or in part. The OECD has specific guidelines pertaining to data privacy that directly affect those performing knowledge discovery generally, and those who use so called “personal data” in particular. The article addresses such questions as: What are the implications of the existing privacy guidelines, especially those of the OECD, for knowledge discovery? What are the limitations of these guidelines? How do the restrictions on knowledge discovery about individuals affect knowledge discovery on groups? How do legal systems influence knowledge discovery?
Chapter
Threat analysis of a web application can lead to a wide variety of identified threats. Some of these threats will be very specific to the application; others will be more related to the underlying infrastructural software, such as the web or application servers, the database, the directory server and so forth. This paper analyzes the threats that can be related to the use of web services technology in a web application. It is part of a series of papers, written by different academic teams, that each focus on one particular technological building block for web applications.
Conference Paper
Lifelogging is becoming widely deployed outside the scope of solipsistic self-quantification. In elite sport, the ability to utilize these digital footprints of athletes for sport analytics has already become a game changer. This raises privacy concerns regarding both the individual lifelogger and the bystanders inadvertently captured by increasingly ubiquitous sensing devices. This paper describes a lifelogging model for consented use of personal data for sport analytics. The proposed model is a stepping stone towards understanding how privacy-preserving lifelogging frameworks and run-time systems can be constructed.
Conference Paper
The lifelogging activity enables a user, the lifelogger, to passively capture multimodal records from a first-person perspective and ultimately create a visual diary encompassing every possible aspect of her life with unprecedented details. In recent years it has gained popularity among different groups of users. However, the possibility of ubiquitous presence of lifelogging devices especially in private spheres has raised serious concerns with respect to personal privacy. Different practitioners and active researchers in the field of lifelogging have analysed the issue of privacy in lifelogging and proposed different mitigation strategies. However, none of the existing works has considered a well-defined privacy threat model in the domain of lifelogging. Without a proper threat model, any analysis and discussion of privacy threats in lifelogging remains incomplete. In this paper we aim to fill in this gap by introducing a first-ever privacy threat model identifying several threats with respect to lifelogging. We believe that the introduced threat model will be an essential tool and will act as the basis for any further research within this domain.
Conference Paper
With continuous advances in the pervasive sensing and lifelogging technologies for the quantified self, users can now record their daily life activities automatically and seamlessly. In the existing lifelogging research, visualization techniques for presenting lifelogs, and the evaluation of their effectiveness from a lifelogger's perspective, have not been adequately studied. In this paper, we investigate the effectiveness of four distinct visualization techniques for exploring the lifelogs, which were collected by 22 lifeloggers who volunteered to use a wearable camera and a GPS device simultaneously for a period of 3 days. Based on a user study with these 22 lifeloggers, which required them to browse through their personal lifelogs, we seek to identify the most effective visualization technique. Our results suggest various ways to augment and improve the visualization of personal lifelogs to enrich the quality of user experience and make lifelogging tools more engaging. We also propose a new visualization feature, a drill-down approach with details-on-demand, to make the lifelogging visualization process more meaningful and informative to lifeloggers.
Article
Modern applications increasingly rely on continuous monitoring of video, audio, or other sensor data to provide their functionality, particularly in platforms such as the Microsoft Kinect and Google Glass. Continuous sensing by untrusted applications poses significant privacy challenges for both device users and bystanders. Even honest users will struggle to manage application permissions using existing approaches. We propose a general, extensible framework for controlling access to sensor data on multi-application continuous sensing platforms. Our approach, world-driven access control, allows real-world objects to explicitly specify access policies. This approach relieves the user's permission management burden while mediating access at the granularity of objects rather than full sensor streams. A trusted policy module on the platform senses policies in the world and modifies applications' "views" accordingly. For example, world-driven access control allows the system to automatically stop recording in bathrooms or remove bystanders from video frames, without the user being prompted to specify or activate such policies. To convey and authenticate policies, we introduce passports, a new kind of certificate that includes both a policy and optionally the code for recognizing a real-world object. We implement a prototype system and use it to study the feasibility of world-driven access control in practice. Our evaluation suggests that world-driven access control can effectively reduce the user's permission management burden in emerging continuous sensing systems. Our investigation also surfaces key challenges for future access control mechanisms for continuous sensing applications.
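The world-driven access control idea, objects in the environment announcing policies via "passports" that a trusted module enforces, can be sketched as follows. The names, policy vocabulary, and most-restrictive-wins rule are illustrative assumptions, not the paper's actual design.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Passport:
    """A world-driven policy certificate (names illustrative): a
    recognizable real-world object paired with the policy it announces
    for sensor data captured near it."""
    object_name: str
    policy: str  # e.g. "no_record", "blur_faces", "allow"

def apply_world_policies(frame_objects, passports):
    """Given the set of object names recognized in the current frame,
    return the most restrictive policy announced by any of them;
    default to full access when no policy-bearing object is visible."""
    order = {"no_record": 2, "blur_faces": 1, "allow": 0}
    active = [p.policy for p in passports if p.object_name in frame_objects]
    return max(active, key=order.get, default="allow")
```

A trusted policy module would run this per frame and rewrite each application's "view" (dropping or redacting frames) according to the returned policy, so individual apps never see the raw stream.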
Article
Pervasive logging devices capture everything around them, including nearby members of the public who have not consented, potentially troubling people who value their privacy. Beyond the privacy issues themselves, the widespread use of such logging devices may affect people's behavior, as they may feel uncomfortable being constantly monitored. People may wish to have some control over the lifelogging devices of others and, in this article, we describe a framework to restrict anonymous logging unless it is explicitly permitted. Our privacy framework allows the user of a logging device to define privacy policies controlling when, where and who to restrict from logging them. Moreover, it is possible to select the types of logging sensors to which these restrictions apply. Evaluation results show that this approach is a practical method of configuring privacy settings and restricting pervasive devices from logging.
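A policy of the kind described, restricting logging by when, where, who, and which sensor type, might be modeled as below. This is a hedged sketch of the idea only; the class name, fields, and the all-dimensions-must-match denial rule are illustrative assumptions, not the paper's actual design:

```python
from datetime import time

class LoggingPolicy:
    """Restrict others' logging devices by time window, place, party, and sensor type."""
    def __init__(self, start, end, places, restricted_parties, sensors):
        self.start, self.end = start, end                   # when the restriction applies
        self.places = set(places)                           # where it applies
        self.restricted_parties = set(restricted_parties)   # who is restricted
        self.sensors = set(sensors)                         # which sensor types

    def permits(self, now, place, party, sensor):
        """Logging is denied only when every restriction dimension matches."""
        in_window = self.start <= now <= self.end
        return not (in_window
                    and place in self.places
                    and party in self.restricted_parties
                    and sensor in self.sensors)

# Block strangers' cameras in the office during working hours; other sensors stay allowed.
policy = LoggingPolicy(time(9), time(17), {"office"}, {"strangers"}, {"camera"})
print(policy.permits(time(12), "office", "strangers", "camera"))  # False: blocked
print(policy.permits(time(12), "office", "strangers", "gps"))     # True: allowed
```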
Article
A number of wearable 'lifelogging' camera devices have been released recently, allowing consumers to capture images and other sensor data continuously from a first-person perspective. Unlike traditional cameras that are used deliberately and sporadically, lifelogging devices are always 'on' and automatically capturing images. Such features may challenge users' (and bystanders') expectations about privacy and control of image gathering and dissemination. While lifelogging cameras are growing in popularity, little is known about privacy perceptions of these devices or what kinds of privacy challenges they are likely to create. To explore how people manage privacy in the context of lifelogging cameras, as well as which kinds of first-person images people consider 'sensitive,' we conducted an in situ user study (N = 36) in which participants wore a lifelogging device for a week, answered questionnaires about the collected images, and participated in an exit interview. Our findings indicate that: 1) some people may prefer to manage privacy through in situ physical control of image collection in order to avoid later burdensome review of all collected images; 2) a combination of factors including time, location, and the objects and people appearing in the photo determines its 'sensitivity;' and 3) people are concerned about the privacy of bystanders, despite reporting almost no opposition or concerns expressed by bystanders over the course of the study.
Article
We have recently observed a convergence of technologies to foster the emergence of lifelogging as a mainstream activity. Computer storage has become significantly cheaper, and advancements in sensing technology allow for the efficient sensing of personal activities, locations and the environment. This is best seen in the growing popularity of the quantified self movement, in which life activities are tracked using wearable sensors in the hope of better understanding human performance in a variety of tasks. This review aims to provide a comprehensive summary of lifelogging, to cover its research history, current technologies, and applications. Thus far, most of the lifelogging research has focused predominantly on visual lifelogging in order to capture details of life activities, hence we maintain this focus in this review. However, we also reflect on the challenges lifelogging poses to an information retrieval scientist. This review is a suitable reference for those seeking an information retrieval scientist's perspective on lifelogging and the quantified self.
Article
It has become easier to create a lifelog by using smart phones. Lifelogs can be used in several ways. However, memory recollection poses some problems: because the lifelogs a person owns contain only partial information, it is impractical to recall entire days from them alone. Therefore, a lifelog sharing system for memory recollection is proposed. The system elicits related information from other people's lifelogs to complete the system user's lifelog. In this paper, the definition of the lifelog used and operations for it are shown. Then, the relationship between lifelogs and memory recollection is shown. Finally, system implementation is described.
Article
Augmented reality (AR) devices are poised to enter the market. It is unclear how the properties of these devices will affect individuals' privacy. In this study, we investigate the privacy perspectives of individuals when they are bystanders around AR devices. We conducted 12 field sessions in cafés and interviewed 31 bystanders regarding their reactions to a co-located AR device. Participants were predominantly split between having indifferent and negative reactions to the device. Participants who expressed that AR devices change the bystander experience attributed this difference to subtleness, ease of recording, and the technology's lack of prevalence. Additionally, participants surfaced a variety of factors that make recording more or less acceptable, including what they are doing when the recording is being taken. Participants expressed interest in being asked permission before being recorded and in recording-blocking devices. We use the interview results to guide an exploration of design directions for privacy-mediating technologies.
Conference Paper
Perceptual, "context-aware" applications that observe their environment and interact with users via cameras and other sensors are becoming ubiquitous on personal computers, mobile phones, gaming platforms, household robots, and augmented-reality devices. This raises new privacy risks. We describe the design and implementation of DARKLY, a practical privacy protection system for the increasingly common scenario where an untrusted, third-party perceptual application is running on a trusted device. DARKLY is integrated with OpenCV, a popular computer vision library used by such applications to access visual inputs. It deploys multiple privacy protection mechanisms, including access control, algorithmic privacy transforms, and user audit. We evaluate DARKLY on 20 perceptual applications that perform diverse tasks such as image recognition, object tracking, security surveillance, and face detection. These applications run on DARKLY unmodified or with very few modifications and minimal performance overheads vs. native OpenCV. In most cases, privacy enforcement does not reduce the applications' functionality or accuracy. For the rest, we quantify the tradeoff between privacy and utility and demonstrate that utility remains acceptable even with strong privacy protection.
Article
This article begins with several attempts to define privacy. After an analysis of several competing conceptions a definition is offered and defended. Privacy may be understood as a right to control access to and use of both physical items, like bodies and houses, and to information, like medical and financial facts. Physical privacy affords individuals access control rights over specific bodies, objects, and places. Informational privacy, on the other hand, allows individuals to control access to personal information no matter how it is codified. The article concludes with numerous test cases for the account being offered.
Conference Paper
Lifelogging technologies have the potential to provide memory cues for people who struggle with episodic memory impairment (EMI). These memory cues enable the recollection of significant experiences, which is important for people with EMI to regain a sense of normalcy in their lives. However, lifelogging technologies often collect an overwhelmingly large amount of data to review. The best memory cues need to be extracted and presented in a way that best supports episodic recollection. We describe the design of a new lifelogging system that captures photos, ambient audio, and location information and leverages both automated content/context analysis and the expertise of family caregivers to facilitate the extraction and annotation of a salient summary consisting of good cues from the lifelog. The system presents the selected cues for review in a way that maximizes the opportunities for the person with EMI to think deeply about these cues to trigger memory recollection on his own without burdening the caregiver. We compare our system with another review system that requires the caregiver to repeatedly guide the review process. Our self-guided system resulted in better memory retention and imposed a smaller burden on the caregiver, whereas the caregiver-guided approach provided more opportunities for caregiver interaction.
Conference Paper
In this article, we describe the design process of Reno, a location-enhanced, mobile coordination tool and person finder. The design process included three field experiments: a formative Experience Sampling Method (ESM) study, a pilot deployment and an extended user study. These studies were targeted at the significant personal security, privacy and data protection concerns caused by this application. We distil this experience into a small set of guidelines for designers of social mobile applications and show how these guidelines can be applied to a different application, called Boise. These guidelines cover issues pertaining to personal boundary definition, control, deception and denial, and group vs. individual communication. We also report on lessons learned from our evaluation experience, which might help practitioners in designing novel mobile applications, including the choice and characterization of users for testing security and privacy features of designs, the length of learning curves and their effect on evaluation and the impact of peculiar deployment circumstances on the results of these finely tuned user studies.
Conference Paper
As a number of home network services become available and home networks expand into ubiquitous computing environments, we need to protect home network systems from illegal access and a variety of threats. Because it is connected to the Internet, a home network is exposed to various cyber attacks, including hacking, malicious code, worms, viruses, DoS attacks, and eavesdropping. In this paper, we therefore propose a home network security framework for guaranteeing reliability and availability, encompassing authentication, authorization and a security policy system.
Article
The term "lifelog" refers to a comprehensive archive of an individual's quotidian existence, created with the help of pervasive computing technologies. Lifelog technologies would record and store everyday conversations, actions, and experiences of their users, enabling future replay and aiding remembrance. Products to assist lifelogging are already on the market; but the technology that will enable people fully and continuously to document their entire lives is still in the research and development phase. For generals, edgy artists and sentimental grandmothers alike, lifelogging could someday replace or complement existing memory preservation practices. Like a traditional diary, journal or day-book, the lifelog could preserve subjectively noteworthy facts and impressions. Like an old-fashioned photo album, scrapbook or home video, it could retain images of childhood, loved-ones and travels. Like a cardboard box time capsule or filing cabinet it could store correspondence and documents. Like personal computing software, it could record communications data, keystrokes and internet trails. The lifelog could easily store data pertaining to purely biological states derived from continuous self-monitoring of, for example, heart rate, respiration, blood sugar, blood pressure and arousal. To the extent that it preserves personal experience for voluntary private consumption, electronic lifelogging looks innocent enough, as innocent as Blackberries, home movies, and snapshots in silver picture frames. But lifelogging could fuel excessive self-absorption, since users would be engaged in making multimedia presentations about themselves all the time. The availability of lifelogging technology might lead individuals to overvalue the otherwise transient details of their lives. Furthermore, the potential would be great for incivility, emotional blackmail, exploitation, prosecution and social control by government surrounding lifelog creation, content and accessibility.
Existing privacy law and policy do not suggest meaningful limits on unwanted uses of lifelogging data. This parry of the costs and benefits commences a fuller discussion of lifelogging's ethical and legal implications.
Article
Since the 1970s, computer systems have featured multiple applications and served multiple users, leading to heightened awareness of data security issues. System administrators and software developers focused on different kinds of access control to ensure that only authorized users were given access to certain data or resources. One kind of access control that emerged is role-based access control (RBAC). A role is chiefly a semantic construct forming the basis of access control policy. With RBAC, system administrators create roles according to the job functions performed in a company or organization, grant permissions (access authorization) to those roles, and then assign users to the roles on the basis of their specific job responsibilities and qualifications. A role can represent specific task competency, such as that of a physician or a pharmacist. Or it can embody the authority and responsibility of, say, a project supervisor. Roles define both the specific individuals allowed to access resources and the extent to which resources are accessed. For example, an operator role might access all computer resources but not change access permissions; a security officer role might change permissions but have no access to resources; and an auditor role might access only audit trails. Roles are used for system administration in such network operating systems as Novell's NetWare and Microsoft's Windows NT. This article explains why RBAC is receiving renewed attention as a method of security administration and review, describes a framework of four reference models the authors have developed to better understand RBAC and categorize different implementations, and discusses the use of RBAC to manage itself. The authors' framework separates the administration of RBAC from its access control functions.
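The core user-to-role-to-permission indirection described above can be sketched in a few lines. This is a minimal illustration of the basic RBAC idea only (roughly the flat core of the reference models, without hierarchies or constraints); the class and method names are hypothetical:

```python
class RBAC:
    """Minimal RBAC core: users are assigned roles; roles are granted permissions."""
    def __init__(self):
        self.role_perms = {}   # role -> set of permissions
        self.user_roles = {}   # user -> set of roles

    def grant(self, role, *perms):
        """Grant permissions to a role (done once, by a system administrator)."""
        self.role_perms.setdefault(role, set()).update(perms)

    def assign(self, user, role):
        """Assign a user to a role based on job responsibilities."""
        self.user_roles.setdefault(user, set()).add(role)

    def check(self, user, perm):
        """A user holds a permission iff one of their roles was granted it."""
        return any(perm in self.role_perms.get(r, set())
                   for r in self.user_roles.get(user, ()))

# Roles mirroring the article's examples: operator, security officer, auditor.
ac = RBAC()
ac.grant("operator", "access_resources")
ac.grant("security_officer", "change_permissions")
ac.grant("auditor", "read_audit_trail")
ac.assign("alice", "auditor")
print(ac.check("alice", "read_audit_trail"))    # True
print(ac.check("alice", "change_permissions"))  # False
```

The administrative benefit the article highlights falls out directly: when Alice changes jobs, only the `assign` mapping changes; the role-to-permission grants stay untouched.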
Conference Paper
This paper tries to serve as an introductory reading to privacy issues in the field of ubiquitous computing. It develops six principles for guiding system design, based on a set of fair information practices common in most privacy legislation in use today: notice, choice and consent, proximity and locality, anonymity and pseudonymity, security, and access and recourse. A brief look at the history of privacy protection, its legal status, and its expected utility is provided as a background.
A privacy by design approach to lifelogging
  • Cathal Gurrin
  • Rami Albatal
  • Hideo Joho
  • Kaori Ishii
Cathal Gurrin, Rami Albatal, Hideo Joho, Kaori Ishii, A privacy by design approach to lifelogging, in: Digital Enlightenment Yearbook 2014: Social Networks and Social Machines, Surveillance and Empowerment, 2014, p. 49.
iForgot: A model of forgetting in robotic memories
  • C Gurrin
  • Hyowon Lee
  • J Hayes
C. Gurrin, Hyowon Lee, J. Hayes, iForgot: A model of forgetting in robotic memories, in: Human-Robot Interaction (HRI), 2010 5th ACM/IEEE International Conference on, March 2010, pp. 93-94.
Web Dictionary of Cybernetics and Systems
  • F Heylighen
F. Heylighen, Web Dictionary of Cybernetics and Systems. http://pespmc1.vub.ac.be/ASC/indexASC.html (Accessed on 15.07.15).
Common Terminological Framework for Interoperable Electronic Identity Management
  • Modinis Idm
MODINIS IDM. Common Terminological Framework for Interoperable Electronic Identity Management. https://www.cosic.esat.kuleuven.be/modinisidm/twiki/bin/view.cgi/Main/GlossaryDoc, 23 November, 2015. (Accessed on 15.07.15).
Privacy awareness: A means to solve the privacy paradox? in: The Future of Identity in the Information Society
  • Stefanie Pötzsch
Stefanie Pötzsch, Privacy awareness: A means to solve the privacy paradox? in: The Future of Identity in the Information Society, Springer, 2009, pp. 226-236.
  • Adam Moore
  • Defining Privacy
Adam Moore, Defining privacy, J. Soc. Philos. 39 (3) (2008) 411–428.
Threat modeling as a basis for security requirements
  • Suvda Myagmar
  • Adam J Lee
  • William Yurcik
Suvda Myagmar, Adam J. Lee, William Yurcik, Threat modeling as a basis for security requirements, in: Symposium on Requirements Engineering for Information Security, SREIS, vol. 2005, 2005, pp. 1-8.
Privacy by Design: The 7 Foundational Principles. https://www.ipc.on.ca/images/resources/7foundationalprinciples.pdf
  • Ann Cavoukian
Ann Cavoukian, Privacy by Design: The 7 Foundational Principles. https://www.ipc.on.ca/images/resources/7foundationalprinciples.pdf, August 2009. (Accessed on 15.01.16).
What does the revision of the OECD Privacy Guidelines mean for businesses?
  • Monika Kuschewsky
Monika Kuschewsky, What does the revision of the OECD Privacy Guidelines mean for businesses? https://www.cov.com/~/media/files/corporate/publications/2013/10/what_does_the_revision_of_the_oecd_privacy_guidelines_mean_for_businesses.pdf, 22 October, 2013. (Accessed on 09.01.16).
Operationalizing Privacy by Design: A Guide to Implementing Strong Privacy Practices
  • Ann Cavoukian
Ann Cavoukian, Operationalizing Privacy by Design: A Guide to Implementing Strong Privacy Practices. https://www.ipc.on.ca/images/Resources/operationalizing-pbd-guide.pdf, December, 2012. (Accessed on 15.01.16).