Hana Habib's research while affiliated with Carnegie Mellon University and other places
What is this page?
This page lists the scientific contributions of an author, who either does not have a ResearchGate profile, or has not yet added these contributions to their profile.
It was automatically created by ResearchGate to create a record of this author's body of work. We create such pages to advance our goal of creating and maintaining the most comprehensive scientific repository possible. In doing so, we process publicly available (personal) data relating to the author as a member of the scientific community.
If you're a ResearchGate member, you can follow this page to keep up with this author's work.
If you are this author, and you don't want us to display this page anymore, please let us know.
Publications (19)
A variety of methods and techniques are used in usable privacy and security (UPS) to study users’ experiences and behaviors. When applying empirical methods, researchers in UPS face specific challenges, for instance, to represent risk to research participants. This chapter provides an overview of the empirical research methods used in UPS and highl...
Assessing the usability of choice and consent mechanisms.
We conducted an online survey and remote usability study to explore user needs related to advertising controls on Facebook and determine how well existing controls align with these needs. Our survey results highlight a range of user objectives related to controlling Facebook ads, including being able to select what ad topics are shown or what perso...
Usable privacy and security researchers have developed a variety of approaches to represent risk to research participants. To understand how these approaches are used and when each might be most appropriate, we conducted a systematic literature review of methods used in security and privacy studies with human participants. From a sample of 633 pape...
Increasingly, icons are being proposed to concisely convey privacy-related information and choices to users. However, complex privacy concepts can be difficult to communicate. We investigate which icons effectively signal the presence of privacy choices. In a series of user studies, we designed and evaluated icons and accompanying textual descripti...
We conducted an in-lab user study with 24 participants to explore the usefulness and usability of privacy choices offered by websites. Participants were asked to find and use choices related to email marketing, targeted advertising, or data deletion on a set of nine websites that differed in terms of where and how these choices were presented. They...
Many websites offer visitors privacy controls and opt-out choices, either to comply with legal requirements or to address consumer privacy concerns. The way these control mechanisms are implemented can significantly affect individuals' choices and their privacy outcomes. We present an extensive content analysis of a stratified sample of 150 English...
Public sharing is integral to online platforms. This includes the popular multimedia messaging application Snapchat, on which public sharing is relatively new and unexplored in prior research. In mobile-first applications, sharing contexts are dynamic. However, it is unclear how context impacts users' sharing decisions. As platforms increasingly re...
Previous research has suggested that people use the private browsing mode of their web browsers to conduct privacy-sensitive activities online, but have misconceptions about how it works and are likely to overestimate the protections it provides. To better understand how private browsing is used and whether users are at risk, we analyzed browsing d...
Text passwords---a frequent vector for account compromise, yet still ubiquitous---have been studied for decades by researchers attempting to determine how to coerce users to create passwords that are hard for attackers to guess but still easy for users to type and memorize. Most studies examine one password or a small number of passwords per user,...
With the rapid deployment of Internet of Things (IoT) technologies and the variety of ways in which IoT-connected sensors collect and use personal data, there is a need for transparency, control, and new tools to ensure that individual privacy requirements are met. To develop these tools, it is important to better understand how people feel about t...
Despite their ubiquity, many password meters provide inaccurate strength estimates. Furthermore, they do not explain to users what is wrong with their password or how to improve it. We describe the development and evaluation of a data-driven password meter that provides accurate strength measurement and actionable, detailed feedback to users. This...
People share personal content online with varied audiences, as part of tasks ranging from conversational-style content sharing to collaborative activities. We use an interview- and diary-based study to explore: 1) what factors impact channel choice for sharing with particular audiences; and 2) what behavioral patterns emerge from the ability to com...
Citations
... In the case of new systems, such as the EHR in Germany, frequency of use cannot be surveyed. Second, the actual context of use can be difficult to simulate in questionnaire studies, making it difficult to distinguish between intention and behavior [30]. Since the models of technology acceptance described above (TAM, UTAUT) were all evaluated using questionnaires, they may not provide reliable insights into usage behavior in the context of the EHR. ...
... This limits our ability to control for participant actions such as removing ads from an advertiser, or removing a specific interest. Prior work estimates that 10-19% of users tweak their ad settings [39,41], either from the ad preferences page or from the contextual menu next to ads. We attempt to account for such variance by factoring participants' awareness of privacy settings in Table A1, and find that disparate exposure to Problematic ads for older and minority participants persists. ...
... These folk theories about security mechanisms could contribute to users' security misconceptions, e.g., concerning private browsing [1,25,30], the security of electronic communication [2], secure password creation methods [67], or the anonymity of blockchain transactions [38]. Even when these security misconceptions have no direct negative influence on users' security, they can have side effects such as a security theater [61], i.e., when users feel more secure while the technology is not. ...
... many researchers have explored RTBF. These include experience reports from Google Search [13] and Microsoft Bing [41]; understanding the challenges in exercising RTBF from a user perspective [29,51]; and empirical analysis of RTBF practices in websites [30,42]. Researchers from Facebook [22] and Boston University [12] have built data management systems that natively support guaranteed deletion in order to meet RTBF requirements. ...
... Seinen et al. highlight that while repurposing through consent aligns more closely with the principles of privacy and user autonomy, the practical implementation of compatible use is often more feasible in practice (Seinen, Walter, and van Grondelle, 2018, p. 10). The current approach of collecting user consent through ubiquitous privacy notices has been considered ineffective and burdensome (Habib et al., 2022). Hence, repurposing through compatibility has the potential to additionally ease the burdens placed on individual data subjects in their day-to-day interactions with digital services. ...
... For example, in 2022 Google agreed to a $392 million settlement in the US for misleading consumers with privacy setting interfaces that failed to clearly inform users about how to turn off location tracking [28]. Previous work has investigated users' perceptions and behaviors with respect to privacy settings in various contexts, such as social media [10,21,23,36,37], smart home devices [34,38,59], and mobile apps [12,45]. However, little published research has investigated privacy settings of web services' accounts such as Google, or privacy settings in non-Western societies. ...
... In recent years, many research groups and agencies have addressed the balance between security and usability, particularly after the increase in Internet use (including smart working and e-commerce) stemming from the COVID-19 pandemic and a related significant increase in cyberattacks [5][6][7][8][9][10]. Contributions to this area are variegated. ...
... In response, a variety of explorations have been conducted to make ToS accessible, readable, and comprehensible: privacy policy nutrition labels (Kelley et al., 2009); textured agreements employing visual design principles (Kay & Terry, 2010); icons for privacy notice and choices (Habib et al., 2021); visual interactive privacy policies (Reinhardt et al., 2021); comics to increase user attention and comprehension (Tabassum et al., 2018); highlighted information through crowdsourced sentiment-enhanced visualizations (Taber et al., 2020); simplified text stripped from heavy legalese (Robinson & Zhu, 2020); visual interface design illustrating data packages and data practices (Jones et al., 2017); and modular privacy policies (Das et al., 2018). ...
... To inform users about data collection and usage, information technology companies commonly present privacy policies within their software applications [1,32]. These policies often include privacy choices and settings that allow users to determine how their personal data is shared [15,17,33]. However, privacy policy interfaces tend to be complex and lack usability [2,7,17,25,26], increasing the risk of human error and associated threats to companies' information security [24]. ...
... Their OPP-115 corpus [20] contains annotated segments from 115 website privacy policies, enabling advanced machine learning research and automated analysis. Another dataset from the same project is the OptOutChoice-2020 corpus [21], which includes privacy policy sentences with labeled opt-out choices types. PolicyIE [22] offers a more recent dataset with annotated data practices, including intent classification and slot filling, based on 31 web and mobile app privacy policies. ...