Designing for Digital Detox: Making
Social Media Less Addictive with
Digital Nudges
Aditya Kumar Purohit
University of Neuchâtel
Neuchâtel, Switzerland
aditya.purohit@unine.ch
Louis Barclay
Nudge
Johannesburg, South Africa
louis@nudgeware.io
Adrian Holzer
University of Neuchâtel
Neuchâtel, Switzerland
adrian.holzer@unine.ch
Permission to make digital or hard copies of part or all of this work for personal or
classroom use is granted without fee provided that copies are not made or distributed
for profit or commercial advantage and that copies bear this notice and the full citation
on the first page. Copyrights for third-party components of this work must be honored.
For all other uses, contact the Owner/Author.
CHI ’20 Extended Abstracts, April 25–30, 2020, Honolulu, HI, USA.
© 2020 Copyright is held by the author/owner(s).
ACM ISBN 978-1-4503-6819-3/20/04.
https://dx.doi.org/10.1145/3334480.3382810
Abstract
Social media addiction concerns have increased steadily
over the past decade. Digital nudges have previously been
shown to hold enormous potential to change behavior.
However, it is not clear how they might be designed to
combat social media addiction. In this late-breaking work,
we aim to clarify this issue by investigating how digital
nudges can reduce the addictive features of social media
and other addictive sites. More precisely, we present the
design of NUDGE, a novel browser extension that aims to
make social media less addictive by delivering digital
nudges grounded in behavioral science. We conducted a
preliminary evaluation of NUDGE with 67 actual users and
14 university students. Our results show that NUDGE (1)
helped users reflect on their social media usage, (2) possibly
decreased their time spent, and (3) made the experience
more pleasant.
Author Keywords
Digital nudge; Social media addiction; Digital wellbeing;
Digital detox; Digital addiction
CCS Concepts
• Human-centered computing → User studies; Social media
Introduction
Our dependence on technology is on the rise, and as a
consequence, an estimated 210 million people worldwide are
suffering from social media addiction [20]. This condition
makes individuals overly concerned about, and increasingly
dependent on, online social networks (OSN) [3]. Social media
were designed as web-based technologies that facilitate
messaging and the sharing of content between people [6].
At the same time, many social media companies, including
Facebook, Twitter, and YouTube, rely on the continued
attention of users for their revenue generation [19], in what
is sometimes called the attention economy [9]. As a result,
these technologies are now designed to be intrinsically
persuasive [1] in order to attract people’s attention [8]. This has
consequences such as impaired interpersonal relationships [16]
and reduced psychological health and well-being [30].
Figure 1: Hook model adapted from Eyal [11]
In this paper, we argue that the same design principles that
create addiction could be leveraged to mitigate it. More
specifically, digital nudges, i.e., design features that steer
people’s decisions without restricting their freedom of choice,
can be used in welfare-promoting directions [25]. We present
specific digital nudges that designers can use to mitigate the
addictive features of social media. Further, we present the
implementation of these nudges in a novel Chrome extension
called NUDGE. Finally, we present a preliminary evaluation
of NUDGE with 67 actual users and 14 students who assessed
its usability and effectiveness.
Methodology
This research project follows the steps of the design science
research methodology (DSRM) [24]. The paper first identifies
the problem (Introduction). Second, it presents the objectives
of a solution by describing the addictive design model used in
social media (Hooked on Social Media). Third, it presents the
design of a solution and a real-world demonstrator (i.e.,
NUDGE) before providing a preliminary evaluation. Finally, it
concludes and presents future work.
Hooked on Social Media
The design of social media platforms is intentionally
engineered to be addictive and to exploit vulnerabilities in
human psychology [1]. This intentional design results in
undesirable platform usage that can lead to addictive usage
patterns [22]. One model that many companies adopt for the
development of habit-forming social media products is the
Hook model [11], which draws on behaviorist principles to
build habits [26]. It describes a four-phase iterative process
for software design that encourages habit formation. The
process starts with (1) a trigger that leads to (2) an action,
which produces (3) a reward and ultimately creates (4) an
investment. Successfully applying these four phases produces
an addictive feedback loop: the more users receive rewards
and invest, the more they are incentivized to respond to
triggers and perform actions, which in turn produce more
rewards and investments, and so on (see Figure 1).
Trigger
A trigger operates as the foundation on which habits are
formed. Internal triggers hinge on thoughts, emotions, and
pre-existing routines, such as boredom and loneliness,
prompting the user to take mindless actions. External triggers
are cues in the user’s environment, for instance, a notification
indicating the arrival of a new message. In both cases, the
trigger prompts the individual to take action by supplying a cue.
Action
The action phase is based on the Fogg behavior model [12]:
when sufficient ability and motivation coincide with a trigger,
an action occurs. Actions on social media are typically
browsing, posting, commenting, or reacting. Social media
platforms increase the ability of users to take actions by
making actions more accessible. For instance, YouTube
presents personalized recommendations on the right side of
its website, reducing the burden of searching for similar
videos. The motivation to seek pleasure and social acceptance,
and to avoid pain and social rejection, increases the chance
that the user will take action.
Figure 2: Potential digital nudges to combat the design context of the Hook model
Reward
Giving users rewards after they have taken action brings
them back to social media platforms over and over again.
Rewards can be of three types [11]: the ‘tribe’, the ‘hunt’, and
the ‘self’. Rewards of the tribe are social rewards such as
comments and likes. Rewards of the hunt are the consumption
of new content, for instance, on Facebook’s endless News
Feed. Rewards of the self are intrinsic rewards of mastery,
competence, and completion; for instance, on LinkedIn,
completing one’s profile increases visibility to recruiters.
Investment
Investment increases the likelihood of users returning to
social media platforms, as they have invested value in the
form of content, data, followers, and reputation [11]. This
investment in the platform increases the likelihood that users
respond to the next trigger and start the cycle once again.
A solution to the social media addiction problem should
therefore address all four phases in order to break users free
from this vicious cycle.
Designing for Digital Detox
Hereafter, we illustrate, with scenarios, how designers could
leverage digital nudges to counteract the various phases
of the Hook model. Figure 2 provides an overview of the
different digital detox nudges.
Dismissing Triggers
Imagine a user called Thomas. Like many other mobile
phone users, he receives around 100 social media notifications
per day (external triggers) indicating that new messages have
arrived. Also, every so often, Thomas visits his favorite social
media platform almost unconsciously, out of boredom
(internal triggers). The following two nudges could dismiss
these triggers and thus prevent users from accessing social
media in the first place:
• Hiding Nudge: The hiding nudge operates by obscuring
a trigger so that it becomes harder to respond to [7].
After Thomas enters the social media platform,
notifications within the platform can be hidden so that
they fail to trigger an action.
• Default Nudge: Default options are powerful, as it
takes effort to deviate from the default environment
set by an intervention [28, 14, 10]. Typical examples
are opt-out rather than opt-in approaches. In our
context, social media platforms could be switched off
by default: instead of being taken directly to Facebook,
Thomas would land on another page asking whether
he is sure he wants to visit Facebook, in which case he
has to switch Facebook on mindfully.
Limiting User Action
After Thomas enters the social media platform, algorithms
supply personalized content. This tailored content increases
the opportunities for action on the platform; thus, Thomas
ends up looking at linked content that did not interest him in
the first place but was conveniently accessible. The following
nudge could limit his actions on social media:
• Creating Friction: In the course of interaction with
technology, frictions hinder people from painlessly
achieving their goals [21]. Usually, systems are
designed to lower friction. In our context, however,
friction creates an opportunity for the user to interact
mindfully. Friction could be introduced by asking
whether Thomas wants to see video or post
recommendations in the sidebar of the social media
site, thus making it harder for him to click on other
content mindlessly.
Figure 3: (1) Notifications on Facebook (2) Hiding nudge hides notifications on Facebook
Figure 4: (1) Sidebar on YouTube with personalized content (2) Sidebar on YouTube with friction nudge to confirm actions
Figure 5: Default nudge in NUDGE switching off YouTube.com
Reducing Reward
Once Thomas is online, the infinite scrolling offered by
websites, as well as the likes on his posts, produce rewards
that make him stay longer on the site. In addition to the
hiding nudge, which can also be used to reduce the impact of
likes and comments from others, the following two nudges
can reduce the rewards of staying on the site:
• Pause-reminders: Pause-reminders break users free
from addictive continuous scrolling [2] by creating an
interface that reduces the rewards of the hunt. In this
case, the pause-reminder makes Thomas aware of his
mindless scrolling.
• Feedback nudge: Feedback about time spent on an
app, combined with negative reinforcement, could
reduce users’ rewards from staying on social media
[23]. In this scenario, Thomas could be presented with
a timer that shows how much time he has already
spent on social media.
Reducing Investment
Thomas has invested in his favorite social media platform by
following various pages and friends, which keeps bringing
him back to the platform. The following nudge could reduce
that investment:
• Unfollow nudge: According to the Fogg behavior
model [12], an action occurs when increased ability,
sufficient motivation, and a trigger come together.
Currently, unfollowing friends, groups, and pages on
social media platforms is a strenuous task. Instead,
Thomas could be presented with a prompt that
automates unfollowing friends, groups, and pages.
Demonstration
In this section, we present a demonstration of the nudges
described above in a tool for digital detox called NUDGE.
Technically, NUDGE is a Chrome extension written in
JavaScript that provides an overlay on certain predefined
websites. It uses custom JavaScript code and CSS with
injected HTML to create this overlay. NUDGE currently has
1,200 weekly active users.
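As an illustration of this overlay mechanism, the following minimal sketch shows how a content script, declared in the extension manifest for predefined sites, might inject CSS and HTML into a page. It is not the actual NUDGE source; the host list, class name, and styling are assumptions.

// content.js -- minimal sketch of an overlay-injecting content script
// (illustrative only; not the actual NUDGE implementation).
const TARGET_HOSTS = ["www.facebook.com", "www.youtube.com"]; // assumed site list

if (TARGET_HOSTS.includes(window.location.hostname)) {
  // Inject CSS that styles the overlay.
  const style = document.createElement("style");
  style.textContent = `
    .detox-overlay { position: fixed; top: 0; right: 0; z-index: 99999;
                     padding: 8px 12px; background: rgba(0, 0, 0, 0.8);
                     color: #fff; font-family: sans-serif; }`;
  document.head.appendChild(style);

  // Inject the HTML overlay itself.
  const overlay = document.createElement("div");
  overlay.className = "detox-overlay";
  overlay.textContent = "NUDGE is active on this site";
  document.documentElement.appendChild(overlay);
}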
The hiding nudge in NUDGE hides notifications to counter
triggers (see Figure 3). NUDGE creates friction by concealing
certain sections of websites that keep users sucked in longer
than they want, for instance, sidebars displaying personalized
recommendations on YouTube, and sponsored ads and links
on Facebook. If users want to see these sections, they have to
hover over the NUDGE logo and choose either ‘Show once’ or
‘Show always’ (see Figure 4).
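A simplified sketch of this hiding/friction pattern is given below: a distracting section is concealed by default and only revealed after an explicit user choice. The selectors and the ‘Show once’ control are hypothetical placeholders, since the real markup of these sites changes frequently; a persistent ‘Show always’ choice would additionally require storing the preference (omitted here).

// Hide a distracting section and require an explicit click to reveal it.
function addFriction(sectionSelector) {
  const section = document.querySelector(sectionSelector);
  if (!section) return;

  section.style.display = "none"; // concealed by default

  const control = document.createElement("button");
  control.textContent = "Show once";
  control.addEventListener("click", () => {
    section.style.display = ""; // reveal for this page view only
    control.remove();
  });
  section.parentElement.insertBefore(control, section);
}

// Hypothetical selectors for a recommendation sidebar and a notification badge.
addFriction("#related");
addFriction('[aria-label="Notifications"]');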
The default nudge intervention in NUDGE switches off social
media websites: when users enter the URL of Facebook, for
instance, they are not taken directly to Facebook but to
another page that requires them to drag a slider across the
screen in order to proceed. The slider gets stickier with every
visit. In other words, the more often users visit a particular
website, the harder the slider becomes to drag, thus also
adding a friction component to the default nudge (see Figure 5).
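A minimal sketch of this behavior is shown below, assuming a visit counter kept in chrome.storage (which requires the extension’s ‘storage’ permission) and a simulated pull-back on the slider; the thresholds, styling, and storage key are illustrative assumptions rather than the extension’s actual values.

// Default nudge sketch: the site stays "off" behind a full-page gate until a
// slider is dragged to the end; more recorded visits mean more pull-back.
chrome.storage.local.get({ visits: 0 }, ({ visits }) => {
  chrome.storage.local.set({ visits: visits + 1 });

  const gate = document.createElement("div");
  gate.style.cssText =
    "position:fixed;inset:0;z-index:99999;background:#fff;" +
    "display:flex;align-items:center;justify-content:center;";
  gate.innerHTML =
    '<label>Slide to switch this site on: ' +
    '<input id="detox-slider" type="range" min="0" max="100" value="0"></label>';
  document.documentElement.appendChild(gate);

  const slider = gate.querySelector("#detox-slider");
  const resistance = Math.min(visits, 50); // stickier with every visit
  slider.addEventListener("input", () => {
    // Pull the slider back a little on every movement.
    slider.value = String(Math.max(0, Number(slider.value) - resistance / 10));
    if (Number(slider.value) >= 95) gate.remove(); // switch the site on
  });
});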
The pause-reminder nudge disrupts mindless infinite scrolling
to keep users away from the rewards of the hunt on social
media (see Figure 6). NUDGE reduces the reward further with
a feedback nudge that overlays the user interface. The
feedback nudge grows on the screen for every five minutes a
user spends on a social media site. It does not stop users from
taking action but keeps them aware of the time they are
spending (see Figure 7).
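The growing indicator could be approximated as in the sketch below; the badge shape, growth step, and timing code are assumptions for illustration, not the actual NUDGE visuals.

// Feedback nudge sketch: a badge that grows every five minutes spent on the
// site; it never blocks interaction, it only makes time spent visible.
const badge = document.createElement("div");
badge.style.cssText =
  "position:fixed;bottom:16px;right:16px;z-index:99999;border-radius:50%;" +
  "width:24px;height:24px;background:#e74c3c;transition:width .5s,height .5s;";
badge.title = "Time spent on this site";
document.documentElement.appendChild(badge);

let blocksOfFiveMinutes = 0;
setInterval(() => {
  blocksOfFiveMinutes += 1;
  const size = 24 + 16 * blocksOfFiveMinutes; // grow by 16 px per five minutes
  badge.style.width = size + "px";
  badge.style.height = size + "px";
}, 5 * 60 * 1000);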
Figure 6: Pause-reminder nudge
Figure 7: (1) Feedback nudge in NUDGE (2) Feedback nudge growing on screen every five minutes a user spends on Facebook
Figure 8: Unfollow nudge
Finally, to undermine investment, an unfollow nudge is
available. It allows users to unfollow 100% of their friends,
pages, and groups, and to permanently get rid of their
cluttered News Feed. Once a user has enabled the
‘auto-unfollow’ feature, unfollowing happens automatically
(see Figure 8).
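Conceptually, auto-unfollow is a loop over the follow-state controls on the user’s lists, as in the sketch below. The button label and selector are purely hypothetical placeholders, since the platforms’ markup is undocumented and changes often.

// Auto-unfollow sketch: find follow-state buttons and toggle them one by one,
// pausing between clicks. Label and selector are hypothetical.
async function autoUnfollow() {
  const followButtons = Array.from(document.querySelectorAll("button"))
    .filter((button) => button.textContent.trim() === "Following");
  for (const button of followButtons) {
    button.click(); // toggle the follow state
    // Wait a moment between clicks to avoid hammering the page.
    await new Promise((resolve) => setTimeout(resolve, 1000));
  }
}
autoUnfollow();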
Evaluation
We conducted a preliminary evaluation of NUDGE with
university students and with actual users. A survey was
pushed to actual users on the Facebook platform to collect
their feedback. Within two weeks, we received 67 responses,
a 24.8% response rate among the 270 NUDGE users who are
active on Facebook (out of the 1,200 NUDGE active users
overall). For the students, we introduced NUDGE to first-year
business students (about 51 students) at the University of
Neuchâtel, Switzerland; 14 students agreed to take part in the
evaluation.

The students were instructed to install NUDGE in the Chrome
browser and were given two days to test it. The students were
then asked to fill out the SUS [5] and AttrakDiff 2 [17]
questionnaires to assess usability. SUS is valuable because it
generates reliable results even with small sample sizes [29].
The goal of the survey with students and actual users was to
make a preliminary assessment of (1) usability, (2)
effectiveness, and (3) design.
Is NUDGE usable and effective?
On the SUS scale for general usability, NUDGE achieved a
mean score of 79.8, which indicates usability between Good
and Excellent [5]. We used the short version of the
AttrakDiff 2 survey to evaluate the hedonic and pragmatic
qualities of NUDGE. The results are presented in Figure 9
and show mostly positive attractiveness (ATT), hedonic
quality (HQ), and pragmatic quality (PQ) measures. The
extension is viewed as neither cheap nor premium.

To assess whether users believed NUDGE reduced their
Facebook usage while at the same time making Facebook
more enjoyable, we asked actual users to rate two items,
(1) “NUDGE reduces time spent on Facebook” and (2)
“NUDGE makes Facebook more enjoyable”, on a Likert scale
from ‘Strongly disagree’ to ‘Strongly agree’. 67 actual users
responded to the items. A one-sample Wilcoxon signed-rank
test indicated that the median for the first item was
significantly different from 3, Z = 6.35, p < .001, with a
strong effect size (r = .78). As can be seen in Figure 10,
59.7% of the users think NUDGE reduces their Facebook
usage. The median for the second item was also significantly
different from 3, Z = 3.48, p < .01, with a moderate effect
size (r = .43). As Figure 10 shows, 22.4% of users agree and
35.8% strongly agree that NUDGE makes Facebook more
enjoyable.
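For reference, the reported effect sizes are consistent with the common convention of dividing the standardized test statistic by the square root of the sample size (an assumption about the calculation, which is not stated explicitly in the paper):

r = \frac{Z}{\sqrt{N}}, \qquad \frac{6.35}{\sqrt{67}} \approx .78, \qquad \frac{3.48}{\sqrt{67}} \approx .43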
We were also interested in collecting diverse perspectives on
the efficacy of and experiences with NUDGE. The actual
users were presented with two open questions: (1) What is
the best thing about NUDGE? and (2) What is the worst thing
about NUDGE? 65 actual users provided open-ended
comments. The analysis followed grounded theory [27]: by
coding the data line by line, we articulated emergent themes.
Here we discuss the major themes.
Automaticity
One theme that emerged was the various ways in which users
relished NUDGE. ‘Automatic unclutter’, ‘automates’, and
‘auto-unfollow’ were recurring codes from the users. As an
example, the following comment was coded as ‘automates’:
“Simple one-click automatization that really completes the
task of deleting my facebook feed.” Another comment was
coded as ‘auto-unfollow’: “I didn’t have to select one by one
all the friends or pages I didn’t want to follow anymore.”
While users reported how much they valued the automatic
unfollow feature, for some users it evoked a sense of loss. For
example: “Deleting my feed was a drastic choice, but it’s cut
my time there like tenfold.”
Figure 9: AttrakDiff 2 results highlighting median values. N = 14
Figure 10: Survey results. N = 67
Mindfulness
Another interesting theme was whether the digital nudge
interventions of NUDGE made users mindful of their actions.
One user saw the interventions as an incentive to remain
mindful: “It acts as an incentive to be mindful of my web
use.” A code that kept recurring was ‘rationalizing’; one user
reported: “The notification that shows how many pages
you’ve scrolled. It is really excellent and works every time to
make me stop mindlessly scrolling.” Another user commented:
“having to slide the bar across. It creates an opportunity for
mindfulness that I didn’t have before. It’s a pain when I have
to be on social media a lot in one day for work, but the
tradeoff is worth it because every time I have to decide: Do I
need to be here?”
Privacy Threat
One common area of concern for users was privacy. In
particular, some users felt as if they were losing control of
their data. For instance, one user reported: “I felt as if I took
a risk in relation to my FB-data, by giving over some control
to Nudge.” In some cases, users were skeptical: “I did feel
skeptical about letting a relative alien add-on interfere with
my facebook.” Another user stated: “Not sure about the
privacy thing.”
Customization
Several users asked for greater possibilities for customization.
Codes like ‘customizing’, ‘personalizing’, and ‘customizability’
were recurrent. For instance, one user reported: “I wish there
could be some kind of middle ground. Like a feed with only
my best friends and favorite pages? And no sponsored
content?” Another user commented on customizing the
rainbow timer: “Maybe there should be an option for how
much time it takes to add each ring to make it faster? Or an
option to slightly adjust the color of the rings?”
Conclusion and Future Work
In this paper, we introduced and evaluated NUDGE, a Chrome
extension designed to counter the cycle of addictive design
employed by social media platforms. These preliminary
results show how system designers can design and channel
digital nudges to combat social media addiction. It is worth
noting that NUDGE takes a counter-intuitive design approach
by deliberately decreasing the usability of social media
platforms. Users appreciated NUDGE and believed that it
reduced their Facebook usage while, at the same time, making
Facebook more enjoyable. Future research could build on
previous work on unorthodox usability (e.g., [13, 4, 15, 18])
and investigate how reducing social media usability could
improve the user experience. The design evaluation of NUDGE
and its shortcomings, presented in this paper, can help
designers build effective, likable, and practical digital detox
apps. In the future, we aim to gain a more in-depth
understanding of each of the described nudges individually by
conducting RCTs that combine usage log data with survey
results. Furthermore, we will investigate in more depth users’
privacy concerns, the sense of loss that follows reduced
investment, and ways to let users customize the digital nudges.
REFERENCES
[1] A. Alutaybi, E. Arden-Close, J. McAlaney, A.
Stefanidis, K. Phalp, and R. Ali. 2019. How Can Social
Networks Design Trigger Fear of Missing Out?. In
2019 IEEE International Conference on Systems, Man
and Cybernetics (SMC). 3758–3765. DOI:
http://dx.doi.org/10.1109/SMC.2019.8914672
[2] Emelie Andersson. 2019. Pause-reminders in mobile
applications. In Conference in Interaction Technology
and Design. 67.
[3] Cecilie Schou Andreassen. 2015. Online social
network site addiction: A comprehensive review.
Current Addiction Reports 2, 2 (2015), 175–184.
[4] Paul M. Aoki and Allison Woodruff. 2005. Making
Space for Stories: Ambiguity in the Design of Personal
Communication Systems. In CHI’05. 181–190.
[5] Aaron Bangor, Philip T. Kortum, and James T. Miller.
2008. An Empirical Evaluation of the System Usability
Scale. International Journal of Human–Computer
Interaction 24, 6 (2008), 574–594. DOI:
http://dx.doi.org/10.1080/10447310802205776
[6] Jaclyn Cabral. 2008. Is generation Y addicted to social
media. Future of children 18 (2008), 125.
[7] Ana Caraban, Evangelos Karapanos, Daniel
Gonçalves, and Pedro Campos. 2019. 23 Ways to
Nudge: A Review of Technology-Mediated Nudging in
Human-Computer Interaction. In Proceedings of the
2019 CHI Conference on Human Factors in Computing
Systems. ACM, 503.
[8] Marta E. Cecchinato, John Rooksby, Alexis Hiniker,
Sean Munson, Kai Lukoff, Luigina Ciolfi, Anja Thieme,
and Daniel Harrison. 2019. Designing for Digital
Wellbeing: A Research & Practice Agenda. In
Extended Abstracts of the 2019 CHI Conference on
Human Factors in Computing Systems (CHI EA ’19).
ACM, New York, NY, USA, Article W17, 8 pages. DOI:
http://dx.doi.org/10.1145/3290607.3298998
[9] Thomas H Davenport and John C Beck. 2001. The
attention economy: Understanding the new currency of
business. Harvard Business Press.
[10] Isaac Dinner, Eric J. Johnson, Daniel G. Goldstein,
and Kaiya Liu. 2011. Partitioning default effects: Why
people choose not to choose. Journal of Experimental
Psychology: Applied 17, 4 (dec 2011), 332–341. DOI:
http://dx.doi.org/10.1037/a0024354
[11] Nir Eyal. 2014. Hooked: How to build habit-forming
products. Penguin UK.
[12] Brian J Fogg. 2009. A behavior model for persuasive
design. In Proceedings of the 4th international
Conference on Persuasive Technology. ACM, 40.
[13] William W Gaver, Jacob Beaver, and Steve Benford.
2003. Ambiguity as a resource for design. In ACM
CHI’03. 233–240.
[14] Claus Ghesla, Manuel Grieder, and Jan Schmitz.
2019. Nudge for Good? Choice Defaults and Spillover
Effects. Frontiers in Psychology 10 (2019), 178. DOI:
http://dx.doi.org/10.3389/fpsyg.2019.00178
[15] Dimitris Grammenos. 2014. Abba-dabba-ooga-booga-
hoojee-goojee-yabba-dabba-doo: stupidity, ignorance
& nonsense as tools for nurturing creative thinking. In
CHI’14 EA.
[16] Eduardo Guedes, Antonio Egidio Nardi, Flávia Melo
Campos Leite Guimarães, Sergio Machado, and Anna
Lucia Spear King. 2016. Social networking, a new
online addiction: a review of Facebook and other
addiction disorders. MedicalExpress 3 (02 2016).
[17] Marc Hassenzahl, Michael Burmester, and Franz
Koller. 2003. AttrakDiff: Ein Fragebogen zur Messung
wahrgenommener hedonischer und pragmatischer
Qualität. Vieweg+Teubner Verlag, Wiesbaden,
187–196. DOI:
http://dx.doi.org/10.1007/978-3-322-80058-9_19
[18] Adrian Holzer, Andrii Vozniuk, Sten Govaerts, Harry
Bloch, Angelo Benedetto, and Denis Gillet. 2015.
Uncomfortable Yet Fun Messaging with Chachachat.
In Proceedings of the 2015 Annual Symposium on
Computer-Human Interaction in Play (CHI PLAY ’15).
ACM, New York, NY, USA, 547–552. DOI:
http://dx.doi.org/10.1145/2793107.2810296
[19] Logan Kugler. 2018. Getting Hooked on Tech.
Commun. ACM 61, 6 (May 2018), 18–19. DOI:
http://dx.doi.org/10.1145/3204447
[20] Phil Longstreet and Stoney Brooks. 2017. Life
satisfaction: A key to managing internet & social media
addiction. Technology in Society 50 (2017), 73–77.
[21] Thomas Mejtoft, Sarah Hale, and Ulrik Söderström.
2019. Design Friction. In Proceedings of the 31st
European Conference on Cognitive Ergonomics
(ECCE 2019). Association for Computing Machinery,
New York, NY, USA, 41–44. DOI:
http://dx.doi.org/10.1145/3335082.3335106
[22] Christian Montag, Zhiying Zhao, Cornelia Sindermann,
Lei Xu, Meina Fu, Jialin Li, Xiaoxiao Zheng, Keshuang
Li, Keith M Kendrick, Jing Dai, and Benjamin Becker.
2018. Internet Communication Disorder and the
structure of the human brain: initial insights on
WeChat addiction. Scientific Reports 8, 1 (2018),
2155. DOI:
http://dx.doi.org/10.1038/s41598-018-19904-y
[23] Fabian Okeke, Michael Sobolev, Nicola Dell, and
Deborah Estrin. 2018. Good Vibrations: Can a Digital
Nudge Reduce Digital Overload?. In Proceedings of
the 20th International Conference on
Human-Computer Interaction with Mobile Devices and
Services (MobileHCI ’18). ACM, New York, NY, USA,
Article 4, 12 pages. DOI:
http://dx.doi.org/10.1145/3229434.3229463
[24] Ken Peffers, Tuure Tuunanen, Marcus A Rothenberger,
and Samir Chatterjee. 2007. A design science
research methodology for information systems
research. Journal of management information systems
24, 3 (2007), 45–77.
[25] Aditya Kumar Purohit and Adrian Holzer. 2019.
Functional Digital Nudges: Identifying Optimal Timing
for Effective Behavior Change. In Extended Abstracts
of the 2019 CHI Conference on Human Factors in
Computing Systems (CHI EA ’19). ACM, New York,
NY, USA, Article LBW0147, 6 pages. DOI:
http://dx.doi.org/10.1145/3290607.3312876
[26] Nick Seaver. 2018. Captivating algorithms:
Recommender systems as traps. Journal of Material
Culture (2018), 1359183518820366.
[27] Anselm Strauss and Juliet Corbin. 1994. Grounded
theory methodology. Handbook of qualitative research
17 (1994), 273–85.
[28] Richard H Thaler and Cass R Sunstein. 2009. Nudge:
Improving decisions about health, wealth, and
happiness. Penguin.
[29] Thomas S Tullis and Jacqueline N Stetson. 2004. A
comparison of questionnaires for assessing website
usability. In Usability professional association
conference, Vol. 1. Minneapolis, USA.
[30] Patti M Valkenburg, Jochen Peter, and Alexander P
Schouten. 2006. Friend networking sites and their
relationship to adolescents’ well-being and social
self-esteem. CyberPsychology & Behavior 9, 5 (2006),
584–590.