Social media? It's serious! Understanding the dark side of social media
Christian V. Baccarella
, Timm F. Wagner
, Jan H. Kietzmann
, Ian P. McCarthy
Friedrich-Alexander-Universität Erlangen-Nürnberg, Lange Gasse 20, 90403 Nürnberg, Germany
adidas AG, Adi-Dassler-Strasse 1, 91074, Herzogenaurach, Germany
Gustavson School of Business, University of Victoria, 3800 Finnerty Rd, Victoria, BC, V8P 5C2, Canada
Beedie School of Business, Simon Fraser University, 500 Granville Street, Vancouver, BC, V6C 1W6, Canada
article info
Keywords: Social media, Dark side, Unintended consequences, Fake news
abstract

Research and practice have mostly focused on the "bright side" of social media, aiming to understand and help in leveraging the manifold opportunities afforded by this technology. However, it is increasingly observable that social media present enormous risks for individuals, communities, firms, and even for society as a whole. Examples of this "dark side" of social media include cyberbullying, addictive use, trolling, online witch hunts, fake news, and privacy abuse. In this article, we aim to illustrate the multidimensionality of the dark side of social media and describe the various related undesirable outcomes. To do this, we adapt the established social media honeycomb framework to explain the dark side implications of each of the seven functional building blocks: conversations, sharing, presence, relationships, reputation, groups, and identity. On the basis of these reflections, we present a number of avenues for future research, so as to facilitate a better understanding and use of social media.

© 2018 Published by Elsevier Ltd.
1. Introduction
Over the past decade, social media have been transforming how
individuals, communities, and organizations create, share, and
consume information from each other and from firms. What appeals to almost 90% of younger EU citizens (Eurostat, 2017a) is
how social media differ from traditional media (e.g., newspaper and
television) in terms of their reach, interactivity, usability, and
ubiquity. In 2017, users spent more than 2 hours on average per day
on social networks and messaging services (half an hour each day longer than five years earlier), which amounted to about one third
of their entire daily computer time (Mander, 2017).
Many studies have touted the advantages that social media would bring to individuals and firms (e.g., Kumar, Bezawada, Rishika, Janakiraman, & Kannan, 2016; Sabate, Berbegal-Mirabent, Cañabate, & Lebherz, 2014; Wagner, 2017). They highlight the "bright side of social media" and how engagement between firms and consumers is being democratized (Kietzmann, Hermkens, McCarthy, & Silvestre, 2011). For firms, this means social media would improve marketing, public relations, customer service, product development, personnel decision-making, and other business activities that rely on information exchanges and engagement with consumers and employees. Many of these advantages have materialized, thus leading almost 50% of all EU firms to use at least one form of social media in 2017 (Eurostat, 2017b). These firms use social media to not only broadcast company content but also track sentiment worldwide by analyzing user-generated content (Paniagua, Korzynski, & Mas-Tur, 2017), consumer-generated intellectual property (Berthon, Pitt, Kietzmann, & McCarthy, 2015), and interactions on social networking sites (Wagner, Baccarella, & Voigt, 2017), to adjust their business and marketing strategies appropriately.
Regardless of the numerous opportunities social media offer, an increasing number of incidents demonstrate that there is undoubtedly a "dark side" to social media. Chamath Palihapitiya, a former Facebook executive, recently stated that he regrets that some of the tools he has helped to create are "ripping apart the social fabric of how society works" (Wong, 2017). This quote vividly illustrates how the qualities that underlie the enormous presence of social media platforms are now also undermining the freedoms and the well-being of the individuals and communities they serve. For example, there have been an increasing number of reports and research attention into concerns such as cyberbullying (O'Keeffe & Clarke-Pearson, 2011), trolling (Buckels, Trapnell, & Paulhus, 2014; Hardaker, 2010), privacy invasions (Pai & Arnott, 2013), fake news (Allcott & Gentzkow, 2017; European Commission, 2018), online
European Management Journal 36 (2018) 431–438
firestorms (Pfeffer, Zorbach, & Carley, 2014), and addictive use (Blackwell, Leaman, Tramposch, Osborne, & Liss, 2017). Furthermore, a 2017 survey found that Britons aged 14–24 believe that social media, such as Facebook, Instagram, Snapchat, and Twitter, exacerbated self-consciousness and fear of missing out (Przybylski, Murayama, Dehaan, & Gladwell, 2013), which can result in increased levels of anxiety, sleep loss, and depression (e.g., Levenson, Shensa, Sidani, Colditz, & Primack, 2016). In the workplace, a recent study found that the benefits of social media also come with negative consequences through work-life conflicts and interruptions that increase exhaustion (van Zoonen, Verhoeven, & Vliegenthart, 2017).
Even with social media executives admitting that their plat-
forms have deleterious impacts, users tend not to question the
short- and long-term implications and potential risks of their
choices. Many company employees and customers now belong to a
generation of digital natives who have grown up with social media,
rather than first learning to use these technologies as adults (Bennett, Maton, & Kervin, 2008). Most adult users, too, have
become so accustomed to social media that the types of conver-
sations, self-expression, community building, and other forms of
online engagement are now parts of the only reality they know. It is
therefore of utmost importance to take a step back to reect on how
we have arrived at the present and what our most recent social
media "advances" might mean for us in the future.
In this article, we draw attention to the duality of social media:
for the many bright sides of social media, there are also dark sides
that are worthy of being investigated so that we become more
conscious of their potential risks and make better-informed de-
cisions. We begin by clarifying what we mean by the dark side of
social media, and why it is a concern for society. We then introduce
a framework for understanding the dark side of the core func-
tionalities of different social media platforms. By using the ideas
and issues that this framework highlights, we then outline a
number of important research opportunities that could help in
facilitating a healthier use of the media by better understanding their related negative impact on the "social" fabric of society.
2. The darkness of social media
With the expression "dark side," we highlight that social media, like many phenomena, including fast food (Schlosser, 2002), entrepreneurship (Beaver & Jennings, 2005), capital markets (Scharfstein & Stein, 2000), crowdsourcing (Kietzmann, 2017; Wilson, Robson, & Botha, 2017), and the sharing economy (Malhotra & van Alstyne, 2014), can have negative or detrimental consequences for society that are worthy of research attention. However, it is important to recognize that social media are not good or bad, helpful or unhelpful, black or white, and bright or dark. The consequences of many technological innovations, intentional and unintentional, are usually not dichotomous, but simultaneously have both bright and dark sides. When Alfred Nobel invented dynamite in 1866, he called it "Nobel's Blasting Powder." It significantly improved mining, quarrying, and construction, but of course, it also "improved" warfare when armies realized the weaponized potential of dynamite explosions. In a similar duality, we use social media to connect to our far-away friends, and at the same time, we disconnect from those who sit across the table from us. Importantly, these new types of engagement have long-term implications. The "shallowing hypothesis," for instance, suggests that certain types of social media activity (e.g., sharing and conversing) lead to a decline in ordinary daily reflective thinking and instead promote quick and superficial thoughts that can result in cognitive and moral triviality.
With social media, the degree of brightness or darkness is often a subjective matter. When Rachel Burns in the UK posted a photo on Facebook of her singalong activity with residents at the care home at which she worked, she joined the many others who have been fired for a sharing faux pas (Karl, Peluchette, & Schlaegel, 2010; Schmidt & O'Connor, 2015). While she, the people in the photo, and likely much of the UK public thought the posting was well intentioned and harmless, her employers saw it as a breach of privacy rules. In the end, the Facebook posting cost Burns her job, after 21 years of service. In contrast, few would argue with the intense darkness of the case of Britain's Richard Huckle, who created an online leader board and awarded himself and others "pedopoints" for sharing original and new recordings of sexual crimes against minors on social media (Wolak, Liberatore, & Levine, 2014). The degree to which perpetrators are aware of the nature of their actions varies, too. Cyberbullying may be a way to intentionally harm individuals, while oversharing photos of positive experiences unintentionally causes anxiety among those who live less glamorous lives. Moreover, some actions require technological savvy (e.g., gamification of criminal behavior and hacking), whereas others rely on the use of blunt tools (e.g., posting videos online).
Just as the attraction, use, and impacts of the bright side of social media can be studied and understood using a multidimensional honeycomb framework based on seven social media building blocks (Kietzmann et al., 2011), the impacts of these dimensions on society can also be dark, separated by various shades of gray. Thus, to understand how social media can also lead to undesirable outcomes for individuals and communities, we now employ this framework in the next section of this paper.
3. The dark side of the seven building blocks of social media
To understand how individuals, communities, and organizations
can use different social media platforms to connect, monitor, and
engage with each other, Kietzmann et al. (2011) developed a hon-
eycomb framework. This framework unpacks social media func-
tionalities into seven building blocks (see Fig. 1) to describe
different features of the social media user experience and the
extent to which different social media are driven by each func-
tionality. These functionalities refer to the extent to which users
can (i) converse with each other, (ii) share content, (iii) let others know about their presence, (iv) form relationships, (v) know the reputation of others, (vi) form groups, and (vii) ultimately reveal their identity (Kietzmann et al., 2011).

Fig. 1. Social media functionality (Kietzmann et al., 2011).

A large number of studies show how different social media platforms use these functions of the honeycomb framework to different degrees to achieve specific business functions or the aims of an overall business model for the organization (e.g., Moorhead et al., 2013; Smith, Fischer, & Yongjian, 2012; Zhou & Wang, 2014).
Given the honeycomb framework's usefulness in understanding
the functional building blocks of social media, we suggest that each
functional block of the framework can also be used to help to un-
derstand and examine different aspects of the dark side of social
media. Focusing on the negative consequences that consumers and
communities face from social media, we now explain the dark side
implications of each functional block (see Fig. 2). The explanations
are not exhaustive but are based on prototypical behaviors.
3.1. Conversations
This building block refers to the extent to which users
communicate with others using a social media platform. Twitter,
Facebook, Instagram, LinkedIn, and many other platforms provide
the ability to converse with functions such as "Like," "Reply," "Comment," and "Direct Message." For more detailed conversations, blogs provide a forum for users to post and discuss diary-style
commentaries, which readers can then comment on or even add to.
Further, Reddit, which at the time of writing this article was ranked
the 6th most visited website in the world (Alexa Internet, 2018), is
essentially a bulletin board system for over 200 million unique
users who each month discuss and vote on content covering a vast
array of topics, such as news, entertainment, sport, health, music,
and literature.
The dark side of the conversation functionality is that excessive,
aggressive, and inaccurate engagement can occur. For example,
despite having clear rules and significant moderating resources
intended to prohibit harassment, Reddit users can be exposed to
threats and bullying. In 2010, a Reddit user who had genuinely
donated a kidney made a posting seeking donations to a related
charity. Some Reddit users thought this was a scam and contacted
the user at his home with death threats. Further, in 2013, Reddit
admitted that its platform had helped to fuel online witch hunts
when groups of users had wrongly named several people as sus-
pects in the Boston bombing (BBC News, 2013; Suran & Kilgo, 2017). Furthermore, with the advent of artificial intelligence-powered chatbots and social bots that mimic human behavior and conversations, such bots are often used surreptitiously to pollute conversations in online spaces with spam and misleading advertisements (Ferrara, Varol, Davis, Menczer, & Flammini, 2016). The contributing editor of Scientific American Mind and former editor in chief of Psychology Today, for instance, was fooled into thinking a chatbot on a dating service was interested in him romantically (Epstein, 2007). But even the tools can be
victims that in turn can contribute to dark conversations. When
Microsoft released its Tay chatbot in 2016 to engage with millen-
nials through Twitter, for example, some users tricked the chatbot
into learning how to make racist statements (Wolf, Miller, & Grodzinsky, 2017).
3.2. Sharing
This aspect of a social media platform concerns the extent to
which consumers exchange, distribute, and receive content.
Consider for example how people use Flickr to share photos, You-
Tube to share videos, and Instagram to share photos and videos.
Along with the content produced by the conversation functionality
of social media, the content shared by social media users often
constitutes user-generated content (UGC), i.e., the audio, video,
images, and text created by users of an online system (Berthon et al., 2015; van Dijck, 2009). Successful social media platforms
understand how to build on the user motivations for sharing
different content along with the relationships and groups that can
be produced from these motivations (Kietzmann et al., 2011).
Once users can easily share content, a fundamental risk is that
the shared content can be inappropriate and undesirable or that it
can be shared without permission from the holder of any intel-
lectual property rights (IPR) associated with the content. In terms of
inappropriate content, a survey of 10,000 European children aged 9–16 reported that 40% of children expressed shock and disgust when viewing violent or pornographic content that had been shared by others online (Livingstone, Kirwil, Ponte, & Staksrud, 2014). Similarly, investigations by news agencies revealed that more than 100 websites that posted fake news about the 2016 U.S. presidential election were run by teenagers in the small town of Veles, Macedonia (Allcott & Gentzkow, 2017). For IPR infringement risks, consider for example the case of Stephanie Lenz who, in February 2007, posted a 29-second video on YouTube of her baby pushing a toy around the kitchen, while dancing to the song "Let's Go Crazy" by Prince that was being played in the kitchen. In June 2007, the Universal Music Group (Universal), which manages the copyright of Prince's music, decided that the song rather than the dancing baby was the focal content in the video. Universal issued a takedown notice to YouTube, and the video was immediately removed from the website. Lenz disputed the removal of the video and filed a "counter-notice" with YouTube, arguing that no rights of Universal were violated by her video. Universal insisted that sharing the video was wilful copyright infringement and that Lenz was liable for a fine of up to US$150,000 (Lessig, 2008). In sum, it is
clear that the powerful and addictive sharing functionality of social
media presents risks to those who share content and those who
consume the content that is shared.
3.3. Presence
This functional block concerns the extent to which organizations and individuals know whether, where, and when others are accessible. This includes knowing the presence of others in an online world and/or in the real world. For example, Facebook, Trapster, Google Maps, and other social media use internet protocol (IP) address information, location information from mobile devices, and "check-in" declarations to monitor where individuals are and possibly allow others to know this information too. This status about one's presence allows others to communicate synchronously (Elaluf-Calderwood, Kietzmann, & Saccol, 2005) and to have higher levels of intimacy and immediacy (Kaplan & Haenlein, 2010), which results in more influential interactions.

Fig. 2. The dark side of social media functionality.
The dark side of this functionality is that the location and
availability of users are known and can be tracked without their
awareness or consent. For example, in 2011, Facebook launched the
Messenger application for mobile devices. From 2011 to 2015, the
default settings for this application were to collect and display
geolocation information with message content in the conversa-
tions. In 2012, a number of organizations reported privacy concerns
about the implication of this geolocation sharing feature of the
application (Cipriani, 2012), but to no avail. The policy was not
changed. Then, in 2015, an extension for the Chrome browser named Marauder's Map (after the magical map in the Harry Potter books that reveals the location of each person) was developed and made available by a Harvard University computer science and mathematics student (Nosouhi, Qu, Yu, Xiang, & Manuel, 2017). This extension used Messenger data to map, and thus stalk, the identity, locations, and movements of all individuals
in a conversation. In the dark side context, the presence building
block is very directly tied to issues pertaining to protecting or
invading the privacy and possibly the safety of people who engage
on social media.
3.4. Relationships
This building block is concerned with the extent to which users
can relate to other users through social media platforms. In other
words, users have some form of an association that determines why
they engage with each other and the what-and-how of the content
they exchange. For example, LinkedIn's 500 million users are
largely structured around professional relationships based on who
users work for, work with, and where they work. In contrast,
Facebook is based on friendships of all kinds, including those based
on residential address, educational institutions attended, and as-
sociations with friends of friends.
When social media help to establish and reveal relationships,
they enable different types of social engagement and related
deleterious consequences. This includes cyberbullying, stalking,
and online harassment (Kwan & Skoric, 2013), where estimates are that 10–40% of youth are victims of cyberbullying (Kowalski, Giumetti, Schroeder, & Lattanner, 2014), and 40% of those who cyberbully report they do so for fun (Raskauskas & Stoltz, 2007), possibly connected with occurrences of jealousy. This is because heavy users of social media are more likely to believe that others have better and happier lives (Chou & Edge, 2012), and those who live vicariously through others online have significant maintenance demands and access social media platforms excessively for fear of missing out (Fox & Moreland, 2015).
3.5. Reputation
Reputation is the degree to which users can identify and influence the standing of others, including themselves, in a social media
setting. It is a construct that can be viewed in terms of how in-
dividuals display competence (functional reputation), demonstrate
social norms (social reputation), or appear to be attractive,
fascinating, or inspiring (expressive reputation) (Eisenegger, 2009).
The reputation of a social media user can be indicated by, for example, the number of followers someone has on Twitter, the sentiments of the comments expressed about a social media user, the view and like counts for a user's YouTube video, or the status some platforms award to users (e.g., an "influencer" designation).
A major reputation risk stems from sharing inappropriate con-
tent, which can destroy the sharer's reputation and/or the reputa-
tions of others. Regardless of whether a posting is based on true or
false content, the more outrageous the posting, the quicker it tends
to spread and harm reputations. Every year, numerous business and
political leaders, and others, are forced to resign after posting
offensive, disingenuous, or ridiculous content on social media. To
protect their own reputation, employers often review applicants' profiles on social media during hiring to see whether they have posted inappropriate content (Roulin, 2014); and 44% of adults in 2010 searched for information about someone who they sought to engage in a professional capacity (Madden & Smith, 2010).
Certainly, there are numerous reasons not only for celebrities but
also for everyday citizens to pay attention to their online standing
and to be on high alert. Blogs such as Gawker and Wonkette have helped to destroy the reputations of many public figures, whereas Hollaback!, Don't Date Him Girl, and "revenge porn" sites provide platforms for users to shame, hurt, or reprimand others (Woodruff, 2014). Despite the long-term consequences of their actions, users in an online setting revealed that they did not properly think about their reason for posting, misjudged who the audience could be, were highly emotional when posting, or were under the influence of drugs or alcohol (Wang et al., 2011).
3.6. Groups
This building block refers to the extent to which social media users can create or join circles of friends or communities centered around a shared practice or interest. On Facebook, people participate in these open or closed groups (or those hidden to non-members) for socializing, entertainment, self-status seeking, and information purposes (Park, Kee, & Valenzuela, 2009). Associated benefits for users are the possibility to organize their growing social network and to set permissions for what content can be accessed by specific (sub-)groups, e.g., what photos will be displayed on a contact's timeline on Facebook. Beyond this managing function, being part of a specific group on social media may also act as a signal that makes other users aware of the interests or values with which one identifies.
The negative side of the groups building block, for instance, often shows its face through what social psychologists refer to as "ingroup-outgroup bias" (Tajfel & Turner, 2004). People define themselves in terms of social groupings (ingroup identity). They find themselves in an echo chamber in which their own beliefs are amplified and reinforced, and those who do not fit into those groups (outgroups) are belittled. The "ingroup love and outgroup hate" (Brewer, 1999) can be seen when people not only exclude others from conversations or group membership but also lose empathy for them (i.e., the intergroup empathy gap). The many polarized discussions of racial and gender (in)equality serve as appropriate examples, as do the recent populist elections, and more generally, the separation of leaders and influencers vs. the general public online (Hogg & Reid, 2006).
3.7. Identity
Presenting one's identity is a central aspect of social media. Next
to providing objective personal information, such as gender or age,
C.V. Baccarella et al. / European Management Journal 36 (2018) 431e438434
users can also provide more subjective details that express their
identity by, for example, joining groups with a specic topic or by
liking, disliking, or commenting on content. Setting up a profile and therefore becoming identifiable is a prerequisite for most social media applications. Although each user can generally decide what kind of personal information to share, social media sites have a strong interest in users sharing as much information as possible.
Displaying one's identity online, however, has many downsides. The other building blocks (e.g., the conversations and relationships online, the sharing of images, the location-specific data, the ingroup behavior, and the online reputation) have a direct impact on who we are, on our personality and character. The visibility, transparency, permanence, and granularity with which social media content online connects to people's lives offline emphasize that social media users are no longer in control of their own identity, thus leading to all sorts of privacy and safety risks. A recent
UK study found that the lack of privacy and protection on social
media is one major concern of children and young people (The
Children's Society, 2018). Some have even gone so far as to say
that stalking others on social media has already become a
normality. For example, returning to the Facebook or Instagram profile of former partners and spouses to see what they are up to is nowadays considered relatively harmless (Lyndon, Bonds-Raacke, & Cratty, 2011). Using our identity against us is also an issue in the business context. Despite potential privacy issues, searching online for private profiles appears reasonable in order to gain a comprehensive picture of a job applicant (Fuchs, 2014). Such snooping behavior is strongly supported by many social media platforms. For instance, in Europe, a report commissioned by the Belgian Privacy Commission stated that the largest social media platform, Facebook, does not provide appropriate control mechanisms concerning user data. It highlights that Facebook's default privacy settings are so difficult to find and change that the automatic and common outcome is behavioral profiling by the platform (van Alsenoy et al., 2015).
Utilized individually and together, these seven building blocks
can help researchers and managers make sense of the multidi-
mensionality of various dark side phenomena we observe today. In
Fig. 3, we illustrate how two common harmful social media activ-
ities (trolling and fake news) occur through different degrees of
dark social media functionality. The darker the color of the block, the greater the role of that functionality in the deleterious social media activity depicted.
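The kind of comparison drawn in Fig. 3 can be read as an intensity profile over the seven building blocks. As a minimal, hypothetical sketch of that idea, the fragment below encodes the trolling and fake news examples as block-to-intensity maps; the 0-3 scores and the function names are illustrative assumptions, not values or tooling from the article.

```python
# Hypothetical sketch: profiling "dark" activities across the seven
# honeycomb building blocks (Kietzmann et al., 2011). The 0-3 intensity
# scores are illustrative assumptions, not values taken from the article.

BLOCKS = ("identity", "conversations", "sharing", "presence",
          "relationships", "reputation", "groups")

def darkness_profile(scores):
    """Return a complete block->intensity map, defaulting absent blocks to 0."""
    unknown = set(scores) - set(BLOCKS)
    if unknown:
        raise ValueError(f"unknown building blocks: {unknown}")
    return {block: scores.get(block, 0) for block in BLOCKS}

trolling = darkness_profile({
    "conversations": 3,   # inflammatory messages posted as bait
    "sharing": 3,         # inappropriate content shared as bait
    "groups": 2,          # provoking members of online communities
    "relationships": 1,   # disrupting normal, on-topic exchanges
    "reputation": 1,
})

fake_news = darkness_profile({
    "conversations": 3,   # seeding disinformation into conversations
    "sharing": 3,         # spreading fabricated content
    "reputation": 2,      # shaming and defaming targets
    "groups": 2,          # widening separation between groups
})

def dominant_blocks(profile, threshold=2):
    """Blocks whose dark-side role meets the threshold, most intense first."""
    return sorted((b for b in profile if profile[b] >= threshold),
                  key=lambda b: -profile[b])

print(dominant_blocks(trolling))
print(dominant_blocks(fake_news))
```

Such a representation would let the two profiles be compared block by block, mirroring how the shading in Fig. 3 contrasts the two activities.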
Trolling, in fishing, is a method where one moves the fishing lines slowly back and forth, dragging the bait through the water and hoping for a bite. Trolling on social media is much the same: so-called trolls bait others by posting inflammatory lines (messages in the conversations block) or sharing inappropriate content (in the sharing block) and then wait for a bite on the line. The intent is to
provoke members of an online community (groups block) and to
disrupt normal, on-topic discussions, relationships, or reputations.
The motivation for trolling is not to stimulate thought-provoking
discussions but to sow discord on the Internet and get a rise out
of people simply for the amusement of the troll. Trolling usually
happens in online forums, YouTube comments, or on Reddit, but it
can also happen in organizational contexts. Employees who send
malicious messages and leave touchy comments for their co-
workers bring distraction to team work, decrease motivation, and
impact effectiveness. Sometimes, trolls nd entertainment in
wasting organizational resources by taking up the time of customer
support representatives, for instance in pointless Twitter exchanges
or in lengthy, off-topic LiveChat conversations.
Fake news are "pseudo news" that are presented as being factually accurate, without truly being based on actual facts. Such news are either fabricated by disseminating information on social media that is deliberately false (i.e., disinformation) or the result of accidental, honest mistakes (i.e., misinformation) (see Hannah, McCarthy, & Kietzmann, 2015). In both cases, the degree of truthfulness in the news on social media is difficult to determine,
because the authenticity of the news source is almost impossible to
verify (as opposed to the certainty of traditional media outlets, like
the Washington Post). However, the popular appeal and often
sensational nature of fake news attracts millions of readers. According to BuzzFeed's Craig Silverman, during the 2016 U.S. presidential election, the top 20 fake news stories received more engagement on Facebook than the top 20 news stories on the election from 19 major media outlets (Chang, Lefferman, & Pedersen, 2016). Whether they truly influenced the outcome of
the election is a different matter, but the intention of the fake news
was clearly to create a greater separation between groups (e.g.,
voters) by seeding disinformation into online conversations, sharing inappropriate content, and shaming and defaming the opposing party.

Fig. 3. Contrasting the dark functionalities of trolling and fake news.

But fake news are not just about celebrities and
politicians. Brands can be negatively affected, too. Pepsi's CEO, for instance, was said to have told Trump fans to "take their business elsewhere," a false statement that led to an online firestorm ("a sudden discharge of large quantities of messages containing negative word-of-mouth and complaint behavior against a person, company, or group in social media networks," Pfeffer et al., 2014, p. 118) and a threatened boycott of Pepsi's products. Snopes, an online fact-checking website, lists many similar fake news stories. The sportswear company New Balance, for instance, was falsely praised as "the official brand of the Trump Revolution," thus leading customers who opposed the candidate's agenda to burn their New Balance shoes (Gupta, 2016). But fake news do not only impact big brands. Fake news about any publicly traded company, small or big, can truly influence its share value, allowing those who seed the fake news to potentially realize a healthy financial gain at the expense of the firm. Or fake news can harm the reputation of a firm and place
its closest competitor in a favorable position, dissuade potential
employees, and harm relationships with other stakeholders.
These two examples show the descriptive, explanatory, and comparative value of using the honeycomb framework to understand the dark side and the social media landscape more generally. The framework thus provides a basis to guide the development of appropriate strategies to use and consume social media, which we now discuss in our call-to-action for researchers.
4. Don't be afraid of the dark: a call-to-action for social media researchers
To shed light on and lighten the dark side of social media, a report by the European Commission (2018) recommended refraining from simplistic approaches and instead responding with a "multidimensional approach" to the many social media phenomena that have reared their ugly heads. The report illustrates the inherent complexity of these issues and emphasizes the urgency of finding appropriate answers to counteract further unfavorable developments. In this reflections article, our goal was to promote a similar multidimensional approach in order to understand how social media can lead to undesirable outcomes for individuals, communities, or organizations. With this goal, our "dark side honeycomb framework" suggests that combinations of identity, presence, relationships, conversations, groups, reputations, and sharing are key constructs of the dark side phenomena we observe.
The development of the dark side honeycomb and its seven functional building blocks is one initial option toward this goal. We hope that it serves as a useful foundation for further research and motivates more researchers to "take a walk on the dark side" so as to understand the risks and how social media can help individuals, communities, and organizations engage effectively and safely online. With this goal, we pose the following three calls-to-action, each of which offers multiple avenues for future management research.
4.1. Build dark side-orientated social media theories, models, and classification frameworks!
The intention of this article was to illustrate the seven building blocks of the dark side of social media, in great part to motivate further research that tries to untangle the underlying mechanisms in new ways. Existing theories cannot necessarily be transferred to the social media sphere (Naylor, Lamberton, & West, 2012). New theories, or combinations of existing theories, might better suit the inherent characteristics of social media, akin, for example, to Scheiner, Krämer, and Baccarella (2016), who base their theoretical framework for explaining unethical behavior on social media by entrepreneurs on the concept of moral disengagement and regulatory focus theory. We believe that our dark side honeycomb framework can help to motivate and guide the combination of lenses from different disciplines in order to develop novel theories, models, and classification frameworks that shed light on the dark side of social media.
4.2. Use adequate methodologies for online and dark contexts!
There is a significant opportunity for future research using contemporary methodologies that suit the characteristics of social media. For instance, a recent and effective development for understanding online behavior might be netnography (a portmanteau of "Internet" and "ethnography"), which allows researchers to study social interaction in modern digital communication contexts. However, a lot has happened since its introduction by Kozinets in 1998: smartphones with high-definition cameras, ubiquitous data networks, and social media networks that did not exist at all back then. The activities in the sharing building block of the dark side honeycomb, for instance, certainly were not the same before the widespread adoption of these tools, and neither, likely, were any of the other building blocks. These technological developments and their pervasiveness in our society certainly warrant the advancement of digital data collection and analysis methodologies. Especially in light of recent advancements (e.g., artificial intelligence-powered social media content analysis tools included in IBM Watson), we hope that fellow researchers will develop and test new ways in which we can study the dark side of social media.
4.3. Collect data and build cases!
Many of the dark side examples reported in this article were drawn from accounts in the popular press. The reasons for this are clear: there is a dearth of data and case studies based on rigorous data collection and appropriate analysis methodologies. Research on the dark side of social media is sparsely populated with studies that are narrowly focused or that include actual empirical data. Most are conceptual and at a rather high level, and we hope that the dark side honeycomb allows researchers to focus their work more narrowly. We call for empirical work specifically on dark side antecedents and motivations (e.g., why individuals, communities, and organizations deliberately engage in dark side behaviors and practices), on accidental or unintentional dark side outcomes, behaviors, and practices (e.g., social media faux pas), and on dark side management strategies (e.g., how individuals, communities, and organizations aim to minimize, prevent, or respond to the dark side of social media). We would further like to see more granularity in all these research streams to differentiate, when appropriate, by region, industry, and sector (e.g., Europe vs. North America, not-for-profit vs. for-profit, and private vs. public) and by social media phenomena (e.g., hashtag hijacks, culture jamming, reputation blackmail, online firestorms, online bullying and workplace tensions, tweet-up disasters, and activities on the darknet/dark web/deep web).

Our three calls-to-action (i.e., for more work on theory and conceptual frameworks, on advancing methodology, and on focused empirical studies) are suggestions, which we hope provide inspiration to attract researchers to the dark side.
5. Final thoughts
"Social Media is everywhere" is how the Kietzmann et al. (2011) article first introduced the honeycomb framework. Seven years later, social media have truly become ubiquitous. But unlike in 2011, the news surrounding social media is no longer just cheerful and bright. Instead, the popular press regularly informs us about new instances of how social media have fueled intellectual property leaks, fake news, privacy invasions, election meddling, and more. The tone has changed, and we are becoming keenly aware of how deeply the dark side of social media impacts our private and business lives.

As private individuals, although we know better, we continue to use social media with little regard for its darker side. We continue to upload pictures and videos of our children without their consent and start a digital presence they will never be able to reclaim. We install new apps on our phones without reading the end-user license agreement, and we give away our (and our friends') Facebook data in exchange for free Wi-Fi. We take and post pictures of our food for distant friends instead of enjoying the moment with our loved ones at the table. It has become normal for people to "slut shame" and post "revenge porn" of their former partners, two terms that did not even exist a decade ago but are common among adolescents today. Children are losing their sense of empathy, because when they insult someone online, they no longer see the impact of their actions and the sadness on their victims' faces. And like them, hiding behind online anonymity, adults are quick to judge others publicly, without really knowing all the necessary details. On social media, people are guilty until proven innocent, and by then, it is usually too late for their reputation to recover. While social media offer us previously unforeseeable levels of connectedness and, with it, tremendous advantages (e.g., the collective power behind the "#metoo" movement against sexual harassment), we simultaneously see levels of online harassment, bullying, vigilantism, and the like increase sharply.
For organizations, the cost of "social media gone bad" is difficult to quantify, but the consequences can nevertheless be dire. When, in 2009, musician Dave Carroll unsuccessfully attempted to recover the cost of the Taylor guitar that United Airlines' baggage handlers destroyed, he composed a song about United that went viral and caused considerable reputational damage for the airline. Millions watched the video shared on YouTube, to which some attributed the fall in United's stock value shortly after. Social media are unforgiving: brands are held to increasingly high standards and are immediately punished for their missteps. What is most surprising, possibly, is that brands frequently do not appear to learn from social media fails. Every single year sees many companies, big and small, causing uproar online for their actions or reactions to social media posts. In 2017, eight years after the guitar-handling debacle, United Airlines made headlines again when video footage was posted on social media showing passenger Dr. David Dao being forcibly dragged, screaming and bleeding, from United Express Flight 3411, so that a United Airlines staff member could take his legitimately booked seat instead. Unsurprisingly, outrage ensued online.
Aside from these public relations social media disasters, within the boundaries of the firm there are plenty of dark side examples, too. Employers monitor their employees' activities on social media, disgruntled employees damage their employers' public reputation when they rant about their experiences, co-workers bully each other, and misinformation and aggressive social engagement cause workplace tension. "Enterprise social media" platforms (e.g., Microsoft's Yammer, Salesforce's Chatter, or the collaboration tool Slack) are developed and deployed specifically to connect colleagues. However, despite the many knowledge management advantages these tools offer, they are not immune to these interpersonal ills either.
While the media themselves keep improving, our social behavior online seems to stagnate, at best. It sometimes even seems that social media are destroying many of the human traits that made us a social species in the first place. Firms wanting to avoid some of the social media risks will find a useful tool in the honeycomb framework. By analyzing the seven building blocks (conversations, sharing, presence, relationships, reputation, groups, and identity), they can monitor and understand the dark side that social media can present for their brands, customers, and workforce. However, much more work is necessary to improve the way we use social media. We hope that this article motivates our readers (as researchers, managers, employees, and social media consumers) to spend more time on understanding the dark side and weighing the long-term costs of using social media. A lot is at stake, and it's serious!
References
Alexa Internet. (2018). The top 500 sites on the web. Retrieved from https://www. Accessed 22 May 2018.
Allcott, H., & Gentzkow, M. (2017). Social media and fake news in the 2016 election. The Journal of Economic Perspectives, 31(2), 211–236.
van Alsenoy, B., Verdoodt, V., Heyman, R., Wauters, E., Ausloos, J., & Acar, G. (2015). From social media service to advertising network: A critical analysis of Facebook's revised policies and terms.
Beaver, G., & Jennings, P. (2005). Competitive advantage and entrepreneurial power: The dark side of entrepreneurship. Journal of Small Business and Enterprise Development, 12(1), 9–23.
Bennett, S., Maton, K., & Kervin, L. (2008). The "digital natives" debate: A critical review of the evidence. British Journal of Educational Technology, 39(5),
Berthon, P., Pitt, L., Kietzmann, J., & McCarthy, I. P. (2015). CGIP: Managing consumer-generated intellectual property. California Management Review, 57(4),
Blackwell, D., Leaman, C., Tramposch, R., Osborne, C., & Liss, M. (2017). Extraversion, neuroticism, attachment style and fear of missing out as predictors of social media use and addiction. Personality and Individual Differences, 116, 69–72.
Brewer, M. B. (1999). The psychology of prejudice: Ingroup love and outgroup hate? Journal of Social Issues, 55(3), 429–444.
Buckels, E. E., Trapnell, P. D., & Paulhus, D. L. (2014). Trolls just want to have fun. Personality and Individual Differences, 67, 97–102.
Chang, J., Lefferman, J., & Pedersen, C. (2016). When fake news stories make real news headlines. ABC News. Retrieved from news-stories-make-real-news-headlines/story?id=43845383 Accessed 15 June
Chou, H. T. G., & Edge, N. (2012). "They are happier and having better lives than I am": The impact of using Facebook on perceptions of others' lives. Cyberpsychology, Behavior, and Social Networking, 15(2), 117–121.
Cipriani, J. (2012). How to prevent Facebook Messenger from sharing your location. Retrieved from messenger-from-sharing-your-location Accessed 29 May 2018.
van Dijck, J. (2009). Users like you? Theorizing agency in user-generated content. Media, Culture & Society, 31(1), 41–58.
Eisenegger, M. (2009). Trust and reputation in the age of globalisation. In J. Klewes, & R. Wreschniok (Eds.), Reputation capital: Building and maintaining trust in the 21st century (pp. 11–22). Berlin Heidelberg: Springer.
Elaluf-Calderwood, S., Kietzmann, J., & Saccol, A. Z. (2005). Methodological approach for mobile studies: Empirical research considerations. In 4th European conference on research methodology for business and management studies (pp. 133–140).
Epstein, R. (2007). From Russia, with love. Scientific American Mind, 18(5), 16–17.
European Commission. (2018). Final report of the high level expert group on fake news and online disinformation. Retrieved from disinformation Accessed 05 March 2018.
Eurostat. (2017a). Digital economy and society in the EU. Retrieved from http://ec. pdf Accessed 06 March 2018.
Eurostat. (2017b). Social media – statistics on the use by enterprises. Retrieved from statistics_on_the_use_by_enterprises#Further_Eurostat_information Accessed 22 May 2018.
Ferrara, E., Varol, O., Davis, C., Menczer, F., & Flammini, A. (2016). The rise of social bots. Communications of the ACM, 59(7), 96–104.
Fox, J., & Moreland, J. J. (2015). The dark side of social networking sites: An exploration of the relational and psychological stressors associated with Facebook use and affordances. Computers in Human Behavior, 45, 168–176.
Fuchs, C. (2014). Social media and the public sphere. tripleC: Communication, Capitalism & Critique, 12(1), 57–101.
Gupta, S. (2016). Trump supporters call to boycott Pepsi over comments the CEO never made. CNN Money. Retrieved from companies/pepsi-fake-news-boycott-trump Accessed 17 June 2018.
Hannah, D. R., McCarthy, I. P., & Kietzmann, J. (2015). "We're leaking, and everything's fine": How and why companies deliberately leak secrets. Business Horizons, 58(6), 659–667.
Hardaker, C. (2010). Trolling in asynchronous computer-mediated communication: From user discussions to academic definitions. Journal of Politeness Research, 6(2), 215–242.
Hogg, M. A., & Reid, S. A. (2006). Social identity, self-categorization, and the communication of group norms. Communication Theory, 16(1), 7–30.
Kaplan, A. M., & Haenlein, M. (2010). Users of the world, unite! The challenges and opportunities of social media. Business Horizons, 53(1), 59–68.
Karl, K., Peluchette, J., & Schlaegel, C. (2010). Who's posting Facebook faux pas? A cross-cultural examination of personality differences. International Journal of Selection and Assessment, 18(2), 174–186.
Kietzmann, J. H. (2017). Crowdsourcing: A revised definition and introduction to new research. Business Horizons, 60(2), 151–153.
Kietzmann, J. H., Hermkens, K., McCarthy, I. P., & Silvestre, B. S. (2011). Social media? Get serious! Understanding the functional building blocks of social media. Business Horizons, 54(3), 241–251.
Kowalski, R. M., Giumetti, G. W., Schroeder, A. N., & Lattanner, M. R. (2014). Bullying in the digital age: A critical review and meta-analysis of cyberbullying research among youth. Psychological Bulletin, 140(4), 1073–1137.
Kumar, A., Bezawada, R., Rishika, R., Janakiraman, R., & Kannan, P. K. (2016). From social to sale: The effects of firm generated content in social media on customer behavior. Journal of Marketing, 80(1), 7–25.
Kwan, G. C. E., & Skoric, M. M. (2013). Facebook bullying: An extension of battles in school. Computers in Human Behavior, 29(1), 16–25.
Lessig, L. (2008). In defense of piracy. The Wall Street Journal. Retrieved from https:// Accessed 29 May 2018.
Levenson, J. C., Shensa, A., Sidani, J. E., Colditz, J. B., & Primack, B. A. (2016). The association between social media use and sleep disturbance among young adults. Preventive Medicine, 85, 36–41.
Livingstone, S., Kirwil, L., Ponte, C., & Staksrud, E. (2014). In their own words: What bothers children online? European Journal of Communication, 29(3), 271–288.
Lyndon, A., Bonds-Raacke, J., & Cratty, A. D. (2011). College students' Facebook stalking of ex-partners. Cyberpsychology, Behavior, and Social Networking, 14(12),
Madden, M., & Smith, A. (2010). Reputation management and social media – How people monitor their identity and search for others online. Washington, D.C. Retrieved from Pew Internet & American Life Project website: http://www.
Malhotra, A., & van Alstyne, M. (2014). The dark side of the sharing economy and how to lighten it. Communications of the ACM, 57(11), 24–27.
Mander, J. (2017). Daily time spent on social networks rises to over 2 hours. Retrieved from social-networks Accessed 10 June 2018.
Moorhead, S. A., Hazlett, D. E., Harrison, L., Carroll, J. K., Irwin, A., & Hoving, C. (2013). A new dimension of health care: Systematic review of the uses, benefits, and limitations of social media for health communication. Journal of Medical Internet Research, 15(4).
Naylor, R., Lamberton, C., & West, P. (2012). Beyond the "like" button: The impact of mere virtual presence on brand evaluations and purchase intentions in social media settings. Journal of Marketing, 76(6), 105–120.
BBC News. (2013). Reddit apologises for online Boston witch hunt. Retrieved from Accessed 28 May 2018.
Nosouhi, M. R., Qu, Y., Yu, S., Xiang, Y., & Manuel, D. (2017). Distance-based location privacy protection in social networks. In Telecommunication networks and applications conference (ITNAC) (pp. 1–6).
O'Keeffe, G., & Clarke-Pearson, K. (2011). Clinical report – The impact of social media on children, adolescents, and families. Pediatrics, 127, 800–804.
Pai, P., & Arnott, D. C. (2013). User adoption of social networking sites: Eliciting uses and gratifications through a means-end approach. Computers in Human Behavior, 29(3), 1039–1053.
Paniagua, J., Korzynski, P., & Mas-Tur, A. (2017). Crossing borders with social media: Online social networks and FDI. European Management Journal, 35(3), 314–326.
Park, N., Kee, K. F., & Valenzuela, S. (2009). Being immersed in social networking environment: Facebook groups, uses and gratifications, and social outcomes. CyberPsychology and Behavior, 12(6), 729–733.
Pfeffer, J., Zorbach, T., & Carley, K. M. (2014). Understanding online firestorms: Negative word-of-mouth dynamics in social media networks. Journal of Marketing Communications, 20(1–2), 117–128.
Przybylski, A. K., Murayama, K., Dehaan, C. R., & Gladwell, V. (2013). Motivational, emotional, and behavioral correlates of fear of missing out. Computers in Human Behavior, 29, 1841–1848.
Raskauskas, J., & Stoltz, A. D. (2007). Involvement in traditional and electronic bullying among adolescents. Developmental Psychology, 43(3), 564–575.
Roulin, N. (2014). The influence of employers' use of social networking websites in selection, online self-promotion, and personality on the likelihood of faux pas postings. International Journal of Selection and Assessment, 22(1), 80–87.
Sabate, F., Berbegal-Mirabent, J., Cañabate, A., & Lebherz, P. R. (2014). Factors influencing popularity of branded content in Facebook fan pages. European Management Journal, 32(6), 1001–1011.
Scharfstein, D. S., & Stein, J. C. (2000). The dark side of internal capital markets: Divisional rent-seeking and inefficient investment. The Journal of Finance, 55(6),
Scheiner, C. W., Krämer, K., & Baccarella, C. V. (2016). Cruel intentions? The role of moral awareness, moral disengagement, and regulatory focus in the unethical use of social media by entrepreneurs. Lecture Notes in Computer Science, 9742,
Schlosser, E. (2002). Fast food nation: The dark side of the all-American meal. Boston, MA: Houghton Mifflin Harcourt.
Schmidt, G. B., & O'Connor, K. W. (2015). Fired for Facebook: Using NLRB guidance to craft appropriate social media policies. Business Horizons, 58(5), 571–579.
Smith, A. N., Fischer, E., & Yongjian, C. (2012). How does brand-related user-generated content differ across YouTube, Facebook, and Twitter? Journal of Interactive Marketing, 26(2), 102–113.
Suran, M., & Kilgo, D. K. (2017). Freedom from the press? How anonymous gatekeepers on Reddit covered the Boston Marathon bombing. Journalism Studies, 18(8), 1035–1051.
Tajfel, H., & Turner, J. C. (2004). The social identity theory of intergroup behavior. In J. T. Jost, & J. Sidanius (Eds.), Political psychology: Key readings (Key readings in social psychology) (pp. 276–293). New York, NY: Psychology Press.
The Children's Society. (2018). Safety Net: Cyberbullying's impact on young people's mental health.
Wagner, T. F. (2017). Promoting technological innovations: Towards an integration of traditional and social media communication channels. Lecture Notes in Computer Science, 10282, 256–273.
Wagner, T. F., Baccarella, C. V., & Voigt, K.-I. (2017). Framing social media communication: Investigating the effects of brand post appeals on user interaction. European Management Journal, 35(5), 606–616.
Wang, Y., Norcie, G., Komanduri, S., Acquisti, A., Leon, P. G., & Cranor, L. (2011). "I regretted the minute I pressed share": A qualitative study of regrets on Facebook. In Proceedings of the 7th symposium on usable privacy and security (Vol.
Wilson, M., Robson, K., & Botha, E. (2017). Crowdsourcing in a time of empowered stakeholders: Lessons from crowdsourcing campaigns. Business Horizons, 60(2),
Wolak, J., Liberatore, M., & Levine, B. N. (2014). Measuring a year of child pornography trafficking by US computers on a peer-to-peer network. Child Abuse & Neglect, 38(2), 347–356.
Wolf, M. J., Miller, K., & Grodzinsky, F. S. (2017). Why we should have seen that coming: Comments on Microsoft's Tay experiment, and wider implications. ACM SIGCAS – Computers and Society, 47(3), 54–64.
Wong, J. C. (2017). Former Facebook executive: Social media is "ripping society apart". The Guardian. Retrieved from dec/11/facebook-former-executive-ripping-society-apart Accessed 02 May
Woodruff, A. (2014). Necessary, unpleasant, and disempowering: Reputation management in the internet age. In Proceedings of the SIGCHI conference on human factors in computing systems (pp. 149–158).
Zhou, L., & Wang, T. (2014). Social media: A new vehicle for city marketing in China. Cities, 37, 27–32.
van Zoonen, W., Verhoeven, J. W., & Vliegenthart, R. (2017). Understanding the consequences of public social media use for work. European Management Journal, 35(5), 595–605.
... Corresponding Author: Oxana Komarova Selection and peer-review under The stages of development of social media outlined above necessitate regulation of the social media space and the formation of their regulatory functions when the state uses social media to regulate significant processes, taxation, etc. In addition, social media carries risks for individuals, firms and society, such as cyberbullying, addiction, trolling, online witch-hunting, fake news, and abuse of privacy (Baccarella et al., 2018), which also explains the need for regulation of social media. ...
Conference Paper
Full-text available
... Social networking sites have a dark side to it which is getting larger over time and it is ruining the independence and wellbeing of societies and the public (Baccarella et al., 2018). Scholars discussed that these platforms motivate socially vindictive attitude e.g., narcissism, selfpromotion, self-objectification, emotional coldness, and deception (Fox, Rooney, 2015). ...
... Фиксация цифрового следа может быть рассмотрена с опорой на сотовую структурную модель, предложенную в (Baccarella et al., 2018) и описывающую «темную сторону социальных медиа». Каждый компонент сотовой модели отвечает за сохранение данных, которые впоследствии могут быть использованы для точной идентификации участников онлайн-коммуникации, в том числе в коммерческих целях. ...
The article discusses the problem of digital poverty, arising when communication through digital platforms reduces the cost of the process of obtaining and exchanging information and replaces traditional economic processes. Using the example of the consumption of digital and online services, the author shows how digital communications can act as a marker for differentiating the behavior of the poor and the rich. Using cluster analysis and assessment of multicollinearity, the author interprets the data of a sociological study of five groups of respondents, indicating the factors of manifestation of digital poverty in the behavior of economic agents. The problem of the digital trace formed as a result of the automated data collection from users of online services is also considered. The author notes that consumers of digital services, in exchange for discounts, transfer their personal data to digital platforms that use the information received to stimulate further online consumption through new discounts and loyalty programs, which has a negative impact on offline consumption. The study also raises the issue of the accompanying digital poverty of economic externalities, identifies markers of property inequality in the digital economy, possible options for the development of the online economy against the background of the classical communication and social relations become luxurious. It also indicates the main scenarios for leveling the effects of digital poverty.
... Social media has proven to be a novel educational technic in learning during the COVID19 pandemic. Although in the dark side of social media which spread panic and affected the mental health of social media users and it could be described as a double-edged sword [63,64], in the other hand, it can play an important role in terms of growing the awareness and educating people about the epidemic's dangers through the flow of the information and constantly update the information. This study aimed to explore the students' attitude to use social media as a learning tool during COVID-19 pandemic based on the view of the Health Belief Model (HBM). ...
Full-text available
At the beginning of the year 2020, massive fast spread Coronavirus or (COVID-19) pandemic caused a serious impact on the education system worldwide. This chapter aims to explore the students’ attitude to use social media as a learning tool during COVID-19 pandemic based on the view of the Health Belief Model (HBM). A total of 504 students in Malaysian universities were involved in this study. The partial least squares structural equation modelling (PLS-PM) has been employed to analyse the data collected in this study. The results indicated that perceived susceptibility, perceived severity, perceived barriers, perceived (health) motivation, perceived benefits and self-efficacy were significant in predicting the students’ attitude to use social media as a learning tool during COVID-19 pandemic. The results of this study has been contributed to the existing literature by validating HBM in the Malaysian context and provide theoretical contributions and practical implications to the theory, and practice.
... Social networking sites have a dark side to it which is getting larger over time and it is ruining the independence and wellbeing of societies and the public (Baccarella et al., 2018). Scholars discussed that these platforms motivate socially vindictive attitude e.g., narcissism, selfpromotion, self-objectification, emotional coldness, and deception (Fox, Rooney, 2015). ...
... However, the Internet is not always a reliable source for information on diets and food choices. In fact, individuals are exposed to a variety of dietary and food (mis)information and lifestyle advice that may be contradictory and deviant with respect to health standards, encouraging "unhealthy behavior" [2][3][4]. For these reasons, it appears urgent to understand the factors behind believing in online food fake news to engage people in a more aware search for information and better food choices. ...
Full-text available
In the Italian context, the diffusion of online fake news about food is becoming increasingly fast-paced and widespread, making it more difficult for the public to recognize reliable information. Moreover, this phenomenon is deteriorating the relation with public institutions and industries. The purpose of this article is to provide a more advanced understanding of the individual psychological factors and the social influence that contributes to the belief in food-related online fake news and the aspects that can increase or mitigate this risk. Data were collected with a self-report questionnaire between February and March 2019. We obtained 1004 valid questionnaires filled out by a representative sample of Italian population, extracted by stratified sampling. We used structural equation modelling and the multi-group analyses to test our hypothesis. The results show that self-evaluation negatively affects the social-influence, which in turn positively affects the belief in online fake news. Moreover, this latter relationship is moderated by the readiness to change. Our results suggest that individual psychological characteristics and social influence are important in explaining the belief in online fake news in the food sector; however, a pivotal role is played by the motivation of lifestyle change. This should be considered to engage people in clear and effective communication.
... A través de estos escenarios los usuarios de una comunidad virtual, unidos por diversos intereses y preferencias, interactúan entre los parámetros de la fugacidad, la liquidez y la instantaneidad propia de la red. Con un alcance de 2,95 mil millones de usuarios en 2020 (Rehman, Baharu y Salleh, 2020) y un promedio de 2 horas de uso al día por usuario (Baccarella, Wagner, Kietzmann y McCarthy, 2018;Barcelos, Dantas y Senecal, 2018), las redes sociales se han convertido en uno de los espacios virtuales más populares y frecuentes entre los jóvenes. ...
... Social networking sites have a dark side to it which is getting larger over time and it is ruining the independence and wellbeing of societies and the public (Baccarella et al., 2018). Scholars discussed that these platforms motivate socially vindictive attitude e.g., narcissism, selfpromotion, self-objectification, emotional coldness, and deception (Fox, Rooney, 2015). ...
Abstract With the advancement of technology, the flow of information has increased many folds. With this, the spread of fake news has also increased. This has many objectives on part of its originating source. The panic criterion is one of it, which aims to create unrest and a distorted image of a situation prevailing around the readers. To consider the problem, this study analyzed the WhatsApp user’s response to fake news concerning (Covid–19) in Pakistan. This study considered a quantitative survey research method. A Purposive sampling technique has been used to select respondents from the four provinces of Pakistan i.e. Punjab, Baluchistan, Sindh and Khyber Pakhtunkhua. A total of 200 respondents were considered for the study, and only those respondents were included in the study who responded that they check the authenticity of news that is shared through the WhatsApp groups and statuses. The data has been analyzed through SPSS by extracting frequencies and percentages. The questionnaire was tested for reliability by applying a Chonbach’s Alpha, the value came out to be .621 which is 62 %. Hence, the questionnaire is reliable in terms of measuring the main concept of the study. Keywords: Covid–19, WhatsApp, Fake News, Groups, Status, Pakistan, media.
... One of the essential benefits of SM is the rapidity with which information of interest can be spread to a larger audience when compared to traditional outlets like print media. On average, users spend more than 2 hours of their time on SM daily (Baccarella et al., 2018;Barcelos, Dantas, & Senecal, 2018). This penetration of SM in research has resulted in the phenomenon of social exchange. ...
The emergence of social media has sparked a lot of interest in academic libraries, especially in the area of adoption. However, there appears to be limited knowledge on whether librarians' generations differ in their levels of social media adoption, specifically in Southwestern Nigeria. To address this, the study adopted a descriptive survey design. The population comprised seventy-nine (79) librarians from eight academic libraries. The total enumeration sampling technique was used to study all respondents. A self-structured questionnaire was the instrument used for data collection. Data gathered were analysed using descriptive (frequency, percentage & mean) and inferential statistics (ANOVA). The results indicate that there is a significant difference between the generations (Baby Boomers, Generation X, Generation Y, Generation Z) with respect to their adoption of social media, but that no significant differences were found between the generations and social media adoption for library services. The study concluded by noting that social media can be adopted by librarians across different generations. Library administrators in Southwestern Nigeria should acknowledge these differences and formulate their social media strategy accordingly when designing social media plans.
... One of the essential benefits of social media for research is the rapidity with which information of interest can be spread to a larger audience when compared to traditional outlets like print media. On average, users spend more than 2 hours of their time on social media daily (Baccarella, Wagner & Kietzmann, 2018;Barcelos, Dantas & Sénécal, 2018). Social media users create, share as well as exchange information and ideas in virtual communities and they can connect with different individuals who share a common interest, dreams, and objectives (Sharma & Shukla, 2016). ...
This study sought to investigate the relationship between social media use and the research productivity of lecturers in private universities in Ogun State, Nigeria. The study adopted a survey design. The population comprised 1,353 lecturers from seven private universities. A purposive sampling technique was used to select 621 respondents for the study. A pretested questionnaire was used for data collection, yielding the following Cronbach alpha values: social media use, 0.86; research productivity, 0.95. Data were analyzed using descriptive and inferential statistics. Findings revealed that social media use has a positive, significant, but very weak relationship with research productivity (r = 0.173**, p < 0.05). The social media tools used were social networks, collaborative tools, web video sites, and blogs & microblogs. Social bookmarking and citation tools were not used for research. However, research productivity was found to be low. The study concluded by noting that research productivity can be greatly improved by the use of social media. It therefore recommends, among other things, that lecturers improve their knowledge of social bookmarking and citation tools, as these will help them in organizing information from social media specifically and the web in general.
Social media has slowly become ubiquitous in the workplace; however, the use of these technologies has been associated with both positive and negative consequences. Using the JD-R model, this study examines these positive and negative consequences of public social media use for work. Survey data from 421 employees are used to explore the relationship between public social media use for work and engagement and exhaustion, through opposing mechanisms. The findings demonstrate that interruptions and work–life conflict are important demands, whereas accessibility and efficient communication are resources associated with social media use for work. These demands and resources are related to engagement and exhaustion.
The aim of this study is to examine how the mechanisms of consumer adoption of technological innovations have been affected by the advent of social media. For this purpose, a list of major adoption determinants is derived from previous research, including theories such as innovation diffusion theory, the technology acceptance model, and the unified theory of acceptance and use of technology. Findings from empirical research are used to show which adoption determinants can be influenced through firms' communication efforts and how this can be done. After outlining how social media jumbles the established routines and mechanisms of marketing communications, this article explains how these new circumstances in the social media landscape can assist firms to facilitate innovation adoption. The main contribution of this article is to connect the established research field of technology and innovation adoption with the new and emerging field of social media research.
Social media has become a key term in Media and Communication Studies and public discourse for characterising platforms such as Facebook, Twitter, YouTube, Wikipedia, LinkedIn, Wordpress, Blogspot, Weibo, Pinterest, Foursquare and Tumblr. This paper discusses the role of the concept of the public sphere for understanding social media critically. It argues against an idealistic interpretation of Habermas and for a cultural-materialist understanding of the public sphere concept that is grounded in political economy. It sets out that Habermas’ original notion should best be understood as a method of immanent critique that critically scrutinises limits of the media and culture grounded in power relations and political economy. The paper introduces a theoretical model of public service media that it uses as foundation for identifying three antagonisms of the contemporary social media sphere in the realms of the economy, the state and civil society. It concludes that these limits can only be overcome if the colonisation of the social media lifeworld is countered politically so that social media and the Internet become public service and commons-based media.Acknowledgement: This paper is the extended version of Christian Fuchs’ inaugural lecture for his professorship of social media at the University of Westminster that he took up on February 1st, 2013. He gave the lecture on February 19th, 2014, at the University of Westminster.The video version of the inaugural lecture is available at:
In this paper we examine the case of Tay, the Microsoft AI chatbot that was launched in March, 2016. After less than 24 hours, Microsoft shut down the experiment because the chatbot was generating tweets that were judged to be inappropriate since they included racist, sexist, and anti-Semitic language. We contend that the case of Tay illustrates a problem with the very nature of learning software (LS is a term that describes any software that changes its program in response to its interactions) that interacts directly with the public, and the developer's role and responsibility associated with it. We make the case that when LS interacts directly with people or indirectly via social media, the developer has additional ethical responsibilities beyond those of standard software. There is an additional burden of care.
Social networking sites (SNS) play an increasingly important role in the mix of brands' marketing communication. A key question for marketing departments, therefore, is how brand posts can best be framed to provoke positive user reactions and interactions. In order to better understand the determinants of communication success on SNS, we propose a theoretical framework of how users process posts on SNS. Its logic suggests that the overall theme of a post ("post appeal") is a main antecedent of communication success. Thus, we empirically examine the effects of post appeals on user interaction through an in-depth analysis of a sample of 1,948 Facebook posts. Results show that some post appeals have a positive and others a negative impact on user interaction. Interestingly, some of the appeals with positive impact are rarely used by brands, while some of the appeals with negative impact are used quite frequently, indicating that brands currently do not grasp the full potential of post appeal strategy. This article concludes by discussing theoretical and managerial implications.
Following the 2016 US presidential election, many have expressed concern about the effects of false stories ("fake news"), circulated largely through social media. We discuss the economics of fake news and present new data on its consumption prior to the election. Drawing on web browsing data, archives of fact-checking websites, and results from a new online survey, we find: 1) social media was an important but not dominant source of election news, with 14 percent of Americans calling social media their "most important" source; 2) of the known false news stories that appeared in the three months before the election, those favoring Trump were shared a total of 30 million times on Facebook, while those favoring Clinton were shared 8 million times; 3) the average American adult saw on the order of one or perhaps several fake news stories in the months around the election, with just over half of those who recalled seeing them believing them; and 4) people are much more likely to believe stories that favor their preferred candidate, especially if they have ideologically segregated social media networks.
Social media use is prevalent in today's society and has contributed to problems with social media addiction. The goal of the study was to investigate whether extraversion, neuroticism, attachment style, and fear of missing out (FOMO) were predictors of social media use and addiction. Participants in the study (N = 207) volunteered to complete a brief survey measuring levels of extraversion, neuroticism, attachment styles, and FOMO. In the final model of a hierarchical regression, younger age, neuroticism, and fear of missing out predicted social media use. Only fear of missing out predicted social media addiction. Attachment anxiety and avoidance predicted social media addiction, but this relationship was no longer significant after the addition of FOMO.
Crowdsourcing can test a company's willingness to relinquish control to key stakeholders. Using past examples of four failed crowdsourcing initiatives, we explore the negative and unintended consequences of crowdsourcing in an age when stakeholders are empowered to speak their minds, make a mockery of organizational initiatives, and direct initiatives as it suits their own agenda. The concepts of crowdthink and crowd hijacking are introduced, and advice is given on how managers can avoid or anticipate some of the potential issues that arise during crowdsourcing endeavors. With these considerations, managers can harness the power of crowds effectively to achieve organizational goals with limited negative consequences.