Deflected by the Tin Foil Hat? Word of Mouth, Conspiracy Beliefs, and the
Adoption of Innovative Public Health Apps
Abstract
Due to rapid technological advances and the increasing diffusion of smart devices, public health
applications (apps) have become an integral aspect of public health management. Yet, as
governments introduce innovative public health apps (e.g., contact tracing apps, data donation
apps, ehealth apps), they have to confront controversial debates that fuel conspiracy theories and
face the fact that app adoption rates are often disappointing. This study explores how conspiracy
theories affect the adoption of innovative public health apps as well as how policymakers can fight
harmful conspiracy beliefs. Acknowledging the importance of word of mouth (WOM) in the
context of conspiracy beliefs, the study focuses on the interplay between WOM and conspiracy
beliefs and their effects on app adoption. Based on theories of social influence and conspiracy
beliefs, substantiated by data derived from a multi-wave field study and confirmed by a controlled
experiment, the results show that (1) changes in WOM concerning public health apps change
conspiracy beliefs, (2) the effects of WOM on changes in conspiracy beliefs depend on both the
sender (peer vs. expert) and the receiver’s initial conspiracy beliefs, and (3) increases in conspiracy
beliefs reduce public health app adoption and trigger more negative WOM regarding such apps.
These results should inform health agencies about how to market innovative public health apps.
For consumers with initially low levels of conspiracy beliefs, the distribution of expert WOM
supporting the efficacy of public health apps effectively prevents the development of conspiracy
beliefs and increases app adoption. However, expert WOM is ineffective in reducing conspiracy
beliefs among firm conspiracy believers. These consumers should instead be targeted by campaigns
distributing peer WOM that highlights an app’s benefits and contradicts conspiracy theories.
Practitioner Points
Consumer conspiracy beliefs are a major threat to the success of public health apps.
Negative WOM about public health apps fosters conspiracy beliefs and sets a negative
WOM cycle in motion.
Consumers with high initial conspiracy beliefs should be targeted with positive WOM by
peers but not by experts.
1. INTRODUCTION
As societies have become increasingly digitized, innovative mobile health applications (apps) have
become an integral aspect of public health management (Budd et al., 2020). These public health
apps, which are issued by government agencies in an effort to improve public health, can be
employed for different purposes, such as tracing the spread of disease, enabling access to health
data for scientists, or providing information to disaster responders (CDC, 2022). Although
innovative public health apps have immense potential to improve public health and benefit the
individual user, their introduction is often met with skepticism, igniting heated debates among
experts and consumers (Trang et al., 2020).
The intense debates concerning innovative public health apps, which mainly occur on social
media in the form of word of mouth (WOM), are often fueled by conspiracy theories, which link
the apps to a hidden and evil purpose. The importance of WOM and conspiracy theories in relation
to the adoption of public health apps can be attributed to the apps’ central characteristics. Public
health apps touch on the sensitive issue of health and so pose potential risks for consumers, which
elevates the importance of WOM (Lin and Fang, 2006, Ram and Sheth, 1989). Moreover, public
health apps are issued by governments, which results in a rich breeding ground for conspiracy
theories (e.g., claims that the apps are used to control the population) (Douglas et al., 2017).
The controversy surrounding public health apps and the related role of conspiracy theories
became very apparent during the COVID-19 pandemic when many countries issued tracing apps
to identify and warn individuals who may have been in contact with infected persons (Trang et al.,
2020). Although public agencies praised the apps as key instruments for limiting the spread of
COVID-19, fierce debates involving experts and consumers arose, and adoption rates in countries
in which app usage was not mandated were much lower than expected (Seto et al., 2021). Within
these debates, conspiracy theories such as the claim that the Gates Foundation and corrupt
politicians had orchestrated the pandemic and intended to exploit the tracing apps to control
unwitting populations played a central role.
While it is reasonable to assume that conspiracy beliefs impede the diffusion of public
health apps, prior research provides few insights into this matter beyond anecdotal evidence and
findings showing that conspiracy beliefs hinder other health measures (e.g., vaccination, HIV
treatment) (Bogart et al., 2010, Jolley and Douglas, 2014). Moreover, extant studies do not reveal
how conspiracy beliefs shape consumer engagement in WOM and consumer reactions to WOM
they receive from peers and experts. However, such insights are crucial if policymakers are to
successfully market public health apps, especially since intense debates on social media and
conspiracy theories flourish during times of crisis, which is when public health apps are most
needed (Jolley and Douglas, 2017). We aim to address these research gaps by elucidating (a) how
WOM by peers and experts concerning public health apps changes consumers’ conspiracy beliefs,
(b) how consumers’ initial conspiracy beliefs influence these effects, and (c) how consumers’
conspiracy beliefs affect the adoption of public health apps and outgoing WOM about the apps.
Addressing these issues requires an interdisciplinary approach combining insights from
innovation research that provides crucial findings on the effects of WOM and innovation adoption
(but little on conspiracy beliefs) with insights from political psychology research that provides
important findings on conspiracy beliefs (but little on WOM and adoption). In particular, we merge
insights into the influence of WOM on adoption processes and social influence as the underlying
mechanism (Abrams and Hogg, 1990, Kawakami et al., 2013) with insights into how conspiracy
beliefs emerge and influence information processing (Douglas et al. 2017, 2019). This novel
theoretical framework allows us to explore how the interplay between conspiracy beliefs and WOM
affects the adoption of public health apps. Accounting for the dynamics of social interaction, we
examine changes in both WOM and conspiracy beliefs over time. More specifically, we posit that
a change in the extent to which consumers receive negative WOM (NWOM) and positive WOM
(PWOM) from peers and experts concerning a public health app will lead to a change in their
conspiracy beliefs, which will affect app adoption and consumers’ outgoing WOM regarding the
apps. We further propose that the influence that changes in the different types of WOM exert on change
in conspiracy beliefs depends on consumers’ initial level of conspiracy beliefs. For instance, based
on the idea that consumers with high initial conspiracy beliefs view increasing expert PWOM as
an indicator that a growing number of experts are part of the conspiracy, we predict that such
consumers will discredit increasing expert PWOM concerning a public health app.
We test our hypotheses within a multi-wave field study focusing on the German COVID-
19 tracing app. The data analysis supports our central predictions, showing (a) that changes in WOM
result in changes in conspiracy beliefs, (b) that such effects depend on the WOM sender and the
consumer’s initial conspiracy beliefs, and (c) that changes in conspiracy beliefs affect both app
adoption and the consumer’s outgoing WOM concerning the app. An experimental study exploring
consumer reactions to a fictional public health app validates the results of the field study and
increases the generalizability of the findings.
Overall, we make four substantial contributions to the literature. First, we complement the
research on technology acceptance in general and public health app adoption in particular (Trang
et al., 2020, Walrave et al., 2020) by providing initial empirical evidence that conspiracy beliefs
impede public health app adoption above and beyond the established drivers. In addition, we
provide detailed insights into the mechanism behind this influence, revealing a twofold process:
(1) individuals who exhibit increasing conspiracy beliefs are less likely to adopt public health apps
because they are increasingly convinced that the government is pursuing an evil agenda in issuing
such apps, and (2) conspiracy beliefs affect how individuals interpret WOM regarding public health
apps, which indirectly influences app adoption. While expert WOM praising a public health app
reduces conspiracy beliefs and increases app adoption among consumers with low initial
conspiracy beliefs, it proves ineffective or even counterproductive among consumers with high
initial conspiracy beliefs. This finding complements prior research revealing that conspiracy beliefs
can reduce the acceptance of fact-based arguments (Jolley and Douglas, 2017). Beyond the
implications for research on public health apps, our findings indicate that innovation research
should consider conspiracy beliefs when exploring the adoption of other public and commercial
innovations that could be associated with conspiracy theories, such as innovations that concern the
sensitive topic of health or collect extensive user data.
Second, aside from showing how conspiracy beliefs affect consumers’ public health app
adoption decisions, we provide insights into how this effect can spread among consumers. More
specifically, consumers who experience increasing conspiracy beliefs tend to voice more NWOM
concerning public health apps. This NWOM fosters conspiracy beliefs among their peers, who then
also tend to express more NWOM about public health apps. These findings suggest a self-
reinforcing loop by which conspiracy beliefs spread and are reinforced in social groups. These
insights complement prior research linking conspiracy beliefs to high social media usage (Enders
et al., 2021), elucidating the role of peer WOM in the spread of conspiracy beliefs.
Third, we provide important insights into how health agencies can employ WOM marketing
to both reduce conspiracy beliefs and increase public health app adoption. In particular, our results
reveal the need to consider individuals’ initial levels of conspiracy beliefs when employing WOM
marketing. Although the dissemination of expert PWOM concerning public health apps is useful
in preventing the rise of conspiracy beliefs (i.e., when conspiracy beliefs are still at a low level), it
is ineffective or even counterproductive in reducing conspiracy beliefs held by committed
conspiracy believers. Among individuals with substantial initial conspiracy beliefs, WOM from
peers that contradicts conspiracy theories can reduce conspiracy beliefs and increase app adoption.
Thus, these consumer segments should be targeted with marketing campaigns encouraging peer-
to-peer PWOM (e.g., by providing shareable content).
Fourth, we provide novel insights into factors influencing the effects of WOM in innovation
adoption processes, which should inform innovation research beyond the topic of conspiracy
beliefs and public health apps. Our findings that initial conspiracy beliefs influence how consumers
react to WOM and that this influence differs between peer and expert WOM show that the effects
of WOM can depend on an interplay between the WOM sender’s characteristics and the
consumer’s pre-existing attitudes. Existing innovation research has paid substantial attention to the
influence of the type of WOM communication (e.g., personal vs. virtual) (e.g., Kawakami and
Parry, 2013, Parry et al., 2012), but little attention to the WOM sender’s characteristics, the
consumer’s pre-existing attitudes, and the interaction between these factors. Consideration of these
factors could provide novel insights into the influence of WOM on innovation adoption processes.
2. THEORETICAL BACKGROUND
2.1. WOM and Social Influence
WOM, which refers to informal communication concerning the assessment of a product or service
(Anderson, 1998), substantially influences consumers’ innovation adoption decisions, especially
when innovations are perceived as risky (Lin and Fang, 2006, Parry et al., 2012). WOM can be
disseminated through different channels (e.g., personal or virtual); however, research emphasizes
the crucial impact of WOM delivered via virtual channels (Kawakami et al., 2013). Besides the
communication channel, WOM can be differentiated based on criteria such as the WOM message’s
content or the WOM sender’s characteristics (Babić Rosario et al., 2016, Bansal and Voyer, 2000).
Our conceptual development relies on two criteria to distinguish four types of WOM that
are expected to impact conspiracy beliefs and public health app-related outcomes differently. First,
based on the valence, we differentiate between PWOM (WOM favoring the innovation) and
NWOM (WOM criticizing the innovation), as consequences of WOM tend to crucially depend on
the valence of the WOM message (Babić Rosario et al., 2016). Second, based on the WOM sender’s
characteristics, we differentiate between peer WOM (the sender has a social tie to the receiver) and
expert WOM (the sender is an expert, who reaches receivers beyond social contacts), as extant
research shows that consumers can react very differently to WOM by peers and experts (Keh and
Sun, 2018). Next, we will present theoretical insights into social influence, which are crucial to
understanding the impact of WOM.
Social influence describes a process by which individuals alter their attitudes, beliefs, or
behaviors based on social interaction (Hu et al., 2019). Generally, two types of social influence
can be distinguished: normative and informational social influence (Deutsch and Gerard, 1955,
Kuan et al., 2014). Normative social influence describes a subjective pressure to comply with the
attitudes, beliefs, and behavior of valued individuals or social groups (Abrams and Hogg, 1990).
By agreeing with a group’s judgment, individuals can increase their identification with the group
and enhance their status within it; thus, conformity offers social rewards such as a sense of
belonging and social acceptance (Kuan et al., 2014). For instance, if a consumer’s peer group
voices WOM linking a public health app to a conspiracy, the group consciously or unconsciously
puts social pressure on the consumer to conform with the group’s beliefs. A failure to conform
threatens the consumer’s objective or subjective belonging to, and status within, the group.
Informational social influence describes a process by which individuals view the
information that social actors provide to be compelling and so alter their attitudes, beliefs, or
behaviors based on it (Abrams and Hogg, 1990, Broekhuizen et al., 2011). Individuals who receive
ambiguous information and who are uncertain about the correct decision are particularly prone to
informational social influence (Hu et al., 2019). The more frequently information is presented, and
the more individuals voice the relevant opinion, the more social influence the information exerts (Babić
Rosario et al., 2016). Moreover, the characteristics of an information source determine its effect,
with individuals assigning more weight to information from individuals with whom they share
social ties (Hofstetter et al., 2018) or whom they consider to be experts (Abrams and Hogg, 1990). For
example, when exposed to increasing WOM from peers or experts who link a public health app to
a conspiracy theory (or refute such a link), individuals can be persuaded to adopt the same opinion.
2.2. Conspiracy Beliefs
A conspiracy is a “secret plot by two or more powerful actors” who behave malevolently and
illegitimately (Douglas et al., 2019; p. 4). Conspiracy theories blame such secret plots for important
events (Douglas et al., 2017). Even though some conspiracy theories have turned out to be true
(e.g., the Watergate scandal), they are typically counterfactual or implausible (van Prooijen and
Van Vugt, 2018). Despite diverse conceptualizations of conspiracy theories in the literature, most
authors agree that conspiracy theories involve the basic beliefs that “(a) nothing happens by chance;
(b) nothing is what it seems; (c) everything interconnects with everything” (Orosz et al., 2016; p.
1). Popular conspiracy theories are, for instance, that NASA staged the moon landings, that
governments use radiation for mind control and that tin foil hats protect against this control, and
that the Gates Foundation developed COVID-19 in cooperation with various governments.
In most conspiracy theories, entire governments or influential units within governments
play a critical role, either as the central actor or as the puppet of a secret organization. Conspiracy
theories provide alternatives to official explanations (Jolley et al., 2018). We use the term
“conspiracy belief” to describe the belief in a set of conspiracy theories (Douglas et al., 2019).
Most individuals who believe in one conspiracy theory also embrace multiple other (even unrelated
or possibly contradictory) conspiracy theories (Goertzel, 1994).
Although prior research has linked conspiracy beliefs to a variety of sociodemographic
factors (e.g., low education, unemployment) (Freeman and Bentall, 2017), such beliefs can be
found across the entire population (Uscinski and Parent, 2014), as they promise to satisfy salient
psychological needs (Douglas et al., 2017). Three types of needs from system justification theory
explain the attraction of conspiracy beliefs: epistemic, existential, and social needs (Douglas et al.,
2017, Jost and Andrews, 2011). These needs also explain why individuals tend to maintain
conspiracy beliefs even when confronted with contradictory information.
Epistemic needs are based on the human tendency to believe that significant events must
have been planned by someone (Orosz et al., 2016). Thus, individuals seek causal explanations for
salient events to maintain an internally consistent understanding of the environment (Douglas et
al., 2017). When individuals face uncertainty, conspiracy theories help them to make sense of their
environment (Sunstein and Vermeule, 2009). Therefore, conspiracy beliefs flourish during times
of unforeseeable change or when evidence-based explanations of large-scale events are perceived
as unsatisfactory (Douglas et al., 2019). Conspiracy theories differ from other causal explanations
in two major ways. First, conspiracy theories are speculative, claiming without substantial evidence
that there are extensive actions by powerful actors hidden from the public (Jolley et al., 2018).
Second, conspiracy theories are very resistant to falsification, as the information refuting them is
often discredited by the belief that the individuals providing such information are part of the
conspiracy (Douglas et al., 2017). This self-sealing quality of conspiracy beliefs is built on the
assumption that actors who have the power to plan a conspiracy also have the means to disseminate
information that allegedly debunks it (Sunstein and Vermeule, 2009).
Causal explanations of salient events based on conspiracy theories also contribute to the
satisfaction of existential needs for security, safety, and control (Douglas et al., 2017). Thus, during
times when individuals are anxious and their existential needs are subjectively threatened,
conspiracy beliefs provide a certain and conclusive narrative that satisfies such needs (Grzesiak-
Feldman, 2013). Although conspiracy beliefs typically involve the idea that society is controlled
by untrustworthy and malicious individuals (implying an existential threat), knowledge of these
plots and an understanding of how the world works provide a sense of control (Douglas et al.,
2019). As a consequence, information that challenges conspiracy beliefs is likely to be perceived
as an existential threat. Therefore, existential needs tend to uphold conspiracy beliefs.
Individuals exhibit inherent social needs in terms of fostering a positive self-identity and a
positive social identity (Ashforth and Mael, 1989, Douglas et al., 2019). Conspiracy beliefs can
satisfy these needs by shifting the blame for negative events away from the self or an in-group
toward external groups such as the government or other alleged conspirators (Douglas et al., 2017).
Furthermore, communally held conspiracy beliefs can both strengthen social bonds and improve
social status by fostering a feeling of belonging to an exclusive group that possesses important
knowledge (Douglas et al., 2019). Social needs also tend to maintain conspiracy beliefs, as rejecting
these beliefs would substantially impair an individual’s self and social identity.
3. HYPOTHESIS DEVELOPMENT
3.1. Overview of the Conceptual Model
Building on the theoretical background, we will now present our hypotheses, which rest on the
basic proposition that individuals associate WOM concerning a public health app with the
likelihood of a conspiracy. Given that public health apps are issued by governments, individuals
are likely to connect an app’s design and functionality to the government’s motives and abilities.
As most conspiracy theories claim that influential people within governments are engaged in an
evil plot to harm the majority of the population (van Prooijen and Van Vugt, 2018), information
about a public health app is directly associated with conspiracy beliefs. If an app is presumed to
work well, a conspiracy seems less likely, as the government is apparently pursuing its official
goals. By contrast, the perception that a public health app does not provide its advertised function
increases the possibility that the government is involved in a conspiracy. For instance, if a tracing
app is presumed to be unable to limit the spread of the disease, it could indicate that the government
has ulterior motives and is using the app for other purposes (e.g., controlling users).
In accordance with these arguments, we predict that changes in peer and expert WOM (i.e.,
change in the perceived extent to which peers and experts engage in NWOM and PWOM) cause change in
individuals’ conspiracy beliefs. However, we also posit that individuals’ initial levels of conspiracy
beliefs influence their appraisal of peer and expert NWOM and PWOM, meaning that initial
conspiracy beliefs moderate the effects that peer and expert NWOM and PWOM have on change
in conspiracy beliefs. Finally, we predict that changes in individuals’ conspiracy beliefs affect
public health app adoption and change the valence of individuals’ WOM regarding such apps.
Figure 1 summarizes our conceptual model.
-Insert Figure 1 about here-
3.2. How Change in Perceived Peer WOM Affects Change in Conspiracy Beliefs
Based on insights concerning social influence and the proposition that individuals associate WOM
regarding a public health app with the likelihood of a conspiracy, we predict that a change in peer
WOM causes a change in an individual’s conspiracy beliefs through normative and informational
social influence. In terms of normative social influence, peer WOM exerts social pressure on the
recipient to conform to the peer group’s beliefs so as to maintain a sense of belonging and social
status within the group (Kuan et al., 2014). Informational social influence occurs when arguments
provided in WOM persuade the receiver to adopt the sender’s opinion (Abrams and Hogg 1990).
Individuals pay a great deal of attention to information voiced by people within their social
environment (Hofstetter et al., 2018). Thus, when peers increasingly voice NWOM concerning a
public health app, individuals may feel pressured to agree with such an evaluation (normative
influence) and these arguments could convince them (informational influence). If individuals
believe the increasing peer NWOM concerning the public health app, a conspiracy is subjectively
more likely, meaning that their conspiracy beliefs increase. For example, WOM that questions a
tracing app’s ability to effectively trace contacts could lead an individual to seek an alternative
causal explanation (other than contact tracing) for the existence of the app. Conspiracy theories can
provide such an alternative explanation. When peers increasingly voice PWOM concerning a
public health app, social influence may lead an individual to adopt the increasingly positive group
opinion. In that case, a conspiracy becomes less likely, causing conspiracy beliefs to decrease.
However, we further predict that an individual’s initial conspiracy beliefs substantially
moderate the impact that a change in peer WOM has on the change in their conspiracy beliefs. This
proposition is based on findings of prior studies indicating that individuals who hold firm
conspiracy beliefs tend to hold them due to salient psychological needs (Jolley and Douglas, 2017).
Conspiracy theories promise to fulfill psychological needs by providing individuals with causal
explanations for important developments that appear to make the world more predictable and
secure, in addition to fostering feelings of social belonging and status (Douglas et al., 2017). Thus,
relinquishing conspiracy beliefs potentially leads to undesirable feelings of disorientation and fear,
and threatens social needs, whereas intensifying conspiracy beliefs promises control, safety, social
belonging, and status (Douglas et al., 2019, Jolley et al., 2018).
The greater individuals’ conspiracy beliefs, the greater their unconscious motivation to
maintain and foster such beliefs. Therefore, individuals with high levels of conspiracy beliefs tend
to overestimate the credibility of information supporting those beliefs and to devalue information
contradicting them in order to maintain their psychological well-being (Jolley and Douglas, 2017).
We predict that individuals with high initial conspiracy beliefs place a high value on increasing
peer NWOM regarding public health apps, as such information supports their beliefs. Conversely,
individuals with lower initial conspiracy beliefs are less likely to give credence to increasing peer
NWOM concerning public health apps. Accordingly, we hypothesize that an individual’s initial
conspiracy beliefs positively moderate the positive effect that change in peer NWOM exerts on
change in conspiracy beliefs.
Similarly, we suggest that individuals with high initial conspiracy beliefs tend to devalue
increasing peer PWOM regarding public health apps, as such information contradicts their
conspiracy beliefs and so endangers their psychological well-being. Thus, the negative effects of
increasing peer PWOM on change in conspiracy beliefs should be limited among such individuals.
By contrast, individuals with lower initial conspiracy beliefs tend to find increasing peer PWOM
concerning public health apps more credible, meaning that change in peer PWOM has greater
effects on change in conspiracy beliefs among these individuals. In summary, we hypothesize:
H1a: Change in peer NWOM positively affects change in conspiracy beliefs: an increase
(decline) in peer NWOM causes an increase (decline) in conspiracy beliefs.
H1b: The positive effect of change in peer NWOM on change in conspiracy beliefs is enhanced
by initial conspiracy beliefs, such that the positive effect is greater at higher levels of initial
conspiracy beliefs.
H2a: Change in peer PWOM negatively affects change in conspiracy beliefs: an increase
(decline) in peer PWOM causes a decline (increase) in conspiracy beliefs.
H2b: The negative effect of change in peer PWOM on change in conspiracy beliefs is mitigated
by initial conspiracy beliefs, such that the negative effect is smaller at higher levels of initial
conspiracy beliefs.
3.3. How Change in Perceived Expert WOM Affects Change in Conspiracy Beliefs
We posit that WOM by experts also causes change in conspiracy beliefs through normative and
informational social influence. However, we further propose that the effect of change in expert
WOM on change in conspiracy beliefs varies substantially depending on both the initial level of
conspiracy beliefs and the type of expert WOM (NWOM vs. PWOM).
Change in expert NWOM and change in conspiracy beliefs. The status of an expert signifies
a certain reputation (Brown et al., 2007). Thus, individuals could view experts as a desirable social
group, which would enable the experts to exert normative social influence. When an individual has
a high regard for experts, adopting the experts’ opinion will establish social identification with
them and improve the individual’s subjective social status (Kuan et al., 2014). In addition,
individuals tend to believe that experts have access to privileged information and so are susceptible
to experts’ informational social influence (Abrams and Hogg, 1990). We propose that when
individuals perceive an increase in NWOM by experts concerning public health apps, they perceive
a conspiracy to be more likely, which enhances their conspiracy beliefs. Yet, similar to peer
NWOM, we propose that the extent to which change in expert NWOM causes change in conspiracy
beliefs depends on the individual’s initial conspiracy beliefs. Thus, we posit that individuals with
higher conspiracy beliefs are more likely to embrace expert NWOM, as such information supports
their beliefs, and experts are regarded more favorably (Jolley and Douglas, 2017), which enhances
their social influence and the effect of change in expert NWOM on change in conspiracy beliefs.
Individuals with lower initial conspiracy beliefs will be more critical of expert NWOM, which
limits their social influence and the effect on change in conspiracy beliefs. We hypothesize:
H3a: Change in expert NWOM positively affects change in conspiracy beliefs: an increase
(decline) in expert NWOM causes an increase (decline) in conspiracy beliefs.
H3b: The positive effect of change in expert NWOM on change in conspiracy beliefs is enhanced
by initial conspiracy beliefs, such that the positive effect is greater at higher levels of initial
conspiracy beliefs.
Change in expert PWOM and change in conspiracy beliefs. We propose that change in
expert PWOM also affects change in conspiracy beliefs. Yet, we expect that the effect essentially
depends on an individual’s initial conspiracy beliefs. Individuals with lower levels of initial
conspiracy beliefs may view experts voicing PWOM as a desirable social group, and positive
expert WOM will have some credibility. Thus, we propose that increasing expert PWOM
concerning public health apps will exert normative and informational social influences on these
individuals, who will consider a governmental conspiracy increasingly unlikely.
However, we expect divergent effects with regard to individuals with higher initial
conspiracy beliefs. Epistemic needs draw individuals to conspiracy theories and also tend to
reinforce them (Douglas et al., 2017). Thus, when individuals who hold strong conspiracy beliefs
are confronted by increasingly contradictory information, they tend to reinterpret such information
so as to maintain a coherent system of cause and effect (Jolley and Douglas, 2017). The most
effective way to discredit information that contradicts conspiracy beliefs is to claim that the
information source is part of the conspiracy (Douglas et al., 2019). This self-sealing quality is
amplified by the characteristics that conspiracy believers tend to attribute to alleged conspirators
(Sunstein and Vermeule, 2009). Thus, most conspiracy theories imply that the conspirators are
treacherous and wield immense power. For instance, the conspiracy theory that the Gates
Foundation and the “deep state” orchestrated the COVID-19 pandemic depends on the belief that
the alleged conspirators are extremely evil and powerful enough to carry out a plot of this magnitude.
Accordingly, if individuals believe in conspiracy theories, it seems reasonable for them to assume
that conspirators are willing and able to spread information that contradicts the conspiracy theory.
While individuals can reinterpret peer WOM to some extent in an effort to uphold their
conspiracy beliefs (see H2b), it appears unlikely that even individuals with firm conspiracy beliefs
consider their peers to be part of a conspiracy. Individuals typically possess private information
about their peers, which makes it unlikely that those peers are part of an evil conspiracy. Moreover,
peers typically have only very limited influence on public opinion, which would make them a poor
mouthpiece for conspirators. By contrast, experts have a high media presence, and so exert great
influence on public opinion. In addition, experts often interact with governments and may be seen
as part of the societal elite. Thus, individuals with strong conspiracy beliefs could infer that experts
are an effective tool used by conspirators to manipulate public opinion or are part of the conspiracy.
In light of this, we propose that individuals with strong conspiracy beliefs who are
confronted with increasing expert PWOM concerning public health apps (i.e., WOM opposing their
conspiracy beliefs) not only discredit such information, but also conclude that the conspiracy is
even bigger than initially thought. The perception that growing numbers of experts are part of the
conspiracy or that conspirators are increasingly able to control expert opinion is likely to strengthen
conspiracy beliefs. Thus, we posit that among individuals with firm conspiracy beliefs, increasing
expert PWOM strengthens conspiracy beliefs. We hypothesize:
H4: Change in expert PWOM affects change in conspiracy beliefs: when the initial conspiracy
beliefs are low, an increase (decline) in expert PWOM causes a decline (increase) in
conspiracy beliefs; when the initial conspiracy beliefs are high, an increase (decline) in
expert PWOM causes an increase (decline) in conspiracy beliefs.
3.4. Behavioral Consequences of Change in Conspiracy Beliefs
Most conspiracy theories are grounded in the notion that powerful people within governments
conceal their true motives and act against the public’s interests (Sunstein and Vermeule, 2009).
Thus, individuals who hold conspiracy beliefs tend to have little trust in government agencies and
are often skeptical of government actions. This skepticism is more pronounced when government
actions involve sensitive or high-risk issues, such as privacy or personal health. This is evident in
prior research showing that conspiracy beliefs counteract government initiatives intended to
increase vaccination rates (Jolley and Douglas, 2017). Public health apps are issued by
governments and are typically associated with significant privacy and health concerns (Trang et
al., 2020). It is reasonable to assume, therefore, that conspiracy beliefs involving governments
influence the adoption of such apps. Accordingly, we predict that an increase in conspiracy beliefs
will raise doubts about government actions and so decrease the likelihood of an individual adopting
a public health app. Conversely, as conspiracy beliefs decrease, an individual places more trust in
the government and so has a higher probability of adopting a public health app.
Furthermore, we expect that changes in an individual’s conspiracy beliefs also influence how
that individual intends to communicate with peers about the app. With increasing conspiracy
beliefs, individuals are increasingly skeptical about the purpose and benefits of public health apps
and, therefore, feel an increasing need to warn their peers and discourage app adoption. Thus, we
propose that with increasing conspiracy beliefs, individuals intend to voice more negatively
valenced WOM to peers about public health apps. By contrast, with decreasing conspiracy beliefs,
individuals find it increasingly likely that an app’s advertised purpose is credible and so are likely
to voice more positively valenced WOM to peers regarding it. In summary, we hypothesize:
H5: Change in conspiracy beliefs negatively affects public health app adoption: an increase
(decline) in conspiracy beliefs causes a declining (increasing) likelihood of public health
app adoption.
H6: Change in conspiracy beliefs affects change in the WOM valence on the public health app:
an increase (decline) in conspiracy beliefs causes increasingly negatively (positively)
valenced WOM on the public health app.
4. FIELD STUDY
4.1. Research Context and Data Collection
To test the proposed model, we rely on a unique longitudinal dataset collected via a multi-wave
panel study conducted in Germany during the COVID-19 pandemic, both before and after the
official voluntary tracing app was released in 2020. We deem this setting particularly suitable for
investigating the interplay between WOM, conspiracy beliefs, and public health app adoption for
three key reasons. First, tracing apps exemplify innovative public health apps that attract attention
and provoke debate. On the one hand, tracing apps invade the privacy of individuals because they
require access to sensitive information regarding their social interactions (e.g., tracing social
contacts), health status (e.g., COVID-19 test results), and other personal data (e.g., contact
information). On the other hand, they have the potential to effectively contain COVID-19 and help
society to more quickly return to normal. This tension between potential societal benefits and
possible serious risks to the individual has sparked heated debates in which advocates voice PWOM
and critics voice NWOM in both private and public settings. Second, individuals are likely to be
receptive to WOM during pandemics. Most individuals have no experience with tracing apps and
lack the technological knowledge required to assess whether using an app puts them at risk. Thus,
evaluations of the app that are communicated via WOM will strongly affect individuals’ views on
the matter. Third, COVID-19-related conspiracy theories flourished in 2020 and substantially
influenced debates regarding the tracing app.
We recruited our study participants via Clickworker, a large Western European
crowdsourcing platform, and collected data from 565 individuals. To enhance the effort invested
and avoid potential biases associated with using professional survey takers recruited through
crowdsourcing platforms (e.g., lack of attentiveness, lack of skills, non-independence of
participants), we applied various procedural remedies: including attention and comprehension
checks, offering a moderate monetary incentive as well as a warning that participants would not be
paid if they were inattentive, emphasizing the importance of the study, and choosing neutral
wording (Hulland and Miller, 2018). After the data collection, we matched the responses from the
different waves and screened them for exclusion criteria such as click-through patterns. The final
sample comprised 347 participants (40% female, Mage = 32.46) who completed the surveys during
all waves and fulfilled all the conditions, leading to an effective response rate of 61.4% across the
three survey waves. After the initial survey in April (t0), in which we asked the respondents about
time-invariant and basic personality traits, sociodemographic characteristics, and first impressions
of the tracing app, the first survey wave (t1) commenced at the end of May, when the app was
officially announced by the government but prior to its release. The second wave (t2) began at the
end of June (two weeks after the app’s release) and the third wave (t3) at the end of August (two
and a half months after the app’s release).
4.2. Measures
Unless otherwise noted, we measured all the variables using validated multi-item scales, which
were adapted to the context of this study where necessary. Web Appendix A.1 provides an
overview of all the items and the construct reliabilities. We measured WOM valence (i.e., the
valence of the intended outgoing WOM) with three items adapted from Maxham and Netemeyer
(2002) using a seven-point semantic differential. We measured app adoption with a single item
capturing self-reported behavior regarding app installation. For all the remaining multi-item
variables, we used seven-point Likert scales anchored by 1 = “do not agree” and 7 = “fully agree.”
We used six items from Imhoff and Bruder (2014) to measure conspiracy beliefs. To measure
perceived peer PWOM (PPWOM), peer NWOM (PNWOM), expert PWOM (EPWOM), and
expert NWOM (ENWOM), we created three items each, based on a scale by Trenz et al. (2018).
As the control variables for app adoption and WOM valence, we used the established
drivers of user behavior in a technology acceptance context, namely perceived ease of use,
perceived usefulness (both Davis, 1989), and subjective norms (Venkatesh et al., 2003), in addition
to the sociodemographic variables of age, gender, and education. For all the multi-item constructs,
the Cronbach’s alphas were greater than .80 (lowest: .88) and the composite reliability statistics
were greater than the recommended cut-off of .70 (lowest: .93), indicating measurement reliability.
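For illustration only, the following Python sketch (not the authors' code; item names and the example loadings are hypothetical) shows one common way to compute Cronbach's alpha and composite reliability for a multi-item construct such as conspiracy beliefs.

```python
# Minimal, illustrative sketch of Cronbach's alpha and composite reliability.
# Item names and loadings are hypothetical, not taken from the study data.
import numpy as np
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """items: respondents x items matrix for a single construct."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

def composite_reliability(loadings: np.ndarray) -> float:
    """loadings: standardized CFA loadings for a single construct."""
    num = loadings.sum() ** 2
    return num / (num + (1 - loadings ** 2).sum())

# Hypothetical usage:
# alpha = cronbach_alpha(df[["cb1", "cb2", "cb3", "cb4", "cb5", "cb6"]])
# cr = composite_reliability(np.array([.85, .88, .90, .82, .87, .84]))
```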
Measurement model. We conducted a confirmatory factor analysis based on all the latent
variables to examine our measurement model. The model showed an acceptable model fit: χ2(369)
= 788.87, comparative fit index = .963, Tucker-Lewis index = .957, root mean square error of
approximation = .058 (90% lower-level confidence interval = .052; upper-level confidence interval
= .063), and standardized root mean square residual = .043. The descriptive statistics and the
correlation matrix are available in Web Appendix A.2.
Construct validity. To examine the construct validity, we first relied on Fornell and Larcker’s
(1981) approach to assess convergent validity. The average variance extracted for each
multiple-item construct exceeded .50, suggesting adequate convergent validity. We then employed
the heterotrait-monotrait (HTMT) method to assess the discriminant validity (Voorhees et al.,
2016). Estimation of the HTMT ratio for all the latent constructs yielded values ranging from .02
to .69, which were below the threshold of .85. The largest upper limit of the 95% bias-corrected
confidence intervals for all the constructs was .75, further indicating the discriminant validity.
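As a rough sketch of how such HTMT ratios can be obtained (illustrative only; not the authors' code, with hypothetical item names), the heterotrait-heteromethod correlations are averaged and divided by the geometric mean of the average within-construct item correlations:

```python
# Minimal sketch of the HTMT ratio for two constructs (Voorhees et al., 2016).
# Item column names are hypothetical.
import numpy as np
import pandas as pd

def htmt(data: pd.DataFrame, items_a: list[str], items_b: list[str]) -> float:
    corr = data[items_a + items_b].corr().abs()
    # Average heterotrait-heteromethod correlation (between-construct items).
    hetero = corr.loc[items_a, items_b].to_numpy().mean()
    # Average monotrait-heteromethod correlation (within-construct, off-diagonal).
    def mono(items: list[str]) -> float:
        c = corr.loc[items, items].to_numpy()
        return c[np.triu_indices(len(items), k=1)].mean()
    return hetero / np.sqrt(mono(items_a) * mono(items_b))

# Hypothetical usage; values below .85 indicate discriminant validity:
# ratio = htmt(df, ["cb1", "cb2", "cb3"], ["pnwom1", "pnwom2", "pnwom3"])
```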
4.3. Estimating Changes in Variables
In accordance with recent literature (Kraemer et al., 2020), we employed mixed-effects growth-
curve modeling to capture the temporal changes in our focal variables as slopes, rather than
computing the difference score. This allowed us to estimate the individual-specific variable
changes over time, and it also accounted for inter-individual differences in these changes. This led
to less biased and more precise estimates. Web Appendices A.3 and A.4 explain how we considered
potential common-method variance and obtained the change scores used as indicators of change in
the variables in our analysis, respectively.
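For readers unfamiliar with this approach, the following sketch (assumed, not the authors' code) illustrates how respondent-specific change slopes can be estimated with a mixed-effects growth-curve model and recovered as empirical Bayes estimates; the column names ('cb' for conspiracy beliefs, 'wave' for the survey wave, 'id' for the respondent) are hypothetical.

```python
# Minimal sketch: individual change slopes via a mixed-effects growth-curve model.
import pandas as pd
import statsmodels.api as sm

def eb_change_scores(long_df: pd.DataFrame, outcome: str = "cb") -> pd.Series:
    """long_df: long-format data with one row per respondent and wave."""
    model = sm.MixedLM.from_formula(
        f"{outcome} ~ wave",   # fixed intercept and time slope
        groups="id",           # random effects per respondent
        re_formula="~wave",    # random intercept and random slope
        data=long_df,
    )
    result = model.fit(reml=True)
    # Respondent-specific slope = fixed time slope + empirical Bayes slope deviation.
    fixed_slope = result.fe_params["wave"]
    return pd.Series({rid: fixed_slope + re["wave"]
                      for rid, re in result.random_effects.items()})
```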
4.4. Hypothesis Testing
Model specification. Testing the equation system resulting from our framework (Figure 1)
required consideration of two key characteristics of the data. First, we considered the potential
correlation of the error terms across the resulting set of theoretically linked equations (Kashyap et
al., 2012). Second, as we combined continuous (change in conspiracy beliefs and WOM valence)
and binary (app adoption) dependent variables in the equation system, we made different
assumptions regarding their respective distributions and specified the normal distribution for the
former and the logistic distribution for the latter. Our equation system consisted of three equations
with app adoption, WOM valence, and conspiracy beliefs as the dependent variables. In each
equation with a change score as the dependent variable (i.e., change in conspiracy beliefs and
WOM valence), we controlled for the scores of the respective dependent variables at t1 to consider
the starting point of each slope. We simultaneously estimated the following equation system:
\begin{aligned}
\text{APP ADOPTION}_{i,t_3} ={}& \beta_{10} + \beta_{11}\,\text{CB CHANGE}_{i,t_1\text{--}t_3} + \beta_{12}\,\text{EOU}_{i,t_0} + \beta_{13}\,\text{PEU}_{i,t_3} + \beta_{14}\,\text{SUN}_{i,t_3} \\
&+ \beta_{15}\,\text{ICB}_{i,t_1} + \beta_{16}\,\text{PNWOM}_{i,t_3} + \beta_{17}\,\text{PPWOM}_{i,t_3} + \beta_{18}\,\text{ENWOM}_{i,t_3} + \beta_{19}\,\text{EPWOM}_{i,t_3} \\
&+ \beta_{110}\,\text{AGE}_{i} + \beta_{111}\,\text{FEM}_{i} + \beta_{112}\,\text{ACA}_{i} + \varepsilon_{1i} \quad (1)
\end{aligned}

\begin{aligned}
\text{WOM VALENCE CHANGE}_{i,t_1\text{--}t_3} ={}& \beta_{20} + \beta_{21}\,\text{CB CHANGE}_{i,t_1\text{--}t_3} + \beta_{22}\,\text{EOU}_{i,t_0} + \beta_{23}\,\text{PEU}_{i,t_3} + \beta_{24}\,\text{SUN}_{i,t_3} \\
&+ \beta_{25}\,\text{ICB}_{i,t_1} + \beta_{26}\,\text{WOI}_{i,t_1} + \beta_{27}\,\text{AGE}_{i} + \beta_{28}\,\text{FEM}_{i} + \beta_{29}\,\text{ACA}_{i} + \varepsilon_{2i} \quad (2)
\end{aligned}

\begin{aligned}
\text{CONSPIRACY BELIEFS CHANGE}_{i,t_1\text{--}t_3} ={}& \beta_{30} + \beta_{31}\,\text{PNWOM CHANGE}_{i,t_1\text{--}t_3} + \beta_{32}\,\text{PPWOM CHANGE}_{i,t_1\text{--}t_3} \\
&+ \beta_{33}\,\text{ENWOM CHANGE}_{i,t_1\text{--}t_3} + \beta_{34}\,\text{EPWOM CHANGE}_{i,t_1\text{--}t_3} \\
&+ \beta_{35}\,\text{PNWOM CHANGE}_{i,t_1\text{--}t_3} \times \text{ICB}_{i,t_1} + \beta_{36}\,\text{PPWOM CHANGE}_{i,t_1\text{--}t_3} \times \text{ICB}_{i,t_1} \\
&+ \beta_{37}\,\text{ENWOM CHANGE}_{i,t_1\text{--}t_3} \times \text{ICB}_{i,t_1} + \beta_{38}\,\text{EPWOM CHANGE}_{i,t_1\text{--}t_3} \times \text{ICB}_{i,t_1} \\
&+ \beta_{39}\,\text{ICB}_{i,t_1} + \beta_{310}\,\text{PNWOM}_{i,t_1} + \beta_{311}\,\text{PPWOM}_{i,t_1} + \beta_{312}\,\text{ENWOM}_{i,t_1} + \beta_{313}\,\text{EPWOM}_{i,t_1} \\
&+ \beta_{314}\,\text{AGE}_{i} + \beta_{315}\,\text{FEM}_{i} + \beta_{316}\,\text{ACA}_{i} + \varepsilon_{3i} \quad (3)
\end{aligned}
where CB CHANGEi,t1–t3 refers to the empirical Bayes estimates of the change in conspiracy
beliefs; EOUi,t0 refers to the perceived ease of use at t0; PEUi,t3 refers to the perceived usefulness
at t3; SUNi,t3 refers to the subjective norms at t3; ICBi,t1 refers to the initial scores for conspiracy
beliefs at t1; PNWOMi, PPWOMi, ENWOMi, and EPWOMi refer to the absolute values of the
perceived WOM types at t1 and t3, respectively; PNWOM CHANGEi,t1–t3, PPWOM CHANGEi,t1–
t3, ENWOM CHANGEi,t1–t3, and EPWOM CHANGEi,t1–t3 refer to the empirical Bayes estimates of
the changes in the perceived WOM types; AGEi refers to a subject’s age; FEMi indicates whether
the subject is female; ACAi refers to subjects who have a degree in higher education (i.e.,
academics); and ε1i, ε2i, and ε3i refer to the respective error terms of subject i.
Endogeneity and attrition bias. To correct potential endogeneity resulting from simultaneity
in the conspiracy beliefs change model (Ebbes et al., 2017), we computed Gaussian copulas
associated with the different WOM types and included them in our model estimation of Eq. 1 (see
Web Appendix A.5 for details). We also control for potential attrition bias across the three survey
waves by computing the inverse Mills ratio (Heckman correction factor) and including it in Eq.
1–3 (see Web Appendix A.6 for details). Prior to the model estimation, we orthogonalized all the
interacting covariates (i.e., perceived WOM changes per type and initial conspiracy beliefs) and
the copula terms to address any multicollinearity concerns (Sine et al., 2003).
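To make these two correction terms concrete, the following sketch (assumed, not the authors' procedure; variable names are hypothetical) shows how a Park-and-Gupta-style Gaussian copula for a potentially endogenous regressor and the inverse Mills ratio from a retention probit are typically computed before being added as regressors.

```python
# Minimal sketch of the Gaussian copula term and the inverse Mills ratio.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from scipy.stats import norm

def gaussian_copula(x: np.ndarray) -> np.ndarray:
    """Phi^{-1} of the empirical CDF of the endogenous regressor."""
    n = len(x)
    ranks = pd.Series(x).rank(method="average").to_numpy()
    ecdf = ranks / (n + 1)            # keeps values strictly inside (0, 1)
    return norm.ppf(ecdf)

def inverse_mills_ratio(retained: np.ndarray, covariates: pd.DataFrame) -> np.ndarray:
    """IMR from a probit of wave retention (1 = completed all waves) on covariates."""
    X = sm.add_constant(covariates)
    probit = sm.Probit(retained, X).fit(disp=0)
    xb = X.to_numpy() @ probit.params.to_numpy()   # linear index x'gamma
    return norm.pdf(xb) / norm.cdf(xb)

# Both terms would then enter the respective equations as additional regressors.
```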
Results. To test our hypotheses, we relied on seemingly unrelated regression to estimate our
equation system, which allowed us to jointly estimate Eq. 1–3 (Gruner et al., 2019). The choice of
model was supported by a significant Breusch-Pagan test, which indicated that the regression
equations were significantly correlated (χ2(3) = 10.832; p < .05).
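For three equations, this Breusch-Pagan statistic is simply the sample size times the sum of squared pairwise residual correlations, chi-square distributed with three degrees of freedom; the sketch below (illustrative only, not the authors' code) shows the computation from single-equation residuals.

```python
# Minimal sketch of the Breusch-Pagan LM test of cross-equation residual
# correlation that motivates joint (seemingly unrelated) estimation.
import numpy as np
from itertools import combinations
from scipy.stats import chi2

def breusch_pagan_lm(residuals: np.ndarray) -> tuple[float, float]:
    """residuals: N x M matrix of residuals from M separately estimated equations."""
    n, m = residuals.shape
    corr = np.corrcoef(residuals, rowvar=False)
    lm = n * sum(corr[i, j] ** 2 for i, j in combinations(range(m), 2))
    df = m * (m - 1) // 2
    return lm, chi2.sf(lm, df)

# Usage: stack the residuals of the adoption, WOM-valence, and conspiracy-belief
# equations column-wise; a significant statistic supports joint estimation.
```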
Table 1 displays the results, which indicate the positive and significant effect of PNWOM
change on conspiracy belief change (β = .133, p < .01), thereby providing support for H1a. We do
not find support for H1b, as initial conspiracy beliefs do not positively moderate the relationship
between PNWOM change and conspiracy belief change (β = −.061, p > .10). The results do not
support H2a either, as PPWOM change has no significant effect on conspiracy belief change (β =
.033, p > .10). We must also reject H2b, as the interaction effect between PPWOM change and
initial conspiracy beliefs on change in conspiracy beliefs is negative and significant (β = −.087, p
< .05), whereas our hypothesis suggested it to be positive. As this represents a particularly
noteworthy result, the negative interplay is illustrated in Figure 2 (Panel A), which depicts the
predicted marginal effect of PPWOM change on change in conspiracy beliefs alongside the
observed range of initial conspiracy beliefs. For lower initial conspiracy beliefs, increasing
PPWOM leads to positive changes in conspiracy beliefs, while it leads to negative changes for
higher initial conspiracy beliefs. This is surprising, as individuals with higher initial conspiracy
beliefs do not seem to discredit increasing PPWOM; rather, they give more credence to peer
support for the tracing app as “social proof” that there is no conspiracy, thereby overturning their
prior conspiracy beliefs (leading to a negative change). By contrast, individuals with lower initial
conspiracy beliefs appear to begin deliberating conspiracy beliefs when confronted with peaks in
PPWOM. Thus, while some individuals with low conspiracy beliefs seem to show increasing
conspiracy beliefs as a consequence, others have little room to reduce their conspiracy beliefs
further (as their initial conspiracy beliefs are already close to the baseline level).
-Insert Table 1 and Figure 2 about here-
The results support H3a, showing that the effect of ENWOM change on change in
conspiracy beliefs is positive and significant (β = .267, p < .001). However, this positive effect is
not enhanced by initial conspiracy beliefs (β = −.034, p > .10), meaning that we reject H3b.
H4 postulated that change in EPWOM affects change in conspiracy beliefs, where it is
expected that for lower initial conspiracy beliefs, an increase in EPWOM will cause a decrease in
conspiracy belief change, whereas, for higher initial conspiracy beliefs, an increase in EPWOM
will cause an increase in conspiracy belief change. The results provide initial evidence in support
of this hypothesis, as the interaction effect between EPWOM change and initial conspiracy beliefs
on conspiracy belief change is positive and significant (β = .084, p < .05), whereas the main effect
of EPWOM change is insignificant (β = −.017, p > .10). To determine whether we find full support
for H4, we illustrate the effect in Figure 2 (Panel B), which shows that when individuals with lower
initial conspiracy beliefs are confronted with increasing EPWOM (in line with their lower
conspiracy beliefs), they reduce their conspiracy beliefs even further. Yet, the predicted change
effects also show that individuals with higher initial conspiracy beliefs tend to retain their current
conspiracy beliefs, as the predicted change scores approach zero for higher initial conspiracy belief
values. As we postulated in H4 that such individuals would likely conclude that the conspiracy is
even bigger than initially thought, thereby resulting in positive change in conspiracy beliefs (rather
than zero), we only find partial support for H4.
Finally, the results support H5 and H6, as change in conspiracy beliefs has significant and
negative effects on both app adoption (β = −3.704, p < .05) and change in WOM valence (β =
−.518, p < .05). That is, individuals who exhibit increasing conspiracy beliefs over time are less
likely to adopt public health apps and more likely to spread more negative WOM about such apps.
5. EXPERIMENTAL VALIDATION STUDY
5.1. Study Goal
The field study on the German COVID-19-tracing app, as a prime example of an innovative public
health app, allowed us to observe the evolution of real conspiracy beliefs and actual app usage over
an extended period. To increase confidence in our findings, we conducted a controlled scenario
experiment that complemented the field study in multiple ways. We employed (1) a different type
of public health app to generalize beyond tracing apps, (2) a fictitious app to avoid any past
experience effects, (3) a context unrelated to the COVID-19 pandemic to extend beyond the
boundaries of this crisis, (4) systematic WOM manipulations in an experimental setting to achieve
high internal validity, (5) a different measure for capturing conspiracy belief outcomes to
demonstrate the robustness of the observed effects (i.e., change in conspiracy beliefs in the context
of a real app over time vs. emerging conspiracy beliefs regarding a described app), and (6) another
cultural context to expand our investigation beyond a single cultural context.
5.2. Design and Participants
We conducted a scenario experiment using a 2 (WOM source: peer vs. expert) × 2 (WOM valence:
negative vs. positive) between-subjects design. We focused on the health monitoring and data
donation context and used a fictitious public health app presented as being issued by the actual Centers for
Disease Control and Prevention (CDC; the US national public health agency). In this way, we
aimed to balance minimizing past experience effects with maintaining a sufficient level of realism
regarding the governmental entity issuing the app. The app is designed to help users monitor their
health and collects data for research on cardiovascular diseases. We recruited 173 US participants
(57% female, Mage = 43) through Prolific Academic (Peer et al., 2017), a major international
crowdsourcing platform.
5.3. Materials
The utilized materials are described in Web Appendices B.1 (app description), B.2, and B.3 (Twitter
Tweets). The app description page mimicked a typical consumer-focused presentation and outlined
how the app allows users to monitor their health and collects data for cardiovascular disease
research while maintaining users’ data privacy. This description was the same for all the conditions
so that participants could perceive the app in isolation from any experimental manipulation.
To manipulate the WOM, we relied on Tweets with an authentic design to ensure realistic
appeal. Based on the WOM scenario descriptions and the Tweets, we employed four different
scenarios: PNWOM, PPWOM, ENWOM, and EPWOM. The WOM scenario descriptions
manipulated the source (peer vs. expert) of the WOM. The peer source was described as a close
and trusted friend, whereas the expert source was described as a distinguished tech expert. We
adjusted the text of the Tweets to manipulate the sentiment behind the WOM message (negative
vs. positive). The negative WOM contained only unfavorable statements (e.g., “New CDC app for
a dubious cause!”) concerning the health app’s functionality, data security, and benefits, whereas
the positive WOM contained only favorable statements (e.g., “New CDC app for a good cause!”).
Manipulation checks indicated that all four WOM manipulations worked as intended (Web Appendix B.4).
5.4. Procedure
The experiment was embedded within a two-wave online questionnaire. We purposefully separated
the initial measurement of conspiracy beliefs at t1 from the experimental treatment at t2 and the
measurement of the dependent variables to reduce common-method variance. In the first
questionnaire, the participants were presented with an instructive text on the CDC and a description
of the recently released public health app (“CDC Health Monitoring & Data Donation Service”).
The participants then answered items measuring general conspiracy beliefs and control variables
driving technology acceptance based on their initial perception of the app. The second data
collection wave started at least four weeks later. After reminding the participants of the public
health app, we randomly assigned them to one of five conditions (i.e., four WOM treatments vs.
control). The participants answered questions on app-specific conspiracy beliefs, app installation
intention, WOM valence, and manipulation checks. To complement the multi-wave, repeated-measures design of the field study and to alleviate concerns about repeated measurement, we measured app-specific conspiracy beliefs in the second wave instead of measuring general conspiracy beliefs again. Web Appendix B.6 presents the items and reliability measures, while Web Appendix
B.7 provides the descriptive statistics and correlation matrix.
5.5. Results
As in the field study, we used seemingly unrelated regressions to estimate three equations. In the
first equation, we regressed app-specific conspiracy beliefs on the four treatment dummy variables
representing peer and expert WOM with negative and positive sentiments, initial conspiracy
beliefs, and their interactions. That is, the reference groups for each of the four treatment dummies
were the control group and the other WOM treatment groups. In the other two equations, we
regressed installation intention and WOM valence on app-specific conspiracy beliefs, perceived
ease of use, perceived usefulness, subjective norms, and initial conspiracy beliefs, replicating the
right-hand side of our conceptual model (i.e., the app adoption and WOM valence models).
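To make this estimation strategy concrete, the following minimal sketch illustrates how such a three-equation system can be estimated as seemingly unrelated regressions in Python. It assumes the `linearmodels` package; the DataFrame, variable names, and synthetic values are hypothetical stand-ins, not the authors' data or code.

```python
import numpy as np
import pandas as pd
from linearmodels.system import SUR  # assumed: linearmodels' SUR estimator

# Hypothetical stand-in for the experimental sample (names are illustrative).
rng = np.random.default_rng(0)
n = 173
df = pd.DataFrame({
    "initial_cb": rng.normal(size=n),   # general conspiracy beliefs at t1 (centered)
    "peou": rng.normal(size=n),         # perceived ease of use
    "pu": rng.normal(size=n),           # perceived usefulness
    "sn": rng.normal(size=n),           # subjective norms
})
# Random assignment to one of five conditions; four treatment dummies with the
# control group as the omitted reference category, plus their interactions.
condition = rng.integers(0, 5, size=n)
for code, name in enumerate(["pnwom", "ppwom", "enwom", "epwom"], start=1):
    df[name] = (condition == code).astype(float)
    df[f"{name}_x_cb"] = df[name] * df["initial_cb"]
df["app_cb"] = rng.normal(size=n)             # app-specific conspiracy beliefs at t2
df["install_intention"] = rng.normal(size=n)
df["wom_valence"] = rng.normal(size=n)

treatments = ("pnwom + ppwom + enwom + epwom + "
              "pnwom_x_cb + ppwom_x_cb + enwom_x_cb + epwom_x_cb")
formulas = {
    "conspiracy": f"app_cb ~ 1 + {treatments} + initial_cb",
    "adoption": "install_intention ~ 1 + app_cb + peou + pu + sn + initial_cb",
    "valence": "wom_valence ~ 1 + app_cb + peou + pu + sn + initial_cb",
}
print(SUR.from_formula(formulas, df).fit())  # joint (SUR) estimates for all three equations
```

Estimating the three equations jointly allows their error terms to be correlated, which is the main reason to prefer seemingly unrelated regressions over three separate OLS regressions in this setup.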
The results of the experimental study support the hypothesized findings of the field study.
The results of the conspiracy beliefs model suggest a positive and significant effect of the PNWOM
treatment on app-specific conspiracy beliefs (β = .672, p < .05), providing further support for H1a.
In contrast to the unexpected finding from the field study, we find no negative interaction effect
between the PPWOM treatment and initial conspiracy beliefs on app-specific conspiracy beliefs (β
= −.178, p > .10). In accordance with the field study and H3a, we find a positive and significant
effect of ENWOM on app-specific conspiracy beliefs (β = .556, p < .05).
This study provides further insights concerning H4, which postulated that EPWOM
decreases conspiracy beliefs among individuals with lower initial conspiracy beliefs and increases
conspiracy beliefs among individuals with higher initial conspiracy beliefs. As in the field study,
we find evidence of the significant positive interaction effect between EPWOM and initial
conspiracy beliefs on app-specific conspiracy beliefs (β = .350, p < .05) and a nonsignificant main effect of EPWOM (β = .305, p > .10). Depicting the interaction effect between EPWOM and initial conspiracy beliefs on app-specific conspiracy beliefs (Figure 3) lends full support to H4. Among
individuals with lower initial conspiracy beliefs, exposure to EPWOM negatively influences app-
specific conspiracy beliefs. By contrast, among individuals with higher initial conspiracy beliefs,
EPWOM exposure positively affects app-specific conspiracy beliefs. All the other effects in the
conspiracy beliefs model were insignificant and, therefore, consistent with the findings of the field
study.
-Insert Figure 3 here-
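As a rough illustration of how the marginal effects depicted in Figure 3 can be derived from the reported coefficients, the sketch below combines the EPWOM main effect and the EPWOM × initial conspiracy beliefs interaction reported above; the grid of belief values and the optional covariance input are assumptions rather than values taken from the paper.

```python
import numpy as np

def epwom_marginal_effect(beta_main, beta_interaction, initial_cb, cov=None):
    """Marginal effect of EPWOM on app-specific conspiracy beliefs at given
    levels of initial conspiracy beliefs: beta_main + beta_interaction * cb.
    If a 2x2 covariance matrix of the two coefficients is supplied,
    delta-method standard errors are returned as well."""
    cb = np.asarray(initial_cb, dtype=float)
    effect = beta_main + beta_interaction * cb
    if cov is None:
        return effect, None
    se = np.sqrt(cov[0, 0] + cb ** 2 * cov[1, 1] + 2 * cb * cov[0, 1])
    return effect, se

# Point estimates reported for the experimental study; the grid is hypothetical.
grid = np.linspace(-4, 4, 9)
effect, _ = epwom_marginal_effect(0.305, 0.350, grid)
print(np.round(effect, 3))  # negative at low initial beliefs, positive at high ones
```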
Finally, in line with the app adoption and WOM valence models from the field study, the
results of the experiment support H5 and H6, as app-specific conspiracy beliefs have negative and
significant effects on both installation intention (β = −.326, p < .001) and WOM valence (β = −.287,
p < .01). The effects of the control variables concerning the established drivers of technology
acceptance exhibit consistent directions and significance levels, as in the field study. Web
Appendix B.8 displays all the results.
6. GENERAL DISCUSSION
6.1. Overview of the Findings
Across a field study and a controlled experiment, we provide empirical evidence in support of our
central proposition that conspiracy beliefs impede the adoption of innovative public health apps.
Table 2 provides a comparison of the two studies that highlights their complementarity in terms of
their design and methodology. Next, we will summarize and discuss the two studies’ key findings.
-Insert Table 2 here-
The two studies confirm that the behavioral consequences of increased conspiracy beliefs
are twofold: (1) increasing conspiracy beliefs essentially reduce consumers’ willingness to adopt
public health apps, and (2) increasing conspiracy beliefs trigger consumers’ increasingly negatively
valenced WOM concerning public health apps. Moreover, the results provide substantial insights
into how WOM can change individuals’ conspiracy beliefs as well as how the level of initial
conspiracy beliefs affects this relationship. Increases in peer NWOM and expert NWOM enhance
an individual’s conspiracy beliefs substantially. In contrast to our expectations, initial conspiracy
beliefs do not moderate these effects. Accordingly, increasing peer NWOM and expert NWOM
increase conspiracy beliefs and lower adoption intentions among consumers (whether they have
high or low initial conspiracy beliefs prior to receiving the WOM).
However, initial conspiracy beliefs affect how consumers process PWOM concerning
public health apps. Consistent across both studies, we find that increasing expert PWOM has no
significant main effect on conspiracy beliefs change, but initial conspiracy beliefs exert a
significant positive moderating influence on this effect. This indicates that the effect of expert
PWOM change on conspiracy beliefs change depends entirely on the initial level of consumers’
conspiracy beliefs. Further analysis reveals that at low levels of initial conspiracy beliefs, expert
PWOM consistently reduces conspiracy beliefs, whereas at high levels, increasing expert PWOM
has no effect (field study) or even a positive effect (experimental validation study). Pointing to the context sensitivity of the magnitude of the observed effect, these results indicate that, in certain circumstances, expert WOM intended to encourage public health app usage and contradict conspiracy theories can have the opposite effect.
One noteworthy discrepancy between the two studies is that the experiment did not replicate the counterintuitive negative interaction effect between peer PWOM and initial conspiracy beliefs on app-specific conspiracy beliefs. In other words, while the field study indicates that peer PWOM can reduce conspiracy beliefs (and encourage app adoption) among firm conspiracy believers, the experimental study finds an effect that points in the same direction but remains insignificant. We conclude that peer PWOM concerning public health apps can mitigate conspiracy beliefs among firm conspiracy believers, although this effect may depend on the volume of the peer WOM and the personal connection to the peer. In the experimental study, peer WOM was manipulated through only a single message, and the instruction to imagine that the message came from a close and trusted friend may have been insufficient to simulate a personal bond. Moreover, the discrepancy may be attributed to the studies' different empirical settings.
The field study was set in the agitated echoverse surrounding the COVID-19 pandemic, whereas
the experiment focused on a setting (i.e., health monitoring and data donation app issued by the
CDC) in which the adoption of the focal app was associated with less heated debates. Thus, in line
with our previous argument, the experimental variation in the WOM message might not have been
strong enough to impact app-specific conspiracy beliefs in a calmer setting.
6.2. Theoretical Implications
Conspiracy beliefs and the adoption of innovative public health apps. Our findings offer
novel insights into how individuals process information about innovative public health apps and
the determinants of app adoption. Prior research has uncovered factors that influence the adoption
of public health apps, such as app benefits and privacy designs (e.g., Trang et al., 2020, Walrave
et al., 2020). Our research complements these findings by showing that conspiracy beliefs—a factor
neglected in the extant research—play a crucial role in public health app adoption.
Our findings highlight how conspiracy beliefs influence public health app adoption in two
ways. First, as conspiracy beliefs imply that governments pursue secret and evil plans, individuals
who hold conspiracy beliefs tend to believe that public health apps do not perform their advertised
functions, instead being used to control the population or for some other malicious purpose. Prior
studies have identified similar effects in other areas of public health, showing that conspiracy
beliefs reduce adherence to advice about vaccination (Jolley and Douglas, 2017) or HIV treatment
(Bogart et al., 2010). However, we not only highlight similar effects for public health apps, which
are not related to medical treatment in a narrow sense, but also demonstrate the inhibitory effects
of general conspiracy beliefs. In our main study, conspiracy beliefs not related to COVID-19 or
tracing apps (but to a general belief about powerful groups operating in secrecy) inhibited app
usage, whereas previous studies (like our experimental study) analyzed the influence of conspiracy
beliefs related to specific health measures. These results highlight the dangers of a “conspiracy
mindset” (Sutton and Douglas, 2020), which likely affects not only the specific public health apps
examined in this study, but also the entire range of public health apps.
Second, a more indirect influence on public health app adoption can be ascribed to the effect
that conspiracy beliefs have on individuals’ interpretation of information concerning such apps.
Our findings outline how individuals with firm conspiracy beliefs tend to discredit expert WOM
that contradicts their conspiracy beliefs (i.e., expert PWOM on public health apps). This finding
supports the notion that conspiracy beliefs have a self-sealing quality, as “the very arguments that
give rise to them, and account for their plausibility, make it more difficult for outsiders to rebut or
even to question them” (Sunstein and Vermeule 2009, 207). Thus, when consumers believe that
public health apps play a role in a conspiracy, they also likely believe that experts are part of the
conspiracy or serve as mouthpieces of the conspirators. This finding is supported by prior studies
showing that initial conspiracy beliefs can reduce or prevent acceptance of fact-based arguments
that contradict conspiracy beliefs (Jolley and Douglas, 2017). However, our findings extend these
results, showing that this effect depends on the source of the information (i.e., whether it originates
from a peer or an expert) and that expert information contradicting conspiracy beliefs can enhance
conspiracy beliefs, thereby producing the opposite of the intended outcome.
These insights are crucial in terms of developing a deeper understanding of public health
app adoption. Conspiracy beliefs are not merely another factor that influences public health app
adoption; they also shape the processing of information about the apps. Thus, it is reasonable to
assume that conspiracy beliefs influence how consumers evaluate the app-related benefits and
privacy designs shown to be important factors in relation to public health app adoption (e.g., Trang
et al., 2020, Walrave et al., 2020). For instance, individuals who hold strong conspiracy beliefs
and, therefore, distrust government authorities are likely to be very critical of the collection of
sensitive user data (e.g., geo-locations) and to have greater privacy concerns.
How conspiracy beliefs spread and are reinforced. Aside from the previously described
influences of conspiracy beliefs on individual consumers, our findings reveal how conspiracy
beliefs can spread, exerting effects on other consumers’ public health app adoption decisions. We
show that NWOM about public health apps increases conspiracy beliefs, which not only reduces
the likelihood of app adoption, but also motivates consumers to spread more negative WOM about
the apps. Accordingly, a consumer who receives NWOM about public health apps is more likely
to spread NWOM about such apps, thereby influencing other consumers not to adopt them and, in
turn, to further disseminate the NWOM. This indicates that due to their infectious nature,
conspiracy beliefs are more dangerous to the success of public health apps than a purely individual-
focused analysis would suggest. By spreading NWOM about public health apps, a few influential
individuals can set in motion a chain of WOM that spreads conspiracy beliefs among different
groups and leads them to resist government advice to adopt public health apps.
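Purely as an illustration of this cascade mechanism (and not as part of the authors' analyses), the toy simulation below shows how NWOM seeded by a few individuals can propagate through a contact network, raising receivers' conspiracy beliefs until they too withhold adoption and forward NWOM; every parameter value is an arbitrary assumption.

```python
import numpy as np

rng = np.random.default_rng(1)
n, steps = 1000, 10
nwom_effect = 0.4   # assumed: belief increase per NWOM message received
threshold = 1.0     # assumed: belief level above which a person rejects the app and spreads NWOM

beliefs = rng.normal(0.0, 1.0, n)                                    # initial conspiracy beliefs
contacts = [rng.choice(n, size=5, replace=False) for _ in range(n)]  # each person's contacts
spreaders = set(rng.choice(n, size=10, replace=False))               # initial NWOM spreaders

for _ in range(steps):
    received = np.zeros(n)
    for sender in spreaders:
        received[contacts[sender]] += 1.0                # contacts of spreaders receive NWOM
    beliefs += nwom_effect * received                    # received NWOM raises conspiracy beliefs
    spreaders |= set(np.where(beliefs > threshold)[0])   # new believers start spreading NWOM

print(f"spreaders: {len(spreaders)}, simulated adoption rate: {np.mean(beliefs <= threshold):.2f}")
```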
In accordance with the previously described mechanism, our findings provide insights into
how conspiracy beliefs are reinforced in individuals and groups. When entire social groups share
conspiracy beliefs, individuals are likely to receive less WOM contradicting conspiracy beliefs and
more WOM supporting them. Thus, group interaction and social pressure uphold or even reinforce
conspiracy beliefs. In groups in which members show substantial increases in conspiracy beliefs
(e.g., as the result of an acute crisis), conspiracy beliefs may spiral into a self-reinforcing feedback
loop (or vicious cycle) fueled by social interaction between group members (Kraemer et al., 2020,
Sunstein and Vermeule, 2009). Accordingly, our findings indicate that social interaction that
reinforces conspiracy beliefs also contributes to the previously described self-sealing quality of
conspiracy beliefs. In other words, the social reinforcement of conspiracy beliefs makes it even
more difficult to convince individuals who identify with social groups whose members share
conspiracy beliefs that a conspiracy theory represents a false and dangerous belief.
How WOM sources and pre-existing consumer attitudes affect WOM influence. Our
findings extend innovation research on the role of WOM in adoption processes beyond the subject
of conspiracy beliefs and public health apps. Our findings suggest that simultaneously considering
the WOM source and pre-existing consumer attitudes is crucial to understanding the influence of
WOM on consumers’ adoption decisions. Prior studies that considered only WOM sender
characteristics suggest that the sender’s expertise promotes the influence of WOM on receivers
(Bansal and Voyer, 2000, Bone, 1995). However, we provide a more comprehensive perspective,
showing that expert PWOM does not encourage app adoption among individuals with high initial
conspiracy beliefs and can even have the opposite effect. A consumer’s baseline attitude at a given
time (initial conspiracy beliefs) can nullify or even reverse the effect of expert WOM, whereas
such an influence was not found in the case of peer WOM. By considering the interplay between
WOM sender characteristics (e.g., expertise, social ties) and pre-existing attitudes that can relate
to factors other than conspiracy beliefs (e.g., brand or risk attitudes), innovation research could
gain deeper insights into adoption processes. This would complement prior innovation studies
highlighting the influence that different communication channels (e.g., personal vs. virtual) have
on the impact of WOM (e.g., Kawakami and Parry, 2013, Parry et al., 2012).
6.3. Practical Implications
Marketing innovative public health apps. This study provides novel insights into factors
that determine public health app adoption, enabling us to provide valuable guidance for those
marketing these innovative apps. Our findings highlight how conspiracy beliefs can substantially
inhibit public health app adoption. Consequently, when launching novel public health apps, health
agencies should take into account the possibility that conspiracy theories could limit an app’s
diffusion. As the effectiveness of a public health app largely depends on its widespread adoption,
popular conspiracy theories could substantially limit an app’s prospects of success.
However, public health agencies can engage in targeted marketing campaigns to increase
app adoption. Marketers should analyze how widespread conspiracy theories are in specific
consumer segments and then adapt their marketing campaigns accordingly, because the interpretation of WOM regarding public health apps depends on the level of conspiracy beliefs. Consumer
segments with low levels of conspiracy beliefs could be targeted by employing expert WOM to
promote the benefits of public health apps. Prior research indicates that WOM by well-known and
reputable experts is particularly successful in influencing opinion (Bone, 1995, Jolley and Douglas,
2017). These marketing activities should help to repress emerging conspiracy beliefs and increase
public health app adoption in these segments.
In consumer segments in which conspiracy beliefs are widespread, expert WOM proves
ineffective at mitigating such beliefs and may even reinforce them. Thus, targeting these segments
with expert WOM promoting the public health app represents a waste of resources at best and a
counterproductive measure at worst. However, peer WOM supporting public health apps can
reduce conspiracy beliefs and encourage app adoption among firm conspiracy believers. Thus,
when targeting these segments, government agencies should focus on promoting and disseminating
peer PWOM. Reaching potential users with peer WOM supporting public health apps could be
achieved by providing shareable content (e.g., user experiences, appeals for societal responsibility),
integrating recommendation functionality into the apps, or targeting key influencers. Yet, health
agencies must ensure that the solicited peer WOM is credible. Moreover, they should avoid giving
the impression that the message originates from the government, as conspiracy believers may then
view it as an effort to conceal a conspiracy, which may reinforce their conspiracy beliefs.
Although peer WOM can help to reduce conspiracy beliefs and market public health apps
in groups with widespread conspiracy beliefs, it is important to recognize that this is a difficult task
for government agencies. Social interaction upholds and reinforces conspiracy beliefs in these
groups, which limits information diversity and makes it difficult to attract peer WOM that
contradicts conspiracy beliefs and promotes public health apps. Our results regarding the (lack of)
effectiveness of peer PWOM in the context of high initial conspiracy beliefs also indicate that
prevention is likely to prove substantially more effective than intervention in certain situations
(Jolley and Douglas 2017). Existing approaches such as flagging misinformation on social media
appear to be promising in this regard (Kreko, 2020), and they could be complemented by expert
WOM contradicting conspiracy beliefs as discussed above.
Implications for commercial actors. Although our conceptual development and empirical
analysis focus on public health apps and, therefore, on implications for public agencies, the findings
also have valuable implications for commercial actors. First, it must be recognized that companies
and their innovations can also become the targets of conspiracy theories. For example, a wide array
of conspiracy theories surrounds pharmaceutical companies, claiming that they conceal damages
caused by vaccinations or make up diseases to generate profits (Jolley and Douglas, 2017). In
addition, various conspiracy theories focus on technology companies, claiming, for example, that the Google search algorithm surfaced only unfavorable news about former US president Donald Trump in order to sway the electorate. Insights into conspiracy theories suggest that media products and
products addressing sensitive topics such as health or collecting sensitive user data are particularly
susceptible to conspiracy theories (Douglas et al., 2019, Uscinski and Parent, 2014). Our results
indicate that such firms need to be cautious when actively opposing conspiracy theories. Targeting
consumers who exhibit high levels of conspiracy beliefs with fact-based expert opinions in an effort
to debunk conspiracy theories is likely to prove ineffective or may even backfire by reinforcing
conspiracy beliefs and drawing the firm further into the focus of conspiracy believers. Instead,
firms should fight existing conspiracy beliefs by encouraging the dissemination of peer WOM that
contradicts such theories and preventing the emergence of new conspiracy theories by adopting
response strategies for mitigating blistering WOM firestorms (Herhausen et al., 2019).
6.4. Limitations and Future Research Directions
This study has limitations that should be taken into account; however, these limitations also offer promising
directions for future research. First, when analyzing the effects of WOM, we differentiated between
two sources: peers and experts. Yet, within these broad categories, specific WOM senders are likely
to be perceived differently, which may influence the effects of their WOM on conspiracy beliefs.
For instance, WOM from peers with whom an individual is very close (e.g., family members) is
likely to have a greater effect than WOM from more distant peers (e.g., online acquaintances)
(Brown and Reingen, 1987, Hofstetter et al., 2018). The characteristics of experts, such as their ties
to the government, could also influence the effects of expert WOM. Similarly, a WOM sender’s
network position could influence the effects of WOM on the receivers. For example, it is reasonable
to assume that opinion leaders on social media exert greater effects than individuals who occupy
less central network positions. Thus, future studies should complement our aggregated perspective
with an individual-level analysis that examines the effects of specific WOM sender characteristics.
Second, when analyzing the effects of WOM, we relied on perceived WOM (i.e., the extent
to which individuals noticed PWOM and NWOM by peers and experts) to determine how WOM
from different sources is processed by individual receivers. However, it is possible that conspiracy
beliefs not only affect how individuals interpret WOM, but also the extent to which they notice
different types of WOM. For example, individuals who hold firm conspiracy beliefs might be able
to recall WOM supporting their conspiracy beliefs better than WOM contradicting their beliefs.
Therefore, future studies should analyze whether conspiracy beliefs promote a selective perception
of WOM and, if so, how it influences public health app adoption.
Third, when testing the relationships in our model, we relied on samples of German and US
consumers, which suggests that our results hold for different cultural settings. Yet, we only looked
at consumers from two different countries and did not account for the influence of specific cultural
factors. It is likely that the central variables in our model, such as WOM activities, reactions to
WOM, and conspiracy beliefs, and the relationships between them are affected by cultural factors
(e.g., Broekhuizen et al., 2011). Thus, future studies should test our model in other cultural contexts
and explicitly analyze the influence of culture.
Finally, while we analyzed how conspiracy beliefs develop from a certain starting point,
we cannot provide insights into the factors that explain this starting point. However, such insights
are crucial to fighting conspiracy beliefs and increasing public health app adoption. Although prior
studies have identified general factors that contribute to the long-term development of conspiracy
beliefs (e.g., education, social status) (Freeman and Bentall, 2017), further research is required to
support public agencies in their efforts to reduce conspiracy beliefs and improve public health.
REFERENCES
Abrams, D. and Hogg, M.A. (1990). Social identification, self-categorization and social influence.
European Review of Social Psychology 1(1), 195-228.
Anderson, E.W. (1998). Customer satisfaction and word of mouth. Journal of Service Research
1(1), 5-17.
Ashforth, B.E. and Mael, F. (1989). Social identity theory and the organization. The Academy of
Management Review 14(1), 20-39.
Babić Rosario, A., Sotgiu, F., De Valck, K. and Bijmolt, T.H. (2016). The effect of electronic word
of mouth on sales: A meta-analytic review of platform, product, and metric factors. Journal of
Marketing Research 53(3), 297-318.
Bansal, H.S. and Voyer, P.A. (2000). Word-of-mouth processes within a services purchase decision
context. Journal of Service Research 3(2), 166-177.
Bogart, L.M., Wagner, G., Galvan, F.H. and Banks, D. (2010). Conspiracy beliefs about HIV are
related to antiretroviral treatment nonadherence among African American men with HIV.
Journal of Acquired Immune Deficiency Syndromes (1999) 53(5), 648.
Bone, P.F. (1995). Word-of-mouth effects on short-term and long-term product judgments. Journal
of Business Research 32(3), 213-223.
Broekhuizen, T.L., Delre, S.A. and Torres, A. (2011). Simulating the cinema market: How
cross-cultural differences in social influence explain box office distributions. Journal of
Product Innovation Management 28(2), 204-217.
Brown, J., Broderick, A.J. and Lee, N. (2007). Word of mouth communication within online
communities: Conceptualizing the online social network. Journal of Interactive Marketing
21(3), 2-20.
Brown, J.J. and Reingen, P.H. (1987). Social ties and word-of-mouth referral behavior. Journal of
Consumer Research 14(3), 350-362.
Budd, J., Miller, B.S., Manning, E.M., Lampos, V., Zhuang, M., Edelstein, M., Rees, G., Emery,
V.C., Stevens, M.M. and Keegan, N. (2020). Digital technologies in the public-health response
to COVID-19. Nature Medicine 26(8), 1183-1192.
CDC (2022). Centers for Disease Control and Prevention: Mobile Apps.
https://www.cdc.gov/mobile/mobileapp.html.
Davis, F.D. (1989). Perceived usefulness, perceived ease of use, and user acceptance of information
technology. MIS Quarterly 13(3), 319-340.
Deutsch, M. and Gerard, H.B. (1955). A study of normative and informational social influences
upon individual judgment. The Journal of Abnormal and Social Psychology 51(3), 629.
Douglas, K.M., Sutton, R.M. and Cichocka, A. (2017). The psychology of conspiracy theories.
Current Directions in Psychological Science 26(6), 538-542.
Douglas, K.M., Uscinski, J.E., Sutton, R.M., Cichocka, A., Nefes, T., Ang, C.S. and Deravi, F.
(2019). Understanding conspiracy theories. Political Psychology 40, 3-35.
Ebbes, P., Papies, D. and van Heerde, H.J. (2017). Dealing with endogeneity: a nontechnical guide
for marketing researchers. Handbook of Market Research, 1–37.
Enders, A.M., Uscinski, J.E., Seelig, M.I., Klofstad, C.A., Wuchty, S., Funchion, J.R., Murthi,
M.N., Premaratne, K. and Stoler, J. (2021). The relationship between social media use and
beliefs in conspiracy theories and misinformation. Political Behavior, 1-24.
Fornell, C. and Larcker, D.F. (1981). Structural equation models with unobservable variables and
measurement error: Algebra and statistics. Journal of Marketing Research 18(3), 382-388.
Freeman, D. and Bentall, R.P. (2017). The concomitants of conspiracy concerns. Social Psychiatry
and Psychiatric Epidemiology 52(5), 595-604.
Goertzel, T. (1994). Belief in conspiracy theories. Political Psychology, 731-742.
Gruner, R.L., Vomberg, A., Homburg, C. and Lukas, B.A. (2019). Supporting new product
launches with social media communication and online advertising: sales volume and profit
implications. Journal of Product Innovation Management 36(2), 172-195.
Grzesiak-Feldman, M. (2013). The effect of high-anxiety situations on conspiracy thinking.
Current Psychology 32(1), 100-118.
Herhausen, D., Ludwig, S., Grewal, D., Wulf, J. and Schoegel, M. (2019). Detecting, preventing,
and mitigating online firestorms in brand communities. Journal of Marketing 83(3), 1–21.
Hofstetter, R., Aryobsei, S. and Herrmann, A. (2018). Should you really produce what consumers
like online? Empirical evidence for reciprocal voting in open innovation contests. Journal of
Product Innovation Management 35(2), 209-229.
Hu, X., Chen, X. and Davison, R.M. (2019). Social support, source credibility, social influence,
and impulsive purchase behavior in social commerce. International Journal of Electronic
Commerce 23(3), 297-327.
Hulland, J. and Miller, J. (2018). “Keep on Turkin’?” Journal of the Academy of Marketing Science
46(5), 789-794.
Imhoff, R. and Bruder, M. (2014). Speaking (un-)truth to power: Conspiracy mentality as a
generalised political attitude. European Journal of Personality 28(1), 25-43.
Jolley, D. and Douglas, K.M. (2014). The effects of anti-vaccine conspiracy theories on vaccination
intentions. PloS One 9(2), e89177.
Jolley, D. and Douglas, K.M. (2017). Prevention is better than cure: Addressing anti-vaccine
conspiracy theories. Journal of Applied Social Psychology 47(8), 459-469.
Jolley, D., Douglas, K.M. and Sutton, R.M. (2018). Blaming a few bad apples to save a threatened
barrel: The system-justifying function of conspiracy theories. Political Psychology 39(2), 465-
478.
Jost, J.T. and Andrews, R. (2011). System justification theory. The Encyclopedia of Peace
Psychology.
Kashyap, V., Antia, K.D. and Frazier, G.L. (2012). Contracts, extracontractual incentives, and ex
post behavior in franchise channel relationships. Journal of Marketing Research 49(2), 260-
276.
Kawakami, T., Kishiya, K. and Parry, M.E. (2013). Personal word of mouth, virtual word of mouth,
and innovation use. Journal of Product Innovation Management 30(1), 17-30.
Kawakami, T. and Parry, M.E. (2013). The impact of word of mouth sources on the perceived
usefulness of an innovation. Journal of Product Innovation Management 30(6), 1112-1127.
Keh, H.T. and Sun, J. (2018). The differential effects of online peer review and expert review on
service evaluations: the roles of confidence and information convergence. Journal of Service
Research 21(4), 474-489.
Kraemer, T., Weiger, W., Gouthier, M. and Hammerschmidt, M. (2020). Toward a theory of
spirals: The dynamic relationship between organizational pride and customer-oriented
behavior. Journal of the Academy of Marketing Science 48(6), 1095-1115.
Kreko, P. (2020). Countering conspiracy theories and misinformation. Routledge Handbook of
Conspiracy Theories, 242-256.
Kuan, K.K., Zhong, Y. and Chau, P.Y. (2014). Informational and normative social influence in
group-buying: Evidence from self-reported and EEG data. Journal of Management
Information Systems 30(4), 151-178.
Lin, T.M. and Fang, C.-H. (2006). The effects of perceived risk on the word-of-mouth
communication dyad. Social Behavior and Personality: An International Journal 34(10),
1207-1216.
Maxham, J.G. and Netemeyer, R.G. (2002). A longitudinal study of complaining customers’
evaluations of multiple service failures and recovery efforts. Journal of Marketing 66(4), 57-
71.
Orosz, G., Krekó, P., Paskuj, B., Tóth-Király, I., Bőthe, B. and Roland-Lévy, C. (2016). Changing
conspiracy beliefs through rationality and ridiculing. Frontiers in Psychology 7, 1525.
Parry, M.E., Kawakami, T. and Kishiya, K. (2012). The effect of personal and virtual
word-of-mouth on technology acceptance. Journal of Product Innovation Management 29(6),
952-966.
Peer, E., Brandimarte, L., Samat, S. and Acquisti, A. (2017). Beyond the Turk: Alternative
platforms for crowdsourcing behavioral research. Journal of Experimental Social Psychology
70, 153-163.
Ram, S. and Sheth, J.N. (1989). Consumer resistance to innovations: the marketing problem and
its solutions. Journal of Consumer Marketing 6(2), 5-14.
Seto, E., Challa, P. and Ware, P. (2021). Adoption of COVID-19 contact tracing apps: A balance
between privacy and effectiveness. Journal of Medical Internet Research 23(3).
Sine, W.D., Shane, S. and Gregorio, D.D. (2003). The halo effect and technology licensing: The
influence of institutional prestige on the licensing of university inventions. Management
Science 49(4), 478-496.
Sunstein, C.R. and Vermeule, A. (2009). Conspiracy theories: Causes and cures. Journal of
Political Philosophy 17(2), 202-227.
Sutton, R.M. and Douglas, K.M. (2020). Conspiracy theories and the conspiracy mindset:
Implications for political ideology. Current Opinion in Behavioral Sciences 34, 118-122.
Trang, S., Trenz, M., Weiger, W.H., Tarafdar, M. and Cheung, C.M. (2020). One app to trace them
all? Examining app specifications for mass acceptance of contact-tracing apps. European
Journal of Information Systems 29(4), 415-428.
Trenz, M., Huntgeburth, J. and Veit, D. (2018). Uncertainty in cloud service relationships:
Uncovering the differential effect of three social influence processes on potential and current
users. Information & Management 55(8), 971-983.
Uscinski, J.E. and Parent, J.M. (2014). American conspiracy theories. Oxford University Press.
van Prooijen, J.-W. and Van Vugt, M. (2018). Conspiracy theories: Evolved functions and
psychological mechanisms. Perspectives on Psychological Science 13(6), 770-788.
Venkatesh, V., Morris, M.G., Davis, G.B. and Davis, F.D. (2003). User acceptance of information
technology: Toward a unified view. MIS Quarterly 27(3), 425-478.
Voorhees, C.M., Brady, M.K., Calantone, R. and Ramirez, E. (2016). Discriminant validity testing
in marketing: an analysis, causes for concern, and proposed remedies. Journal of the Academy
of Marketing Science 44(1), 119-134.
Walrave, M., Waeterloos, C. and Ponnet, K. (2020). Ready or not for contact tracing? Investigating
the adoption intention of COVID-19 contact-tracing technology using an extended unified
theory of acceptance and use of technology model. Cyberpsychology, Behavior, and Social
Networking 24(6), 377-383.
FIGURES AND TABLES
Figure 1 Overview of the conceptual model.
Figure 2 Analysis of the interaction effects, field study.
Panel A: Marginal effect of PPWOM change on conspiracy beliefs change for different levels of initial conspiracy beliefs (x-axis: initial conspiracy beliefs).
Panel B: Marginal effect of EPWOM change on conspiracy beliefs change for different levels of initial conspiracy beliefs (x-axis: initial conspiracy beliefs).
Figure 3 Analysis of the interaction effects, experimental study.
The marginal effect of EPWOM on app-specific conspiracy beliefs for different levels of general conspiracy beliefs (x-axis: general conspiracy beliefs).
Table 1 Results of field study

Conspiracy beliefs change (t1–t3) model
Variable                                          Coef.a       SEa
Constant                                           .555        .498
WOM change effects
  PNWOM change (t1–t3)                             .133 **     .050
  PPWOM change (t1–t3)                             .033        .059
  ENWOM change (t1–t3)                             .267 ***    .060
  EPWOM change (t1–t3)                             .017        .061
Interactions with initial conspiracy beliefs
  PNWOM change × conspiracy beliefs (t1)           .061        .044
  PPWOM change × conspiracy beliefs (t1)          −.087 *      .042
  ENWOM change × conspiracy beliefs (t1)           .034        .048
  EPWOM change × conspiracy beliefs (t1)           .084 *      .041
Controls
  Conspiracy beliefs (t1)                          .300 ***    .053
  PNWOM (t1)                                       .016        .039
  PPWOM (t1)                                       .029        .040
  ENWOM (t1)                                      −.106 *      .043
  EPWOM (t1)                                       .071        .044
  Age                                              .019        .019
  Female                                           .153        .089
  Academics                                        .087        .099
  Inverse Mills ratio                              .588        .631
  PNWOM change copula term                         .000        .043
  PPWOM change copula term                         .013        .039
  ENWOM change copula term                         .041        .046
  EPWOM change copula term                         .057        .045
R²                                                 .274

App adoption (t3) and WOM valence change models
                                           App adoption (t3)      WOM valence change
Variable                                    Coef.        SE        Coef.        SE
Constant                                   −2.731       1.613       .303        .167
Conspiracy belief change effect
  Conspiracy beliefs change (t1–t3)        −3.704 *     1.766      −.518 *      .260
Technology acceptance controls
  Perceived ease of use                      .250        .142       .071 ***    .017
  Perceived usefulness                       .247 *      .099       .066 ***    .016
  Subjective norms                           .448 ***    .109       .061 ***    .014
Other controls
  Conspiracy beliefs (t1)                    .332 **     .111       .080 ***    .013
  PNWOM (t3)                                 .020        .135
  PPWOM (t3)                                 .101        .107
  ENWOM (t3)                                 .075        .127
  EPWOM (t3)                                 .026        .110
  WOM valence (t1)                                                  .144 ***    .016
  Age                                        .003        .062       .010        .008
  Female                                     .280        .277       .045        .037
  Academics                                  .407        .302       .016        .041
  Inverse Mills ratio                       2.097       2.095       .653 **     .249
R²                                           .244                   .310

Notes. N = 347. All coefficients are unstandardized. The highest variance inflation factor is 2.63, which
is within the acceptable range (O’Brien 2007). a Coefficients and standard errors of the conspiracy beliefs change model are multiplied by 10 for better interpretability.
† p < .10, * p < .05, ** p < .01, *** p < .001.
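Table 1 lists an inverse Mills ratio and Gaussian copula terms among the controls. The paper's exact construction of these terms is not reproduced here; as a hedged sketch, the helpers below show the standard way such terms are built, following Ebbes et al. (2017) for the copula correction and a first-stage selection probit for the Mills ratio. Function and variable names are hypothetical.

```python
import numpy as np
from scipy import stats

def gaussian_copula_term(x):
    """Gaussian copula control term for a potentially endogenous regressor:
    the inverse standard normal CDF applied to the regressor's empirical CDF,
    with ranks scaled by (n + 1) so the values stay strictly inside (0, 1)."""
    x = np.asarray(x, dtype=float)
    ecdf = stats.rankdata(x) / (len(x) + 1)
    return stats.norm.ppf(ecdf)

def inverse_mills_ratio(selection_index):
    """Inverse Mills ratio phi(z) / Phi(z), evaluated at the linear predictor
    (fitted index) of a first-stage selection probit."""
    z = np.asarray(selection_index, dtype=float)
    return stats.norm.pdf(z) / stats.norm.cdf(z)
```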
Table 2 Study comparison

Setting
  Field study: COVID-19 tracing app
  Experimental validation study: Health monitoring and data donation app
Study type
  Field study: Three-wave field study (establishing external validity)
  Experimental validation study: Two-wave scenario experiment (establishing internal validity)
WOM measures
  Field study: Perceptions (surveyed)
  Experimental validation study: Manipulated treatments (scenario-based)
Conspiracy beliefs measure
  Field study: DV: change scores for conspiracy beliefs from t1–t3 estimated via mixed-effects growth-curve modeling; MV: initial conspiracy beliefs at t1
  Experimental validation study: DV: app-specific conspiracy beliefs at t2; MV: initial conspiracy beliefs at t1
Behavioral consequences measures
  Field study: App adoption: actual installation decision (self-reported); WOM valence (surveyed)
  Experimental validation study: App adoption: installation intention (surveyed); WOM valence (surveyed)
Primary goal
  Field study: Examine the overall framework
  Experimental validation study: Validate the findings from the field
Most important finding(s)
  Field study: Change in conspiracy beliefs negatively affects public health app adoption and WOM valence; change in peer and expert NWOM positively affects change in conspiracy beliefs; when initial conspiracy beliefs are low, an increase in expert PWOM causes a decline in conspiracy beliefs
  Experimental validation study: Replication of hypothesized field study findings in a controlled setting; when initial conspiracy beliefs are low, an increase in expert PWOM causes a decline in app-specific conspiracy beliefs; when initial conspiracy beliefs are high, an increase in expert PWOM causes an increase in app-specific conspiracy beliefs

Notes. WOM = word of mouth, NWOM = negative word of mouth, PWOM = positive word of mouth, DV = dependent variable, MV = moderating variable.
i We excluded 92 participants who participated in the first wave of the survey, but not in the second wave.