American Behavioral Scientist
© 2020 SAGE Publications

Pseudo-Information, Media, Publics, and the Failing Marketplace of Ideas: Theory
Jeong-Nam Kim1,2 and Homero Gil de Zúñiga3,4,5
Abstract
The explosive usage in recent years of the terms “fake news” and “posttruth”
reflects worldwide frustration and concern about rampant social problems created
by pseudo-information. Our digital networked society and newly emerging media
platforms foster public misunderstanding of social affairs, which affects almost all
aspects of individual life. The cost of lay citizens’ misunderstandings or crippled
lay informatics can be high. Pseudo-information is responsible for deficient social
systems and institutional malfunction. We thus ask questions and collect knowledge
about the life of pseudo-information and the cognitive and communicative modus
operandi of lay publics, as well as how to solve the problem of pseudo-information
through understanding the changing media environment in this “truth-be-damned”
era of information crisis.
Keywords
disinformation, fake news, information crisis, lay informatics, misinformation, pseudo-information, publics, situational theory of problem solving, social media
In the story of the Tower of Babel, all people spoke the same language. Their civi-
lization became capable of building a tower to reach heaven. Seeing the tower but
also the human arrogance that built it, God cursed the people by splitting their
1The University of Oklahoma, Norman, OK, USA
2Debiasing and Lay Informatics (DaLI) Lab, Norman, OK, USA
3University of Salamanca, Salamanca, Spain
4The Pennsylvania State University, State College, PA, USA
5Diego Portales University, Santiago, Chile
Corresponding Author:
Jeong-Nam Kim, Gaylord College of Journalism and Mass Communication, University of Oklahoma, Norman, OK 73019, USA.
common language into many. He confounded communication, and we were no
longer able to understand each other. We fell into incommunicado and were scat-
tered physically as well.
In our digital networked society, information and communications technologies
(ICTs) and network technologies connect almost all the dots for communicators in
much the same way as the Tower of Babel. This seemingly equalizes all people and
levels the playing field for lay publics, as opposed to experts or elites. Like the Tower,
these new communication and network technologies elevate our capacity to control
and exchange knowledge. But then we came upon a curse again: another sort of incommunicado, an indiscernibility of fact and fiction in the sea of information. Information
is often obscured and leads us to erroneous conceptions of the world and events
(Bimber & Gil de Zúñiga, 2020; Kim, 2018). We are now cognitively lost in an informational paradise and left to scatter virtually.
This special issue is about problems of pseudo-information, media, and publics in a
digitalized network society.1 Oxford Dictionaries selected “posttruth” as its word of the
year in 2016 (Flood, 2016), and since then, the explosive usage of the terms “fake
news” and “posttruth” has reflected worldwide frustration and concerns about social
problems caused by misinformation and disinformation. In today’s digital networked
society and media environment, it is all too easy to confuse opinion, fiction, and
incomplete or inaccurate ideas with established historical or scientific canonical facts.
The costs of pseudo-information and public confusion are high. Pseudo-information in
particular is responsible for deficient social systems and institutional malfunction due
to distrust. We thus aim to ask questions and collect knowledge about pseudo-informa-
tion and the crippled lay informatics of its users in order to understand the information
crisis that we live in.
Information and Pseudo-Information
We refer to information as bits of evaluated knowledge and data available and shared
among communicative actors in a problematic situation (Kim & Grunig, 2011).
“Misinformation” (2020) is defined by Merriam-Webster as “incorrect or misleading
information,” whereas “disinformation” (2020) is defined as “false information delib-
erately and often covertly spread (as by the planting of rumors) in order to influence
public opinion or obscure the truth.” The term misinformation has been used in a more generic and inclusive sense to describe all false and inaccurate information at its origin. However, in the recent information crisis, the usage of misinformation has become taxonomically narrower, denoting incorrect information that is accidentally or unintentionally false.
Misinformation and disinformation are similar in that both refer to information
lacking in veracity. But they are different in terms of the purposefulness of their
development and sharing. Misinformation refers to an accidental lack of veracity;
disinformation is deliberately false or inaccurate to serve its creator’s interest. The
evolving usage of the term misinformation reflects the changes to communicative
environments and the frequent problems created by false information over emerging
media and communication networks. In this special issue, we suggest an umbrella
term, pseudo-information, to include all types of false or inaccurate information, as well
as the trafficking of it in various ways and fields. We acknowledge that pseudo-infor-
mation is still a kind of “information,” despite the likely harm or problematic conse-
quences to its bearers (Figure 1).
Pseudo-information is not a counterconcept to information. Rather, it is still under the umbrella of “information,” but it denotes information that causes harmful consequences or social externalities for information subscribers (Table 1). Information in
itself can vary from accurate to inaccurate, beneficial to depriving, factual to fictional,
and realistic to illusory. The interaction between information and its subscribers gener-
ates unique interpretations and utility in a given individual’s life situation. Thus, infor-
mation and its related phenomena should be understood in terms of the individuality
and subjectivity of its creators, traffickers, and users—the information actors—and
their contextual environments and life situations. Therefore, the meaning, value, util-
ity, and benefit or harm of given information are subject to intersubjectivity and con-
structed contextually with that information’s presence in information actors’ personal
and social time-space. In fact, information is the joint product of interactions among
people, environments, and situations.
This notion is revealed in many examples. For instance, a lie intended to comfort
an anxious person is false. As it is delivered to the person, however, the bits of evalu-
ated knowledge or data could generate a utility, such as calming down. For an ill
patient and their family, wishful conversation about the outcome of treatment is illu-
sory, but could generate a substantial coping effect for the patient. Wartime disinfor-
mation tactics by Russia generated gains for its crafters and losses among Nazi
commanders. More dangerously, rumormongering of conspiracy theories related to the coronavirus or linking 5G network towers with the spread of COVID-19 sparked arson attacks (Cerulus, 2020) and bolstered stigma and violence against Asian neighbors
(Do, 2020). Such troubling states result from the interactions of information actors
(creator, trafficker, or subscriber) and their situated conditions.
Figure 1. Information classification.
A Failing Digital Marketplace of Ideas and Lay Informatics
In organizing our collection, we refer to the information environment using the analogic term “marketplace.” Our conception inherits the analogy of the “marketplace of ideas” in trading information. Originally rooted in John Stuart Mill’s 1859 book On Liberty, this famous analogy to the economic marketplace has provided the foundation
of legal philosophy regarding freedom of speech. It is Mill’s analogy around which
First Amendment jurisprudence spins (van Mill, 2018). It and its underlying assump-
tions have helped society secure and advocate for the human right of uncensored and
almost unlimited freedom of expression for individuals.
In a digital society, our conception of the marketplace of ideas has evolved and
expanded into virtual space. Accordingly, the social nature of expressing and trading
ideas has quantitatively and qualitatively changed (Gil de Zúñiga, 2015; Grunig,
2009). There are too many traders of information, and they are hyper-lively when it
comes to social problems and world affairs. The explosive rise and fall of local marketplaces have increased participants’ choices and control over market behaviors, and
the participating traders crisscross many marketplaces concurrently using multiple
identities. With anonymity and the burying effect of identity due to innumerable par-
ticipants, people are less afraid to express their ideas.
This so-called fourth industrial revolution fused the physical, biological, and
digital worlds. All things are connected, and those connections enable participants to
increase their communicative behaviors in digital information markets (Kim et al.,
2010; Kim et al., 2018). Much of this revolution is indebted to the explosion of the
Table 1. Pseudo-Information: Types and Its Consequences to Information Subscribers.

Severity of consequences (for example, imminent risk to the safety, well-being, or interest of information subscribers):

High:
Harmful without serving crafter’s interest: could evoke states such as joy, terror, anger, rage, and loathing for information subscribers.
Harmful and serving crafter’s interest: could evoke states such as joy, terror, anger, rage, and loathing for information subscribers.

Low:
Unharmful without serving crafter’s interest: could evoke states such as pleasure, discomfort, fear, disgust, and contempt for information subscribers.
Unharmful but serving crafter’s interest: could evoke states such as pleasure, discomfort, fear, disgust, and contempt for information subscribers.
digital marketplace of ideas, which enabled the era of big data. Lay people and the
contents created and exchanged by their communicative actions generated an independent media economy and the blooming byproduct industries of data analytics.
In the predigital marketplace of ideas, the traditional traders of information, such as
social institutions (e.g., politicians or mass media) and intellectual authorities (e.g.,
scientists), commanded louder voices and exercised greater influence on the general
populace (Kim, 2014). Hence, there was a need for protective intervention for lay
individuals to preserve their freedom to speak out. The marketplace of ideas evolved
on the rationale of harnessing control from the elite and eliminating pressure on lay
citizens and publics, such as censorship from formal and powerful social institutions.
Yet in the digital society, the sheer number of expressive lay individuals, as
opposed to social elites or experts, and the amount of their ideas has created an
overtrading problem of pseudo-information. Frequently, Mill’s assumption of the
economic market—that superior products will survive over inferior ones by compe-
tition—fails (Stanley, 2018). Truth prevails less often than expected. Various alter-
native ideas challenge and compete with the traditional authority of institutional
participants such as governments, universities, or scientists. A significant number
of lay participants in the digital information markets adhere to troubling claims and
fragmented facts. In some extreme cases, traditional actors with endowed source
credibility are ousted or overwhelmed.
The failure of the digital marketplace of ideas could be defined as the ability of
invalid or uncertain ideas—pseudo-information—to spread. Still the failure is partial
and local. In marketplaces sustained with scientific epistemology, such as research
fora or governmental institutions, the exchange of information is relatively secure and
advanced by the evaluative process of competing merits. There, open debates take
place and scientific epistemology guides knowledge creation through systematic pro-
cedures for scrutinizing evidence and claims for collective interpretation. What really
distinguishes scientific epistemics from lay epistemics or lay informatics is metacognition and metacommunication (Kim & Grunig, 2019).
Scientists, and those delegated to make policy and decisions, are in the habit of thinking about their thinking and communicating their methods of information use. In contrast,
the greater number of lay information traders rely on lay informatics—subjective or
arbitrary processes of ideation and evaluation (Kim, 2014; Kim & Krishna, 2014)—
and enjoy self-exemption from metacognition and metacommunication in the use of
information regarding their claims (Kim & Grunig, 2019). Thus, pseudo-information
prevails, and incomplete or inadequate interpretations become predominant.
First and Secondary Information Markets in Digital
Society and the Problems
Given the evolution of media and ICT and the devolving nature of lay publics’ use of
information, we distinguish two layers in the digital marketplace of ideas: the first and
secondary information markets. The first information market refers to the physical and
virtual time-spaces wherein traders exchange ideas crafted by formal social institutions
such as governments, media, and science organizations or organized social interests
(e.g., corporations or interest groups). The market selection of ideas in the first infor-
mation market is guided by scientific epistemics, despite occasional imperfections,
and subject to the systematic procedures of counterevidence-based, refutation-focused,
entirety-focused, and collective interpretation of merits, as well as tentativeness of
superiority. It is imperfect, yet vying for Mill’s ideal state of market competition.
In contrast, the secondary information market refers to those physical and virtual
places wherein individuals come together to trade ideas (contents) crafted by nonexpert lay citizens, either individually or collectively. Market selection of ideas in the secondary information market is frequently informal: only loosely evidence-based, confirmatory-focused, and individualistic or factional in its interpretation of merits and its conclusions about superiority. Here, in the secondary marketplace of ideas, the process of competition is vibrant, with a lack of censorship, but blurs the merits and flaws
of each claimed idea. This different interpretation of the merits of claims and the sub-
sequent selection of ideas could be multifarious through lay individuality, and often
reflects fractions of the general populace and their subjective preferences.
The failure of the first market is in the rapid takeover of traditional information
institutions by ICT and networking (Figure 2), and in the market’s subsumption (cf.
capitalist subsumption of labor and economic process, Karl Marx, 1867) to the
emerging secondary marketplace. There, individual citizens can participate in trades
across networked digital marketplaces. Traditional information producers and trad-
ers (e.g., legacy media) are drowned out by the billions of information traffickers
and millions of local clusters crowded by nonexpert lay people. Social systems of
delegation to experts are sometimes dethroned, and the source credibility of traditional information traders can be overthrown by the networked myriad of individual citizens.
In contrast, the secondary information market fails due to crippled lay informatics.
Newly endowed with the power of communication and connection, people could not keep pace cognitively or communicatively. The amount of available knowledge or data
only becomes information through evaluative tasks by the communicator (Kim &
Krishna, 2014). The quality and quantity of evaluative products—that is, information—
is largely defined by one’s cognitive and communicative actions (e.g., justificatory
information forefending; Kim et al., 2018). The modus operandi of the lay individual is easily obfuscated, unless extraordinary care is taken to conduct thoughtful judging
(cognitive retrogression in problem solving; Kim & Grunig, 2019).
Defining Questions for the Information Crisis
Two Types of Knowledge on Pseudo-Information, Publics, and Media
To understand and work for an eventual resolution of those market failures, we aim
to begin scholarly discussions and assemble theoretical bases to describe the state of
and sources of the problems surrounding this information crisis. In particular, in this
introductory theoretical piece of the special issue, we seek to elaborate on two types
of knowledge: “descriptive knowledge,” to acquire understanding of the “process” of
the phenomenon, and “prescriptive knowledge,” to acquire understanding of and
guidance for what “procedures” we would take for problem resolution (Carter, 1972;
Grunig, 2003). Furthermore, we examine the two responsible actors: the media and the publics who traffic and enact pseudo-information in their daily life and social actions.
Specifically, we seek systematic inquiries and answers for the descriptive (“what
is”) questions, to capture and diagnose the troubling phenomena of pseudo-informa-
tion. In doing so, we pursue a better understanding of causes, processes, and conse-
quences of misinformation and disinformation, as well as the roles, mechanisms, and
motives of agents and actors who craft and traffic it. Following this rationale, we also
seek prescriptive knowledge (“what should be done”) relating to solutions to improve
the social problems arising from pseudo-information. Better procedures can only be
constructed when we first know “what is” as well as how and why it happens. Laying the groundwork for a better understanding of these two types of knowledge is one of the aims of the researchers in this special issue.
This special issue will not provide answers to all lingering theoretical issues and questions with regard to pseudo-information. Yet it will prove useful in identifying key questions and opening fora for theoretical and practical solutions. To that
end, here we propose some central research questions:
Descriptive Knowledge of Pseudo-Information, Media, and Publics
What are the nature and impacts of pseudo-information in the digital networked society? What actors and factors contribute to the troubling consequences of pseudo-information?

Figure 2. Marketplace of ideas and changes: Market subsumption to market failure.
What are the underlying processes or mechanisms associated with the accepting
and sharing of misinformation and disinformation with others? How and why
do lay publics become receptive to pseudo-information, both in terms of sub-
scribing to it and spreading it among their social networks?
What are the functions or values of pseudo-information for its crafters, traf-
fickers, and adopters? What are the incentives for carrying and spreading
pseudo-information for different actors (e.g., campaigners, rivals, opinion leaders)?
What public and media factors increase and decrease gullibility toward pseudo-
information among lay citizens?
What are the hardware aspects (e.g., ICT systems, network policy, mass media, online platforms) and the software aspects, such as social conditions (e.g., network traits, sociodemographic attributes), political conditions (e.g., political climates, ideological dominance, trust), and personal psychological conditions (e.g., lay epistemics, sociocognitive traits), that are conducive to the crafting, trafficking, and subscribing to pseudo-information?
Prescriptive Knowledge of Pseudo-information, Media, and Publics
What are the preventive and promotive factors of improved use of information
(e.g., news literacy, digital information literacy)?
How can we equip individual citizens to depreciate pseudo-information such as
lies and fake news? What can we do to increase immunity to pseudo-informa-
tion among lay audiences (e.g., inoculation strategy)? What countermeasures
can we employ to enhance information literacy among individual opinion lead-
ers and influential social media users (e.g., power bloggers)?
What are the roles and boundaries of government intervention to resolve the
problems of information crises?
What will be the role of marketers of information such as Google, Facebook, Twitter, or traditional media, and how should “information patrolling” be approached in the marketplace of ideas?
What are the ethical considerations and limits of “information policing” in the
marketplace of ideas?
What strategies should traditional and social media use to restore credibility in
the fight against pseudo-information?
Theoretical Essays in the Current Volume. In the pursuit of cutting-edge theoretical
accounts that would move pseudo-information research forward, we purposively
edited a handful of novel, thought-provoking theoretical pieces. These essays encapsulate many of the pressing issues revolving around misinformation and disinformation today, from a cross-disciplinary angle.
Molina et al. (2019) kick off the special issue, underscoring how difficult it is for
computer-aided and automated machine learning mechanisms to detect pseudo-infor-
mation online. Drawing on a concept explication paradigm (Chaffee, 1991), their goal
is to provide scholars and policy makers alike with a theoretical foundation of what
may be considered fake news. In doing so, the article introduces an eight-dimension
typology to single out fake news content (e.g., false news, polarized content, satire) by contrasting it with factual, legitimate news at four levels: message, source,
structure, and network.
For instance, for real news to be detected at the message and linguistic level, read-
ers should pay attention to proofed, fact-checked, and impartial reporting that will
attribute sources with names. The content may also include evidence or scientifically grounded content and reports about research-based information and statistical data.
Source and information (message) intention also matter. Are the sources of information verified? Do they include heterogeneous, balanced, and fair information sources
and quotations? Factual professional news should also include a certain degree of “source pedigree” from the organization behind the information, and transparency regarding from whom specifically the reporting or “news” is coming may also matter.
Likewise, the article argues that structural and network characteristics could poten-
tially give away symptoms of what the authors refer to as features of real news. Data
such as whether the information derives from a reputed URL, includes metadata with
authentic indicators, has clear contact or about sections, and shares active credible
e-mail addresses where content creators may be reached, are all valuable and practical
mechanisms to more accurately spot factual news. All in all, the essay provides a very
useful route log for academics, policy makers, and citizens to better detect misinfor-
mation in today’s digital world.
Kim and Grunig’s (2019) essay takes a much broader scope when dealing with
information behavior and human problem solving. Based on the situational theory of
problem solving (Kim & Grunig, 2011; Kim & Krishna, 2014), the authors explicate how information is distinct from data and knowledge and how cognitive problem
solving occurs in problematic life situations. Kim and Grunig conceptualize “cogni-
tive retrogression in problem solving” to account for lay individuals’ crippled informatics—the less-than-ideal communicative and cognitive modus operandi that lay
individuals use in everyday life.
Initially, the essay clarifies why and how information is indeed beneficial to society
at large, particularly when accurate and factual information serves as a sine qua non
condition for citizens and lay publics to understand complex issues and solve their
daily problems. Under this condition, the more information, the better. Paradoxically,
in today’s world where ordinary citizens encounter practically unlimited amounts of
information at their disposal, the axiom that more information is better may be at odds
with the most efficient way to interpret complex issues and solve problems. To accom-
modate and examine this disparity, the authors introduce two cognitive processes that
shed light on individuals’ means of dealing with vast amounts of information to
unravel complex issues and solve daily informational quandaries. On the one hand,
cognitive retrogression takes place when individuals attempt to solve problems swiftly,
reach a rapid conclusion, and engage after the fact in a cognitive effort only once a
conclusion is already grasped. On the other hand, cognitive progression represents a
reasoning strategy that encompasses greater mental involvement and processing
before a conclusion is reached. The former strategy helps individuals expedite problem-solving situations, reaching a decision based on limited or insufficient evidence.
Conversely, the mental processes of cognitive progression contribute to solving any
daily problem or quandary in the most optimal informational context.
Kim and Grunig then explain cognitive arrest in problem solving: an epistemic endeavor made retrogressively by a problem solver that only heaps up self-warranting evidence and leads to growing epistemic inertia. This theorization accounts for how
and why lay publics are entrapped and paralyzed by justificatory information and
become susceptible to pseudo-information. Finally, they apply this theoretical account to conspiratorial thinking as an exemplary case of public closed-mindedness.
Further building on the characteristics of misinformation and fake news, Chiu and
Oh (2020) propose a creative theoretical distinction between what may constitute fake
news and what could be identified as a personal lie. This distinction is not trivial. As
the authors argue, social media moguls can easily spot fake news based on this char-
acteristic distinction, and thus seamlessly eliminate fake information from their plat-
forms. Although both personal lies and fake news in social media contexts seek to
deceive the audience or the public, they do so in different ways. The key element lies
in the ultimate goal or purpose. Fake news craves attention and seeks to overwhelm-
ingly capture and spark action on the part of the audience. On the contrary, personal
lies pursue unnoticeable means of misinformation dissemination to maintain social
ties and primarily elicit audience inaction.
More specifically, drawing on many distinct theories, the article reveals six dimen-
sions on which fake news and personal lies differ: speaker–audience relationship,
goal, emotion, information, number of participants, and citation of sources. Usually,
when it comes to personal lies, the personal relationship between the speaker and the
audience is a close one, whereas in the case of fake news, this personal connection is
more distant. The main purpose of a personal lie is a defensive one, to achieve audi-
ence inaction. Yet when fake news is generated, the main goal is the opposite. The
more impact and greater audience engagement with the content, the more successful
the fake news will be. Based on appraisal theory, the level of emotional involvement
in the context of personal lies is low, as the speaker seeks somewhat to achieve emo-
tional distraction from the audience. For fake news, emotions play a much more relevant role, as the information usually includes massive doses of emotional manipulation.
In terms of the information dimension, the authors explain that personal lies tend to
be quite mundane and include vague or little information. Fake news, on the other
hand, will contain vivid and detailed information to persuade the audience. Finally,
personal lies will typically make use of fewer sources and a lower number of partici-
pants, while fake news will attempt to capture the largest number of participants pos-
sible and will embrace the use of more sources.
Lee and Shin (2019) delve into some of the most germane factors to understand
why misinformation is disseminated online. Borrowing from strands of political per-
suasion literature, as well as from works on the credibility of online information and
digital deception, the article focuses on pragmatic factors that cause ordinary citizens
to take the information they are exposed to through social and digital media as truthful.
They highlight traits and attributes of the sources of information, the message, the channel, and the receiver. For instance, the number of sources used, the
perceived expertise of the sources, and how similar the source appears to be with
respect to the information receiver, are all important persuasive instruments for fake
news to thrive. Similarly, when a message is congruent with the receiver’s views, is
presented in a repetitive and easy-to-understand manner, and is based on information
that may be perceived as factual, the audience will be more likely to take the message
as believable. Finally, some features about the channel through which the misinforma-
tion is spread are also observed.
Building on prior theoretical accounts included in this volume, Weeks and Gil de
Zúñiga (2019) offer six major areas for scholars to hone their efforts when dealing
with pseudo-information. According to the authors, this list is not exhaustive, and many other observations may be equally worth pursuing. However, in order to
foster and encourage an interdisciplinary literature on pseudo-information, academ-
ics should also focus on discovering how misleading pseudo-information permeates
through society, whether it matters for society, and what its level of influence is over
regular citizens. In other words, where and how do people get exposed to pseudo-
information, and when people are exposed to misleading or bad information, what
kind of effects does it have? For instance, it stands to reason that today, a large portion
of this information and the extent to which individuals believe it, greatly depends on
individuals’ social media networks. Having a deeper understanding of the composi-
tion of these networks and how they work would be helpful to better understand this
phenomenon. Another area of improvement revolves around pseudo-information dis-
tribution. Politicians, political elites, opinion leaders, and government officials are all
part of an information conglomerate prone to creating or disseminating falsehoods.
Studies seldom pay attention to these influential figures and their relationships with
misinformation. The essay concludes with suggestions geared toward improving
today’s modern informational ecosystem, and toward ingraining healthier democratic habits in citizens. For instance, people’s emotions and how they may influence levels of false information acceptance should be analyzed. Furthermore, studies need to help identify more efficient tools and
mechanisms to confront false information, because solely providing corrective mes-
sages that simply highlight factual or contextual information may not be enough to
facilitate accurate beliefs.
Ha et al. (2019) seek to integrate and describe the growing importance of an aca-
demic body of research on fake news and misinformation. Relying on articles indexed in Google Scholar in the past 10 years with the keywords “fake news” and
“misinformation,” the authors construct a pragmatic explorative coding scheme to
showcase the current research state across disciplines. Most of the studies come
from Communication or Psychology journals, with Journal of Communication and
Memory & Cognition leading the trend. Furthermore, the studies published tend to
be either quantitative or conceptual, with a marginal proportion of studies (a little
over 20%) using either qualitative or mixed-method approaches. The article also
provides many other benchmarks to better understand how research around these
issues continues to evolve.
With a similar goal in mind, Krishna and Thompson (2019) attempt to capture how
salient misinformation research has become with regard to health-related topics.
Within this framework, the authors specifically review studies dealing with this sub-
ject in Health Communication and the Journal of Health Communication. From this
work, we learn that one of the earliest strands of the literature dealt with misinforma-
tion and medication issues, particularly the disconnect between what some studies
find and how the media report or frame the main results (e.g., the misuse of aspirin
among heart disease patients). Other topics of interest underscored in the study include
food and nutrition as a battleground where misleading information thrived; misinfor-
mation about an array of cancer types, treatments, and pseudo-cures; dissemination of
information on wrongfully attributed or exaggerated epidemics such as Ebola; and
false information revolving around vaccines and autism or vaping e-cigarettes. All in all, the
authors depict a realistic if somewhat gloomy vision of the pervasive pseudo-informa-
tion epidemic in which health science is currently embroiled.
Drawing on an innovative strategic amplification paradigm, Donovan and Boyd
(2019) establish a suggestive guideline to better address what, in their eyes, is an
endemic journalistic problem: strategic silence. According to the authors, the spe-
cific gatekeeping rules and reporting strategies enacted by media corporations and
journalists when reporting certain issues may encapsulate an evolution in disinfor-
mation and misinformation. Specifically, the authors visibly expand this notion with
regards to the ways in which news on violence and suicide have historically been
reported in the United States. The essay sustains and clarifies strategies for profes-
sionals and the media in general to abide by greater levels of deontological responsi-
bility, which in turn may facilitate and cultivate a stronger, healthier, and more
egalitarian informed public opinion.
How does health misinformation affect the politics of a general election campaign?
How are the media dealing with these politically divisive and polarizing topics? Lovari
et al. (2020) center their article on the vaccination debate in online media and politics
in Italy. Relying on a mixed-method approach that combines social media trace data
with in-depth interviews with community experts, the authors shed light on the influ-
ence of the “vaccine information crisis” political agenda during the 2018 Italian gen-
eral elections. Overall, the study represents a persuasive account of the opportunities
and interventions to be followed by policy makers to establish better information dis-
semination patterns among citizens. Their study highlights potential solutions to health
communication crises, which ultimately may also contribute to diminishing misinfor-
mation waves regarding broader topics of public opinion and interest.
In the pursuit of efficient tools to challenge and contest misinformation in digital
and social media, Jang et al. (2019) present a study of how citizens can be well
positioned to face, identify, and confront false information. In their study, different
types of media literacy interventions are tested to observe whether participants' self-
reported literacy measures help them recognize fake news online. Among all the
possible competences that most citizens can develop, information literacy seems to be
the one that most efficaciously prevents the assimilation of pseudo-information among
individuals. As opposed to media literacy, news readership, or digital skills, informa-
tion literacy clearly helps individuals understand complex informational dilemmas or
quandaries, evaluate information, and then search for and find useful, factual evidence,
which in turn also helps them make more sophisticated and savvier use of the
overall information gathered. All in all, helping educate citizens to cultivate their
information literacy abilities may become a powerful tool in inoculating public opin-
ion against fake news and falsehoods.
The volume wraps up with a piece by Oh and Park (2019), who develop a machine
learning algorithm-based technique to detect misinformation embedded in deceptive
comments. The authors identify a pervasive problem across society: people are
simply not very good at detecting deception. This problem is particularly salient in the
distinction between fact-checked information and users’ opinions. Accordingly, the
authors develop an algorithm to make the distinction between truthful and deceptive
news comments. They do so in Korean, a language largely understudied in this area.
Furthermore, their proposed machine learning technique achieves a promising
accuracy rate in predicting and classifying untruthful opinions (fake comments) about
different social issues.
This special issue is about the information crisis that we live in today. We have assem-
bled 11 essays as the first volume to develop theoretical ground for building possible
solutions. We first look for theory-based propositions, supported where possible by
empirical tests that go beyond anecdotal or episodic snapshots. In doing so, we focus on two
main actors, media and publics, and their interplay as it relates to pseudo-information.
Media, both traditional mass media and emerging social media platforms, shape infor-
mation environments for publics and are shaped by publics’ communicative actions.
An understanding of these key actors will be necessary to combat the obstinacy of
pseudo-information.
In addition, this special issue looks toward a common body of knowledge about the
nature of pseudo-information (its birth, growth, and death) in our digital networked
society. Descriptive theorizing on the phenomena of misinformation and disinforma-
tion and their agents and actors helps devise prescriptive procedures to reduce their
prevalence among lay citizens, to make lay informatics more resistant to pseudo-infor-
mation, and to develop and test policies and institutional countermeasures against
social problems that arise from pseudo-information.
Still, we need more and better theoretical and empirical work defining pseudo-
information, publics, and media from thought leaders in the areas of communication,
public opinion, journalism, and other media fields, work that addresses the details
of conceptual challenges and unnoticed factors regarding these phenomena (see also
Bimber & Gil de Zúñiga, 2020; Kim, 2018).
All in all, in this special issue, we present theoretical bases about the life of pseudo-
information—its birth, growth, and death in the marketplace of ideas. We hope this
issue sparks deeper thinking among readers and all stakeholders, such as journalists,
researchers, and policy makers. Likewise, we hope this collection will prompt these
agents to engage in further and more meaningful communication about pseudo-
information and a failing digital marketplace of ideas. Finally, the articles in this vol-
ume can shed light on the intertwined roles of publics and media, both of which are
responsible for the crippling of lay informatics in our digital networked society.
Declaration of Conflicting Interests
The author(s) declared no potential conflicts of interest with respect to the research, authorship,
and/or publication of this article.
Funding
The author(s) received no financial support for the research, authorship, and/or publication of
this article.

Notes
1. We refer to information as bits of evaluated knowledge and data available and shared
among communicative actors in a problematic situation (Kim & Grunig, 2011; Kim &
Krishna, 2014). Publics as a plural form refers to some problem- or issue-specific situ-
ational entities (vs. general population) who are epistemically motivated, communicatively
active, and create social dynamics (Grunig & Kim, 2017). Media refers to both traditional
and digital mass media and new social media based on people's networks.
References
Bimber, B., & Gil de Zúñiga, H. (2020). The unedited public sphere. New Media & Society,
22(4), 700-715. https://doi.org/10.1177/1461444819893980
Carter, R. F. (1972, September). A general system characteristic of systems in general [Paper pre-
sentation]. Far West Region Meeting, Society for General Systems Research, Portland, OR.
Chaffee, S. (1991). Explication. Sage.
Cerulus, L. (2020, April 29). How anti-5G anger sparked a wave of arson attacks. Politico.
Chiu, M. M., & Oh, Y. W. (2020). How fake news differs from personal lies. American Behavioral
Scientist. Advance online publication. https://doi.org/10.1177/0002764220910243
disinformation. (2020). In Merriam-Webster.com. https://www.merriam-webster.com/diction-
Do, A. (2020, July 5). “You started the corona!” As anti-Asian hate incidents explode, climbing
past 800, activists push for aid. Los Angeles Times. https://www.latimes.com/california/
Donovan, J., & Boyd, D. (2019). Stop the presses? Moving from strategic silence to strategic
amplification in a networked media ecosystem. American Behavioral Scientist. Advance
online publication. https://doi.org/10.1177/0002764219878229
Flood, A. (2016, November 15). 'Post-truth' named word of the year by Oxford Dictionaries.
The Guardian. https://www.theguardian.com/books/2016/nov/15/post-truth-named-word-
Gagliardone, I. (2018). World trends in freedom of expression and media development
2018: Global Report 2017/2018. https://www.researchgate.net/publication/322977599_
Gil de Zúñiga, H. (2015). Toward a European public sphere? The promise and perils of modern
democracy in the age of digital and social media. International Journal of Communication,
9, 3152-3160. https://ijoc.org/index.php/ijoc/article/view/4783
Grunig, J. E. (2003). Constructing public relations. In B. Dervin, S. H. Chaffee, & L. Foreman-
Wernet (Eds.), Communication, a different kind of horse race: Essays honoring Richard F.
Carter (pp. 85-115). Hampton Press.
Grunig, J. E. (2009). Paradigms of global public relations in an age of digitalisation. Prism, 6(2).
Grunig, J. E., & Kim, J-N. (2017). Publics approaches to health and risk message design and
processing. Oxford encyclopedia of health and risk message design and processing. https://
Ha, L. H., Perez, L. A., & Ray, R. (2019). Mapping recent development in scholarship on
fake news and misinformation 2008-2017: Disciplinary contribution, topics and impact.
American Behavioral Scientist. Advance online publication. https://doi.org/10.1177/
Jang, S. M., Mortensen, T., & Liu, J. (2019). Does media literacy help identification of fake
news? Information literacy helps, but other literacies don’t. American Behavioral Scientist.
Advance online publication. https://doi.org/10.1177/0002764219869406
Kim, J.-N. (2014). Lay consumer informatics and fast-choicism: Instant trust and prompt loy-
alty in digitalized, networked marketplace. Communication Insight, 3, 10-31. [Written in Korean]
Kim, J.-N. (2018). Digital networked information society and public health: Problems and
promises of networked health communication of lay publics. Health Communication,
33(1), 1-4. https://doi.org/10.1080/10410236.2016.1242039
Kim, J.-N., Grunig, J. E., & Ni, L. (2010). Reconceptualizing the communicative action of
publics: Acquisition, selection, and transmission of information in problematic situations.
International Journal of Strategic Communication, 4(2), 126-154.
Kim, J.-N., & Grunig, J. E. (2011). Problem solving and communicative action: A situa-
tional theory of problem solving. Journal of Communication, 61(1), 120-149. https://doi.
Kim, J.-N., & Grunig, J. E. (2019). Lost in informational paradise: Cognitive arrest to epistemic
inertia in problem solving. American Behavioral Scientist. Advance online publication.
Kim, J.-N., & Krishna, A. (2014). Publics and lay informatics: A review of the situational theory
of problem solving. Annals of the International Communication Association, 38(1), 71-105.
Kim, J.-N., Oh, Y. W., & Krishna, A. (2018). Justificatory information forefending in digital
age: Self-sealing informational conviction of risky health behavior. Health Communication,
33(1), 85-93. https://doi.org/10.1080/10410236.2016.1242040
Krishna, A., & Thompson, T. (2019). Misinformation about health: A review of health commu-
nication and misinformation scholarship. American Behavioral Scientist. Advance online
publication.
Lee, E. J., & Shin, S. Y. (2019). Mediated misinformation: Questions answered, more ques-
tions to ask. American Behavioral Scientist. Advance online publication. https://doi.
Lovari, A., Martino, V., & Righetti, N. (2020). Blurred shots: Investigating information crisis
around vaccination in Italy. American Behavioral Scientist. Advance online publication.
Marx, K. (1867). Das Kapital: Kritik der politischen Oekonomie: Der Produktionsprozess des
Kapitals (Vol. 1, 1st ed.) [Capital: A Critique of Political Economy. Volume I: The Process
of Capitalist Production]. Verlag von Otto Meissner. https://doi.org/10.3931/e-rara-25773
Mill, J. S. (1869). On liberty. Longman, Roberts & Green. www.bartleby.com/130/
misinformation. (2020). In Merriam-Webster.com. https://www.merriam-webster.com/diction-
Molina, M., Sundar, S., Le, T., & Lee, S. (2019). “Fake news” is not simply false information:
A concept explication and taxonomy of online content. American Behavioral Scientist.
Advance online publication. https://doi.org/10.1177/0002764219878224
Oh, Y. W., & Park, C. H. (2019). Machine-cleaning of online opinion spam: Developing a
machine-learning algorithm for detecting deceptive comments. American Behavioral
Scientist. Advance online publication. https://doi.org/10.1177/0002764219878238
Stanley, J. (2018, September 4). What John Stuart Mill got wrong about freedom of speech.
Boston Review. http://bostonreview.net/politics-philosophy-religion/jason-stanley-what-
Stearns, J. (2016, September 30). Why do people share rumours and misinformation in breaking
news? First Draft. https://firstdraftnews.com/people-share-misinformation-rumors-online-
van Mill, D. (2018). Freedom of speech. In E. N. Zalta (Ed.), The Stanford encyclopedia of phi-
losophy (Summer 2018 ed.). https://plato.stanford.edu/archives/sum2018/entries/freedom-
Weeks, B., & Gil de Zúñiga, H. (2019). What’s next? Six observations for the future of politi-
cal misinformation research. American Behavioral Scientist. Advance online publication.
Jeong-Nam Kim is the Gaylord Family Endowed Chair in Strategic Communications and the
founder of the Debiasing and Lay Informatics (DaLI) Lab. Kim has studied public relations
and communication, focusing on "public behavior." Kim constructed the situational theory of
problem solving (STOPS) with James E. Grunig, and the situational theory has been applied to
multiple fields and disciplines across the globe. His work identifies both opportunities and chal-
lenges generated by lay publics and how their communicative actions either contribute to or
detract from a civil society.
Homero Gil de Zúñiga holds a PhD in Politics from Universidad Europea de Madrid and a
PhD in Mass Communication from the University of Wisconsin–Madison. He serves as
Distinguished Research Professor
at University of Salamanca where he directs the Democracy Research Unit (DRU), as Professor at
Pennsylvania State University, and as Senior Research Fellow at Universidad Diego Portales,
Chile. His research addresses the influence of new technologies and digital media over people’s
daily lives, as well as the effect of such use on the overall democratic process. He has published
nearly a dozen books/volumes and over 100 JCR peer-reviewed journal articles (e.g., Journal
of Communication, Journal of Computer-Mediated Communication, Political Communication,
Human Communication Research, New Media & Society, and Communication Research).