
Extreme opponents of genetically modified foods know the least but think they know the most


Abstract

There is widespread agreement among scientists that genetically modified foods are safe to consume and have the potential to provide substantial benefits to humankind. However, many people still harbour concerns about them or oppose their use. In a nationally representative sample of US adults, we find that as extremity of opposition to and concern about genetically modified foods increases, objective knowledge about science and genetics decreases, but perceived understanding of genetically modified foods increases. Extreme opponents know the least, but think they know the most. Moreover, the relationship between self-assessed and objective knowledge shifts from positive to negative at high levels of opposition. Similar results were obtained in a parallel study with representative samples from the United States, France and Germany, and in a study testing attitudes about a medical application of genetic engineering technology (gene therapy). This pattern did not emerge, however, for attitudes and beliefs about climate change.
1Leeds School of Business, University of Colorado, Boulder, CO, USA. 2Olin Business School, Washington University in St. Louis, St. Louis, MO, USA. 3Department of Psychology, University of Toronto, Toronto, Ontario, Canada. 4Department of Psychology, University of Pennsylvania, Philadelphia, PA, USA. *e-mail:
Genetically modified (GM) foods are judged by the majority of scientists to be as safe for human consumption as conventionally grown foods1,2, and have the potential to provide substantial benefits to humankind, such as increased nutritional content, higher yield per acre, better shelf life and crop disease resistance3—yet there is substantial public opposition to their use around the world4,5. In the United States, a poll by the Pew Research Center found that 88% of scientists thought GM foods were safe to eat, while only 37% of laypeople thought so, the largest gap for any of the issues tested6. Public opposition to science is often attributed to a lack of knowledge7–9. However, findings on the association between knowledge and attitudes about GM foods are mixed, and there is little evidence that educational interventions can meaningfully change public attitudes10,11. Sometimes, they even backfire12,13.
While research on opposition to GM foods has primarily focused on what people actually know, it is also important to consider what they think they know14,15. Self-assessed knowledge is a strong predictor of attitudes, and people tend to be poor judges of how much they know16. They often suffer from an illusion of knowledge, thinking that they understand everything from common household objects to complex social policies better than they do17. This is why people's sense of understanding decreases when they try to generate explanations18, and why novices are poorer at evaluating their talents than experts19. Gaining knowledge in a domain often has the effect of revealing nuance and complexity, hence reducing extremity of belief20,21. These results suggest that extreme attitudes sometimes reflect low objective knowledge paired with high self-assessed knowledge22,23. We examined the relationships between extremity of opposition to GM foods, objective knowledge about science and genetics, and self-assessed knowledge about GM foods. We hypothesize that extremists will display low objective knowledge but high subjective knowledge, and that the gap between the two will grow with extremity.
In Study 1, we surveyed a sample of US adults (N = 1,000) representative of the population for gender, education, income and ethnicity. Hypotheses and analysis plans were pre-registered before data collection. Participants were assigned to a study about either GM foods (N = 501) or climate change (N = 499). We first present methods and results for GM foods, then climate change.
In the GM food study (mean age (Mage) = 51.1 yr; 56.7% female), participants were first asked two questions to measure attitudes: extremity of opposition to GM foods (1 = no opposition; 7 = extreme opposition) and concern (1 = no concern; 7 = extreme concern). Overall, 90.82% of respondents reported some level of opposition to GM foods and 93.01% reported some level of concern. Responses to these two questions were highly correlated (coefficient of correlation (r) = 0.88; P < 0.0001; N = 501), and we averaged them to form a measure that we call 'extremity of opposition' for the main analyses. Consistent with previous research, there were no significant differences in extremity of opposition between self-reported liberals, moderates and conservatives5,24 (see Supplementary Information for complete details of all methods and analyses not reported in the main text).
Next, participants were asked to judge their understanding of GM foods ('self-assessed knowledge'), using instructions and a single-item rating scale adapted from the cognitive science literature18. Finally, we measured scientific literacy ('objective knowledge') with 15 true–false questions adapted from the National Science Foundation's Science and Engineering Indicators survey25, the American Association for the Advancement of Science Benchmarks for Science Literacy26 and recent work on the public understanding of science27–29 (for example, "Electrons are smaller than atoms"). We measured responses to the objective knowledge questions on a 7-point scale anchored by 'definitely true' and 'definitely false'. Participants were given −3 to 3 points depending on correctness. For example, when a participant chose 'definitely true', they received 3 points if the correct answer was 'true', and −3 points if the correct answer was 'false'. We summed points across all questions to measure scientific literacy. For robustness, we replicated all analyses after binarizing the scale, treating scores of 1 to 3 as correct and scores of −3 to 0 as incorrect.
Five of the items in the scientific literacy scale refer to genetics (for example, "All plants and animals have DNA"). We summed responses to these items to create a genetics literacy subscale. For robustness, we also replicated the analyses after removing the genetics questions from the scientific literacy scale.
Philip M. Fernbach1*, Nicholas Light1, Sydney E. Scott2, Yoel Inbar3 and Paul Rozin4
NATURE HUMAN BEHAVIOUR | VOL 3 | MARCH 2019 | 251–256
... More educated [7,38], wealthier [38,39], less religious [40], and more liberal participants [37,39] would rate genetic technologies as morally better, other things being equal. Moreover, ethical approval of genetic technologies would be greater when participants know more about those technologies [3,41]. For example, people well versed in genetic technologies will understand the different implications embryonic vs. adult gene editing will have on the body and if it will or will not affect future offspring. ...
... For example, people well versed in genetic technologies will understand the different implications embryonic vs. adult gene editing will have on the body and if it will or will not affect future offspring. Relatedly, expecting that a finding for genetically modified foods generalises to genetic technologies, more extreme ethical judgments would align with greater presumed knowledge [41]. Participants were also expected to rate genetic technologies as morally better if they had prior exposure to genetic testing [38]. ...
... and recorded responses on a 7-point Likert scale ranging from "much less than others" to "much more than others." Participants then completed a short test of their genetic knowledge, taken from [41], which prompted them to assess claims like "It is the father's genes that decide whether the baby is a boy or a girl." ...
Background Policy regulations of ethically controversial genetic technologies should, on the one hand, be based on ethical principles. On the other hand, they should be socially acceptable to ensure implementation. In addition, they should align with ethical theory. Yet to date we lack a reliable and valid scale to measure the relevant ethical judgements in laypeople. We target this lacuna. Methods We developed a scale based on ethical principles to elicit lay judgments: the Genetic Technologies Questionnaire (GTQ). In two pilot studies and a pre-registered main study, we validated the scale in a representative sample of the US population. Results The final version of the scale contains 20 items but remains highly reliable even when reduced to five. It also predicts behaviour; for example, ethical judgments as measured by the GTQ predicted hypothetical donations and grocery shopping. In addition, the GTQ may be of interest to policymakers and ethicists because it reveals coherent and ethically justified judgments in laypeople. For instance, the GTQ indicates that ethical judgments are sensitive to possible benefits and harms (in line with utilitarian ethics), but also to ethical principles such as the value of consent-autonomy. Conclusions The GTQ can be recommended for research in both experimental psychology and applied ethics, as well as a tool for ethically and empirically informed policymaking.
... Based on the information-deficit model, the present study investigated the role of scientific literacy on attitudes toward COVID-19 vaccines and preventive behaviors. Scientific literacy is a cognitive factor depicting objective knowledge of science (Fernbach et al., 2019), though there have been debates on the precise definition (Miller, 1983). The information-deficit model assumes that a lack of scientific literacy contributes to negative attitudes toward science (e.g., Bak, 2001). ...
... The information-deficit model assumes that a lack of scientific literacy contributes to negative attitudes toward science (e.g., Bak, 2001). Relevant to the present study, a line of research has shown that scientific literacy is associated with attitudes toward biotechnology-related topics (Rutjens et al., 2018;Fernbach et al., 2019;McPhetres et al., 2019). For example, those who have lower scores on scientific literacy tend to show negative attitudes toward vaccines (Rutjens et al., 2018). ...
... The degree of scientific literacy was measured by objective knowledge about science (Fernbach et al., 2019). Participants asked 15 true−false questions on scientific literacy (e.g., "Electrons are smaller than atoms") adapted from Fernbach et al. (2019). ...
... The existing research pays more attention to examining the complete information processing model of consumers when they understand GM food and decide to purchase it, and then explains the cognitive style of people's attitude toward GM [7,8]. For example, some researchers hold that people tend to oppose GM products because they lack sufficient scientific knowledge but believe they have a wealth of knowledge about GM food [9]. Similar findings were also found in China, showing that consumers do not trust the claims made by the government and scientists, which constitutes a reason people do not choose genetically modified foods [6]. ...
... Numerous previous studies have looked at the trade-off between risk and benefit [11][12][13][14][15] or the impact of knowledge [9,10] on individuals' acceptance of GM foods. These studies help us better understand people's psychological processes when buying GM foods, and perceived risk is considered to be one of the most critical factors to predict people's attitude toward GM products. ...
Based on compensatory control theory, the aim of this study was to examine the effects of perceived control on people's acceptance of genetically modified (GM) foods by using both correlational and experimental methods. Compensatory control theory proposes that the lower an individual's perceived control, the higher their need for structure, order, and certainty. Therefore, based on beliefs about GM foods that make some people less certain that those foods are as safe as traditional foods, we hypothesized that individuals with lower levels of perceived control are more inclined to reject GM foods. The analysis of questionnaire responses in Study 1 revealed that individuals' sense of control negatively predicted their risk perception of GM foods, while the need for structure played a mediating role. In Study 2, using a between-subject design, we manipulated participants' perceived control (higher vs. lower) and subsequently measured their risk perception and purchasing preferences for GM foods. The results in Study 2 show that under lower control conditions, individuals recognize higher risks related to GM foods, which, in turn, decreases their willingness to purchase GM foods. These results not only suggest that perceived control is a potential influential personal factor of the acceptance of GM foods but also extend the scope of the application of compensatory control theory.
... This bias may have real consequences for cases of acute misinformation, as a recent study found that a Dunning-Kruger effect for autism knowledge predicted opposition to mandatory vaccinations (Motta et al., 2018). In a related finding, self-assessments of knowledge are inversely correlated with actual knowledge and support for the scientific consensus on GMO foods (Fernbach et al., 2019). ...
This article identifies two major criticisms of simulation, the problem of representation—how similar is the model to reality?—and the problem of expert opinions—what role can be ascribed to expert opinions? In the late 1950s, philosophers Olaf Helmer and Nicholas Rescher proposed an epistemology that purportedly provided the foundation for understanding simulation and comparable predictive endeavors as scientific. Published in 1959 as “On the Epistemology of the Inexact Sciences,” it claimed that expert opinions should be acknowledged on the same epistemological level as theories or data, provided (1) that they come from experts that are carefully and transparently selected; (2) that the expert opinions are collected as reactions to a given set of empirical evidence; and (3) that the participating experts can contribute to deciding what exactly enters the set of empirical evidence. Doing so would solve the problem of expert opinion. While this proposal was innovative and soundly argued, it was not much received. This article argues that it is worthwhile to reconsider Helmer and Rescher’s proposal, as it provides a strategic position from which to re-conceptualize the use of expert opinions in simulation studies.
Public attitudes that are in opposition to scientific consensus can be disastrous and include rejection of vaccines and opposition to climate change mitigation policies. Five studies examine the interrelationships between opposition to expert consensus on controversial scientific issues, how much people actually know about these issues, and how much they think they know. Across seven critical issues that enjoy substantial scientific consensus, as well as attitudes toward COVID-19 vaccines and mitigation measures like mask wearing and social distancing, results indicate that those with the highest levels of opposition have the lowest levels of objective knowledge but the highest levels of subjective knowledge. Implications for scientists, policymakers, and science communicators are discussed.
With the continuous development of automatic driving technology and advancements in related experimental research, the probability of traffic accidents caused by human factors has been greatly reduced. However, people are still cautious about the safety of automated driving technology. The purpose of this study was to investigate users’ perceived safety indicators and the psychological factors of their perceived safety judgment of self-driving buses. In this study, a structural model of the factors that influence self-driving buses, including behavioral intention of technology acceptance, trust theory, perceived risk, and perceived safety, was developed based on the technology acceptance model (TAM). Subsequently, a relevant survey of 215 respondents was conducted and analyzed using the partial least squares method. The results indicated that trust, perceived usefulness, and perceived ease of use were important factors for judging the perceived safety of self-driving buses. The structural model developed in this study can quantify and analyze user data to filter out the factors that influence the perceived safety of self-driving buses, which is conducive to improving people’s trust and acceptance of self-driving buses.
Identifying and understanding the hesitancy degree of public COVID-19 vaccine in emergency may be helpful to the dissemination of vaccine-related public health information. Through a survey among the adult population of Chinese mainland (N = 1080) after the COVID-19 vaccine was approved for mass vaccination, it is found that although more than 80% of the public (87.8%) have a low hesitancy attitude towards COVID-19 vaccine, a considerable number of people still have a medium hesitancy and a high hesitancy attitude towards COVID-19 vaccine (the middle hesitancy rate is 9.8% and the high hesitancy rate is 2.4%). By multiple logistic regression, the subjective and objective knowledge levels of medium-high hesitancy group and low hesitancy group in COVID-19 vaccine were compared. The results showed that there were significant differences in subjective and objective knowledge levels between medium-high hesitancy group and low-hesitancy group in COVID-19 vaccine. Compared with those with low hesitancy, those with medium and high hesitancy have lower subjective knowledge level and objective knowledge level. The influence of subjective knowledge level on public vaccine hesitancy is significantly greater than that of objective knowledge. In addition, through multiple linear regression, the study found that the information channel had a significant impact on the public's subjective and objective knowledge. Receiving vaccine information from television, web pages, health professionals, health departments can promote subjective knowledge and objective knowledge, while receiving vaccine information from family and friends reduces subjective knowledge and objective knowledge. Considering the geographical location of the population in this study, the research results in this paper cannot be extended to the public in other countries. 
However, the method used in this paper is helpful for researchers to understand the hesitancy degree of COVID-19 vaccines in other places and its relationship with the public knowledge level of COVID-19 vaccines.
The rapid development of autonomous driving technology has attracted great attention from society nowadays. However, the lack of consumer acceptance might be a prominent barrier to the large-scale adoption of fully autonomous vehicles (FAVs). This study argues that it is critical to predicting FAV acceptance before it is fully popularised. To investigate the relationship between the public FAV subjective knowledge and general acceptance, we conducted an online questionnaire. The results showed that respondents with higher levels of FAV subjective knowledge were more likely to accept FAV. In addition, a significant moderating effect of trust was found. Specifically, in groups with higher level of trust, the same level of subjective knowledge evoked higher level of acceptance. In conclusion, the insights from this study could greatly facilitate ongoing research related to FAV acceptance. And policymakers should consider consumer characteristics, such as subjective knowledge and trust, when formulating AV promotion strategies, so as to effectively improve consumer acceptance of FAV.
One dilemma faced by policy makers is the choice between banning a harmful behavior and allowing the behavior to continue but with mitigated harm. This latter approach––a harm reduction strategy––is often efficacious, yet policies of this sort can be unpopular if people morally oppose the target behavior (MacCoun, 2013). This raises interesting questions for understanding how judgments of harmfulness relate to moral opposition. In four studies (N = 1090), including one U.S. representative sample, we found that increased moral opposition to risky sex, gun ownership, cigarette smoking (Studies 1–3), and unemployment (Study 4), was associated with less support for pre-exposure prophylaxis, gun safety training, e-cigarette use, and federal support respectively. However, there was variation across the issues across time, including when news broke of “vaping sickness” in 2019. Interestingly, judgments of harmfulness of both gun ownership and risky sexual behavior, though correlated with moral opposition, positively predict policy support, suggesting that it is possible to judge a behavior as harmful but otherwise acceptable, and in that case harm-reduction policy is also acceptable. Together, these results highlight the multi-faceted nature of moral opposition and its implications for real-world policy.
People often vote against the political establishment, as underscored by “Brexit” and the Trump election. The current contribution proposes that overclaiming one’s own knowledge predicts anti-establishment voting. We tested this idea in the context of a Dutch referendum on a European Union treaty with a clear pro- versus anti-establishment voting option. In a first wave (6 weeks before the referendum), Dutch citizens indicated their self-perceived understanding of the treaty, after which we tested their actual knowledge. We also measured participants’ general tendency to overclaim knowledge by assessing their familiarity with nonexisting stimuli. In a second wave shortly after the referendum, we asked participants what they had voted. Results revealed that increased self-perceived understanding yet decreased actual knowledge of the treaty, and general knowledge overclaiming, predicted an anti-establishment vote. Furthermore, these effects were most pronounced among right-wing extremists. We conclude that knowledge overclaiming predicts anti-establishment voting, particularly at the radical right.
Although Americans generally hold science in high regard and respect its findings, for some contested issues, such as the existence of anthropogenic climate change, public opinion is polarized along religious and political lines. We ask whether individuals with more general education and greater science knowledge, measured in terms of science education and science literacy, display more (or less) polarized beliefs on several such issues. We report secondary analyses of a nationally representative dataset (the General Social Survey), examining the predictors of beliefs regarding six potentially controversial issues. We find that beliefs are correlated with both political and religious identity for stem cell research, the Big Bang, and human evolution, and with political identity alone on climate change. Individuals with greater education, science education, and science literacy display more polarized beliefs on these issues. We find little evidence of political or religious polarization regarding nanotechnology and genetically modified foods. On all six topics, people who trust the scientific enterprise more are also more likely to accept its findings. We discuss the causal mechanisms that might underlie the correlation between education and identity-based polarization.
Both the public and scientists value the contributions of science, but there are large differences in how each perceives science issues. Both groups agree that K-12 STEM education falls behind other nations.
Objective: Although the benefits of vaccines are widely recognized by medical experts, public opinion about vaccination policies is mixed. We analyze public opinion about vaccination policies to assess whether Dunning-Kruger effects can help to explain anti-vaccination policy attitudes. Rationale: People low in autism awareness - that is, the knowledge of basic facts and dismissal of misinformation about autism - should be the most likely to think that they are better informed than medical experts about the causes of autism (a Dunning-Kruger effect). This "overconfidence" should be associated with decreased support for mandatory vaccination policies and skepticism about the role that medical professionals play in the policymaking process. Method: In an original survey of U.S. adults (N = 1310), we modeled self-reported overconfidence as a function of responses to a knowledge test about the causes of autism, and the endorsement of misinformation about a link between vaccines and autism. We then modeled anti-vaccination policy support and attitudes toward the role that experts play in the policymaking process as a function of overconfidence and the autism awareness indicators while controlling for potential confounding factors. Results: More than a third of respondents in our sample thought that they knew as much or more than doctors (36%) and scientists (34%) about the causes of autism. Our analysis indicates that this overconfidence is highest among those with low levels of knowledge about the causes of autism and those with high levels of misinformation endorsement. Further, our results suggest that this overconfidence is associated with opposition to mandatory vaccination policy. Overconfidence is also associated with increased support for the role that non-experts (e.g., celebrities) play in the policymaking process. 
Conclusion: Dunning-Kruger effects can help to explain public opposition to vaccination policies and should be carefully considered in future research on anti-vaccine policy attitudes.
Plant breeding is one of the oldest sustainable agriculture methods used to increase the yield, quality and other biomaterial for human use. Many crops like fruits, vegetables, ornamental flowers, shrubs and trees, have been long cultivated to satisfy human food and aesthetical needs. Conventional breeding technologies like selection, hybridization, mutation through physical and chemical methods, and modern transgenic approaches are often used to improve the desired traits without inducing the pleiotropic effects. But these breeding methods are highly laborious and complicated to enhance crop production. Recently, targeted genome editing through engineered nuclease including zinc finger nuclease, transcription activator like effector nuclease and clustered regularly interspaced short palindromic repeats (CRISPRs) have been used to improve various traits in plants. Genome editing has emerged as a novel alternative approach to classical breeding with higher mutagenic efficiency. Here, we briefly cover the strengths of CRISPRs in comparison with other genome editing techniques. We also discuss its potential applications in genetic improvement of various crops and future prospective.
Public opposition to genetic modification (GM) technology in the food domain is widespread (Frewer et al., 2013). In a survey of U.S. residents representative of the population on gender, age, and income, 64% opposed GM, and 71% of GM opponents (45% of the entire sample) were “absolutely” opposed—that is, they agreed that GM should be prohibited no matter the risks and benefits. “Absolutist” opponents were more disgust sensitive in general and more disgusted by the consumption of genetically modified food than were non-absolutist opponents or supporters. Furthermore, disgust predicted support for legal restrictions on genetically modified foods, even after controlling for explicit risk–benefit assessments. This research suggests that many opponents are evidence insensitive and will not be influenced by arguments about risks and benefits.
Science communication has been historically predicated on the knowledge deficit model. Yet, empirical research has shown that public communication of science is more complex than what the knowledge deficit model suggests. In this essay, we pose four lines of reasoning and present empirical data for why we believe the deficit model still persists in public communication of science. First, we posit that scientists’ training results in the belief that public audiences can and do process information in a rational manner. Second, the persistence of this model may be a product of current institutional structures. Many graduate education programs in science, technology, engineering, and math (STEM) fields generally lack formal training in public communication. We offer empirical evidence that demonstrates that scientists who have less positive attitudes toward the social sciences are more likely to adhere to the knowledge deficit model of science communication. Third, we present empirical evidence of how scientists conceptualize “the public” and link this to attitudes toward the deficit model. We find that perceiving a knowledge deficit in the public is closely tied to scientists’ perceptions of the individuals who comprise the public. Finally, we argue that the knowledge deficit model is perpetuated because it can easily influence public policy for science issues. We propose some ways to uproot the deficit model and move toward more effective science communication efforts, which include training scientists in communication methods grounded in social science research and using approaches that engage community members around scientific issues.
Recent growth in the number of studies examining belief in climate change is a positive development, but presents an ironic challenge in that it can be difficult for academics, practitioners and policy makers to keep pace. As a response to this challenge, we report on a meta-analysis of the correlates of belief in climate change. Twenty-seven variables were examined by synthesizing 25 polls and 171 academic studies across 56 nations. Two broad conclusions emerged. First, many intuitively appealing variables (such as education, sex, subjective knowledge, and experience of extreme weather events) were overshadowed in predictive power by values, ideologies, worldviews and political orientation. Second, climate change beliefs have only a small to moderate effect on the extent to which people are willing to act in climate-friendly ways. Implications for converting sceptics to the climate change cause-and for converting believers' intentions into action-are discussed.
Of this article's seven experiments, the first five demonstrate that virtually no Americans know the basic global warming mechanism. Fortunately, Experiments 2-5 found that 2-45 min of physical-chemical climate instruction durably increased such understandings. This mechanistic learning, or merely receiving seven highly germane statistical facts (Experiment 6), also increased climate-change acceptance-across the liberal-conservative spectrum. However, Experiment 7's misleading statistics decreased such acceptance (and dramatically, knowledge-confidence). These readily available attitudinal and conceptual changes through scientific information disconfirm what we term "stasis theory"-which some researchers and many laypeople varyingly maintain. Stasis theory subsumes the claim that informing people (particularly Americans) about climate science may be largely futile or even counterproductive-a view that appears historically naïve, suffers from range restrictions (e.g., near-zero mechanistic knowledge), and/or misinterprets some polarization and (noncausal) correlational data. Our studies evidenced no polarizations. Finally, we introduce a website designed to directly enhance public "climate-change cognition."