Avoiding cognitive biases: Promoting good decision making in research
methods courses
Paul Stapleton
Near-final version of an article appearing in Teaching in Higher Education
<https://doi.org/10.1080/13562517.2018.1557137>
Abstract
The content of research methods courses in tertiary education, as revealed by popular textbooks, tends to focus largely on procedural aspects of research investigations.
Although following proper procedures is essential for reaching valid and reliable
outcomes, there is another attribute of good quality research that is often missing from
research methods courses and textbooks: appropriate decision making and the avoidance
of cognitive biases, such as confirmation bias and base rate neglect. In this essay, some of
the more common cognitive biases are highlighted with examples from everyday life in
order to illustrate how these biases can influence decision-making processes when
reading about or conducting research investigations.
Introduction
I recently read a newspaper report about a study whose results indicated that school-aged children who eat breakfast receive higher grades than those who do not. I then
showed this news story to a group of first-year doctoral students (in education) and asked
them to try to explain the causes behind the research results. Unsurprisingly, most
students focused on the nutritional value of breakfast and the need for calories to
maintain blood sugar levels, and by association the ability to concentrate in class. So far
so good. But what the media did not report was that children who do not eat breakfast are more likely to come from single-parent families where the parent has less time to help with homework, less money for after-school tutoring, and fewer of the other family and socio-economic resources that lead to good school performance among children. In other words, eating breakfast is less a nutritional benefit than a proxy for a caring and economically stable family environment, which in turn leads to better academic results.
Although in this case most of my students went in the right direction towards supplying a
causal reason, they appeared to be missing the forest for the trees. Uncovering deeper
causes and confounding effects is a critically important part of any analysis of data as
illustrated through simple examples such as this one. However, jumping to the most obvious causes, a tendency referred to as the “availability heuristic” (Tversky and Kahneman 1974), and confusing correlation with causation are common cognitive errors in research studies as well as in everyday life. Thus, cognitive biases, i.e., mistakes in reasoning related to
holding onto personal preferences and beliefs despite contrary information, can have an
important influence on data analysis when conducting research, but they tend to be
overlooked by research methods textbooks and courses that use these books. And while it
is understood that research methods textbooks and courses cannot possibly cover the
myriad elements involved in research processes, such as dealing with unique social
contexts, or the special interests sometimes served by research, this essay takes the
position that a discussion of cognitive biases should become an integral part of research
methods courses in higher education.
Research methods courses are a common fixture in master's- and doctoral-level programs.
One recent survey, for example, reveals that among master’s programs in language
teaching, research methods courses are the fifth most common among core courses
(Stapleton and Shao 2018). These courses are perceived as necessary for graduate students entering
into a community of scholarship where they must understand and critically assess the
course readings, which often consist of journal articles describing empirical studies. At
the doctoral level or in master’s programs that require a dissertation, the content of
research methods courses also provides a grounding for those about to embark on their
own scholarly investigations. Thus, master's- and doctoral-level programs typically include
research methods courses as part of their degree requirements. In this paper, it is the
content of these courses that is of interest because, as I will argue, the methods used in
scientific inquiry are multifaceted, involving not only the common operational elements
(such as how to design surveys or perform statistical tests) focused on in these courses,
but also cognitive aspects related to decision making that are not always given the
attention they deserve.
One way to uncover the content taught in these courses is to review popular research
methods textbooks. A preliminary search of a few frequently cited textbooks (based on Google Scholar (GS) citation counts) in the broad area of research methods in education (Cohen, Manion and Morrison 2011, GS 37k; Creswell 2012, GS 29k; Denzin and Lincoln 2003, GS 32k; Miles and Huberman 1994, GS 83k) and in my own area, applied linguistics (Dörnyei 2007, GS 4k; Mackey and Gass 2005, GS 2.4k), reveals that none has chapters or sections directly addressing cognitive biases. Rather, in each case,
following an introductory chapter or two discussing broad elements of research, the focus
of each of these textbooks is on procedural elements of research investigations. And while this search is not meant to be exhaustive, these books may well be representative of most research methods textbooks in lacking discussions about how to make good decisions while avoiding cognitive biases common to all humans.
Research methods courses and their associated textbooks certainly do not neglect
discussions about the importance of rigor. These courses, at least as reflected by their textbooks, typically introduce and explain at length the many threats to reliability and validity in both quantitative and qualitative studies so that students understand that procedures are subject to many pitfalls when research does not follow proper protocols. These lessons cannot be learned overnight because there are so many ways things can go wrong if a study is not performed according to established norms, and those norms appear to be growing increasingly rigorous. For example, in quantitative studies,
effect size, not often mentioned in social science-related publications a decade or two
ago, is demanded by more reviewers now (Lindstromberg 2016). Similarly, in qualitative
studies involving coding, reviewers increasingly want to see independent coding with
inter-rater reliability scores, member checking and audit trails.
Although these textbooks, which presumably reflect the content of research methods
courses, contain the necessary elements and procedures for graduate students to apply to
their own critical reading and investigations, curiously, often missing are elements of
human decision making that play an essential role in analyzing data. It is true that some
textbooks, and presumably courses, do include discussions of research traditions that explain the nature of inquiry (see Seliger and Shohamy 1989). Cohen, Manion
and Morrison (2011) have an excellent chapter on searching for causation, which has
parallels with the issues discussed in this paper; Miles and Huberman (1994) also have a
section on causal chains and networks; however, these chapters and sections are not
framed in terms of cognitive biases. The reason for this exclusion, or perhaps it is better
described as an absence, is that our cognitive biases may be viewed as falling outside of
the domain of what is typically considered “research methods.” A deeper look at some of
these cognitive biases, however, suggests that research methods courses could benefit
from an inclusion of some common thinking biases and errors because they are so closely
tied to scientific inquiry.
In the past decade, a number of books written for a general audience focusing on decision
making and the avoidance of cognitive errors have been published, perhaps the most
well-known of which is “Thinking, Fast and Slow” by Nobel laureate (Economics) Daniel
Kahneman (2011), largely based on his work with Amos Tversky in cognitive
psychology. In their collaborative research over roughly 14 years beginning in the early 1970s, richly described in biographical terms by Michael Lewis in “The Undoing Project” (2016), they demonstrated through various experiments numerous types of irrational behavior exhibited by a majority of people. The influence that their work, along with that of others in the fields of behavioral economics and cognitive psychology, has had on our understanding of human behavior has been profound and may be deserving of
closer attention from research methods courses, which, as noted above, tend to focus on
procedural techniques. Unlike the operational skills required for ensuring research rigor,
analytical thinking is needed to generate both accurate and innovative outcomes;
however, as revealed by Tversky and Kahneman (1974), human intuition can often lead
lay people and researchers alike into reaching flawed conclusions.
Thus, I would like to suggest that research methods courses at the graduate level add
another layer of content and instruction dealing with common biases and heuristics that could overlay the current, largely operationally oriented material. Such instruction could be
represented by a chapter in a general research methods textbook, or perhaps a lecture or
two in a course, and would serve as a foundational part of research methods courses. The
purpose would be to draw attention to some very common cognitive biases and related
elements that can inhibit creativity or even lead to flawed conclusions.
In a course I co-teach at the doctoral level called “Conceptualizing Research,” I introduce
a few of the more common cognitive biases, some of which were uncovered by Tversky
and Kahneman. I then try to link these to the students’ broader research processes in the
hope that they will have a heightened awareness of the natural pitfalls that can lead them
down paths that are either excessively narrow or even flawed. In each case, I begin with a
story or example from the non-academic world and then attempt to draw wider
implications about the world of research.
Cognitive elements
Confirmation bias
When I was a child, for many years my mother was a nurse working the evening shift on
the psychiatric ward in a hospital. Once in a while, I remember her leaving in the early
evening and remarking, “Full moon tonight; we’ll be busy.” A common belief among
nurses on that ward, and to some extent the general public, it seems, was that the lunar
cycle somehow had an effect on psychiatric hospital admissions. Although such a belief
has been completely debunked (Gorvin and Roberts 1994), its emergence most likely had its origins in actual occurrences when psychiatric ward admissions coincided with a cloudless night when the
moon was full. The belief may then have been reinforced over the years when occasional instances of unusually high admissions happened to coincide with the full moon. However, as Gilbert (1991) explains, humans understand a statement by first
attempting to believe it, after which we decide whether or not to disbelieve it. Thus, in the
case of the lunar cycle belief, there is an immediate tendency to first recall instances from
memory to support or confirm the belief and use those as evidence. This cognitive
process that selectively recalls information that supports existing beliefs is called
“confirmation bias.” Hence, the nurses probably ignored the many instances when a
normal number of admissions coincided with the full moon because “normal” is less
noticeable; plus, it would not have reinforced their existing beliefs.
To illustrate the power of confirmation bias, I have often played a little game with my
students. At the beginning of my lecture on cognitive biases, I ask students to identify a number from one to 12 that I have placed next to their sun sign (according to their month of birth), e.g., #10 for Scorpio, and then invite them to come to the front of the class to pick
up the piece of paper based on their number. On the paper is a version of the following
description from Forer (1949) that I have led them to believe is an individualized
depiction of their character based on their horoscope. However, in fact, all students
receive the same description on their numbered papers.
Some of your aspirations tend to be pretty unrealistic. At times you
are extroverted, affable, sociable, while at other times you are
introverted, wary, and reserved. You have found it unwise to be too
frank in revealing yourself to others. You pride yourself as being an
independent thinker and do not accept others' opinions without
satisfactory proof. You prefer a certain amount of change and
variety and become dissatisfied when hemmed in by restrictions
and limitations. At times you have serious doubts as to whether you
have made the right decision or done the right thing. Disciplined
and controlled on the outside, you tend to be worrisome and
insecure on the inside. Your sexual adjustment has presented some
problems for you. While you have some personality weaknesses,
you are generally able to compensate for them. You have a great
deal of unused capacity which you have not turned to your
advantage. You have a tendency to be critical of yourself. You have
a strong need for other people to like you and for them to admire
you.
Variations of this ploy have been tried over the past few decades, often with students, and
the results tend to be uniform. Before disclosing the trick, most students, after reading the
description, have a strong tendency to attribute some accuracy to it based on their own
self-perceptions. I have tried this ploy over a dozen times and have found (based on the reaction from students before revealing the trick) that most of them attribute some
deeper meaning to astrology. When asked to debunk the exercise, most students ascribe
their belief to the fact that the explanation is so broad that it could apply to anyone, which
of course is correct. However, they miss an even more important notion, which is related
to their own cognitive biases. When reading through the description, they unconsciously collect statements that confirm their pre-existing beliefs about themselves, and as this cherry-picking piles up, the belief is reinforced. In the meantime, character traits that do not align with their own beliefs tend to be ignored.
Confirmation bias may be the single most difficult cognitive bias to overcome, both in everyday life and when we analyze our research data. Normally, we have some
expectation about what story the results of our studies will tell. Upon crunching the
numbers or coding the data, we hope that our bottom line reveals statistical significance,
or its qualitative equivalent, in order to confirm our preconceived beliefs; and if it
doesn’t, there is a temptation to revisit the data or use a different test to wedge it into
those beliefs, especially if the results are not too far off of them. Even in qualitative
research where patterns are supposed to emerge from the data, it is difficult not to have
preconceived notions about where the data is heading. Knowledge about, and heightened awareness of, our proclivities towards confirmation bias can encourage graduate students
to look more deeply at their results and consider alternative explanations rather than
cherry picking a choice quote, especially when results are not aligned with their current
beliefs or expectations. A deep awareness of our tendencies towards confirmation bias
can alert us to the significance of outliers, which also may be indicators of opportunities
leading to new discoveries and creative explanations.
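The temptation to revisit the data until they confirm an expectation can also be shown numerically. The sketch below (a hypothetical Python illustration; the sample size and the number of re-analyses are invented) regroups pure noise until an apparently "significant" group difference emerges, which is essentially what hunting for a supportive test amounts to.

import random
import statistics

# Pure noise: 40 measurements containing no real effect anywhere.
random.seed(3)
data = [random.gauss(0, 1) for _ in range(40)]

def t_stat(a, b):
    # Welch-style t statistic for two independent groups.
    se = (statistics.variance(a) / len(a) + statistics.variance(b) / len(b)) ** 0.5
    return abs(statistics.mean(a) - statistics.mean(b)) / se

# Re-analyze the same data under 200 different post-hoc groupings,
# keeping the most "significant" result found.
best = 0.0
for _ in range(200):
    random.shuffle(data)
    best = max(best, t_stat(data[:20], data[20:]))

print(f"Largest |t| found in pure noise: {best:.2f}")  # usually well above 2

With enough post-hoc regroupings, some split of random data almost always crosses the conventional threshold of about 2, despite there being no effect to find.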
Pattern seeking, randomness and regression to the mean
To illustrate the next cognitive bias, I ask students to imagine the gender sequence of six
babies born in a hospital (Kahneman 2011) as follows:
1. BBBGGG
2. GGGGGG
3. BGBBGB
I then ask them to rank the likelihood of each sequence. The usual response is that #3 is
the most likely (although I should disclose that my students are much warier of following
their intuition if I have deliberately misled them earlier in a lecture). Because we are all
aware that our sex at birth is random, #2 appears to be non-random, and this non-
randomness automatically triggers the human mind to seek a cause. “We are pattern
seekers, believers in a coherent world, in which regularities (such as a sequence of six
girls) appear not by accident but as a result of mechanical causality or of someone’s
intention” (Kahneman 2011, 115).
With a little high school arithmetic, we realize that all three sequences are equally likely: with two equally probable outcomes at each of six births, there are 2 × 2 × 2 × 2 × 2 × 2 = 64 possible sequences, so any specific sequence, including “GGGGGG,” happens on average once in every 64 sets of six births, just as #1 and #3 do.
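The arithmetic is easy to verify empirically. The short Python sketch below (a simple illustration, not part of the original lecture materials) draws a large number of random six-birth sequences and shows that all three sequences above occur at close to the expected rate of 1/64 ≈ 0.0156.

import random

# Estimate the probability of each specific six-birth sequence by
# simulation; each should appear about 1/64 (~1.56%) of the time.
sequences = ["BBBGGG", "GGGGGG", "BGBBGB"]
trials = 100_000
counts = {s: 0 for s in sequences}

for _ in range(trials):
    draw = "".join(random.choice("BG") for _ in range(6))
    if draw in counts:
        counts[draw] += 1

for s in sequences:
    print(f"{s}: {counts[s] / trials:.4f} (expected {1 / 64:.4f})")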
Just as humans have proclivities to cherry-pick evidence that confirms our beliefs, we
also have a strong tendency to look for causes when we see patterns. However, seldom do
we attribute the observed pattern to random occurrences. Basically, humans are hard-
wired to seek causality because it brought evolutionary advantages. (That rustling in the
grass is probably just the wind, but best to jump away anyway in case it’s a snake). This
is a particularly important notion for graduate students to understand because they tend not to be funded, and therefore work with smaller sample sizes, which are more likely to produce spurious patterns purely by chance.
While randomness is a phenomenon that can explain some occurrences, another common causal factor that receives less consideration than it should is regression to the mean, the tendency for extreme measurements of a variable to be followed by measurements closer to the average. However, regression to the mean tends to be ignored as a possible cause because it provokes little excitement and holds no sexy appeal for the many pedagogy-related studies that involve interventions. Like randomness, regression to the mean is a naturally occurring phenomenon unrelated to the independent variables that are introduced in many studies.
In a classroom activity, in order to provide an everyday example of regression to the
mean, I ask my students to open their YouTube app and view the top tracks from Psy, a
Korean singer, and count the “views” of his top several songs. The worldwide
phenomenon, “Gangnam Style,” clearly stands out as an aberration, while his later hits
have regressed to his mean.
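For readers who prefer numbers to music videos, regression to the mean is also easy to simulate. In the hypothetical Python sketch below (all values invented), two equally noisy tests measure the same stable ability; students selected for extreme scores on the first test land markedly closer to the group average on the second, with no intervention whatsoever.

import random

# Scores combine a stable "true" ability with measurement noise.
# Students selected for an extreme first score land closer to the
# overall average on a second, equally noisy test, with no treatment.
random.seed(1)
n = 10_000
ability = [random.gauss(50, 10) for _ in range(n)]
test1 = [a + random.gauss(0, 10) for a in ability]
test2 = [a + random.gauss(0, 10) for a in ability]

# Select the 5% of students with the lowest first-test scores, as a
# remedial-intervention study might.
cutoff = sorted(test1)[n // 20]
chosen = [i for i in range(n) if test1[i] <= cutoff]

mean1 = sum(test1[i] for i in chosen) / len(chosen)
mean2 = sum(test2[i] for i in chosen) / len(chosen)
print(f"Selected group, test 1: {mean1:.1f}")
print(f"Selected group, test 2: {mean2:.1f} (closer to 50, no treatment)")

An intervention study that selected its participants for extreme pre-test scores would observe exactly this improvement even if the treatment did nothing.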
Confusing correlation with causation while ignoring confounding effects
Closely related to our intuitive propensity to ignore randomness is confusing correlation
with causation as noted in the Introduction. Nowhere is this more common than in
medical studies that connect disease with diet. The media often report studies that purport to show that those who consume a certain food, e.g., avocados, olive oil, walnuts, blueberries or whatever other superfood is in fashion, have greater longevity. In other
words, a pattern or correlation between the consumption of a certain food and length of
life is found and causation is assumed. However, particularly regarding human health,
which is enormously complex, simple dietary choices seldom bring about specific health
effects. As an example of this complexity, the eating of avocados may also be associated
with access to fresh fruit, higher-income brackets, knowledge of nutrition and exercise,
etc., all of which have a bearing on health, which raises the issue of confounding effects
when analyzing results.
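A toy simulation can make such confounding explicit. In the hypothetical Python sketch below (all quantities invented), a hidden socio-economic variable drives both avocado eating and longevity, while avocado eating itself has no effect at all; the raw comparison still shows eaters living longer, and the gap largely disappears once the confounder is held roughly constant.

import random

# A hidden socio-economic variable (ses) raises both the chance of
# eating avocados and longevity; avocado eating itself does nothing.
random.seed(2)
rows = []
for _ in range(50_000):
    ses = random.random()                            # hidden confounder, 0-1
    eats_avocado = random.random() < ses             # more likely with high ses
    longevity = 70 + 15 * ses + random.gauss(0, 5)   # ses drives longevity
    rows.append((ses, eats_avocado, longevity))

def mean_longevity(group):
    return sum(r[2] for r in group) / len(group)

eaters = [r for r in rows if r[1]]
others = [r for r in rows if not r[1]]
print(f"Raw gap: {mean_longevity(eaters) - mean_longevity(others):.2f} years")

# Hold the confounder roughly constant: within a narrow ses band,
# the apparent benefit of avocados vanishes.
band = [r for r in rows if 0.45 <= r[0] <= 0.55]
gap = mean_longevity([r for r in band if r[1]]) - mean_longevity([r for r in band if not r[1]])
print(f"Within-stratum gap: {gap:.2f} years")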
Another study similar to the one described in the Introduction (eating breakfast resulting
in high grades) also relates to hidden causes. In this one, I ask students to explain the
deeper causes for the results of a study that found elderly participants who played
mahjong and bet on horse races had lower rates of dementia. The surface takeaway from
this study seems clear. Playing mahjong or betting on the races are intensely cognitive
activities that can help delay or avoid dementia. However, it can also be argued that those
who are attracted to these activities have certain existing biological characteristics that
predispose them to avoid dementia. In other words, it may not be the activities that are working to help them avoid dementia, but rather their inherent cognitive abilities, which both lead them to enjoy these forms of entertainment and coincide with lower rates of dementia.
This principle can be extended to the types of studies graduate students conduct in their
research, e.g., experiments that purport to show the efficacy of a certain treatment over a
control. Often when the treatment results in some kind of improvement, especially when small samples are involved, researchers attribute primacy to the new treatment while downplaying or ignoring confounding effects. Thus, building an awareness about
correlation and causation requires repeated practice in order to develop a deep-seated
sense of caution when looking for patterns, while developing a healthy skepticism toward
the seduction of what appear to be the most available causes.
Representativeness, base rate and sample sizes
Most people have an intuitive understanding that larger sample sizes provide more
accurate readings of a measured criterion than smaller ones. Most also understand that
samples must be representative if they are meant to reflect characteristics of a larger
population. However, representativeness can have a deceptive side. The following quiz is
a version of one described by Kahneman (2011) that I introduce to my students.
Susan is 25 years old. Her hobbies are paragliding and scuba diving. She has had several
boyfriends and she has tattoos on her ankles and back. Which of the following is most likely her job?
1. A yoga instructor
2. A school teacher
3. A librarian
4. A race car driver
Generally speaking, only a small percentage of my students will choose a school teacher,
which is the correct answer. Instead, they will respond based on their perceived
representativeness of Susan’s character, which is more likely to match the image of the
other three professions rather than that of a school teacher. However, base rate informs us that there are most likely dozens of times more school teachers than yoga instructors, librarians and female race car drivers combined, meaning there is a high probability that Susan is a school teacher. It is true that representativeness can serve as a
modest indicator, but a much more powerful predictive force is base rate. However,
ascribing greater significance to representativeness while neglecting base rate, in other
words, failing to take statistical probabilities into account, is a very common cognitive
error that has implications during data analysis when looking to explain patterns or
reasons for certain results. This may be especially the case in studies that survey attitudes;
resulting analyses that invoke causative narratives based on representativeness tend to
provide much more compelling reading than an explanation that is based on base rate.
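The force of base rate can be made concrete with a back-of-the-envelope Bayes calculation. In the hypothetical Python sketch below, every job count and "fit" likelihood is invented for illustration; even when the description fits the stereotype of a yoga instructor ten times better than that of a school teacher, the sheer number of teachers dominates the posterior probability.

# Base rate versus representativeness as a simple Bayes calculation.
# All counts and likelihoods below are invented for illustration.
priors = {                      # assumed number of people in each job
    "school teacher": 3_000_000,
    "yoga instructor": 100_000,
    "librarian": 140_000,
    "race car driver": 2_000,
}
fit = {                         # assumed P(description | job): how well
    "school teacher": 0.02,     # Susan "fits" each job's stereotype
    "yoga instructor": 0.20,
    "librarian": 0.01,
    "race car driver": 0.30,
}

evidence = sum(priors[job] * fit[job] for job in priors)
for job in priors:
    posterior = priors[job] * fit[job] / evidence
    print(f"{job:16s} P = {posterior:.2f}")
# Teacher wins (~0.73) despite a 10x worse stereotype fit than a yoga
# instructor, because teachers vastly outnumber yoga instructors.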
Conclusion
The cognitive biases discussed here tend to be absent from research methods textbooks;
and while these phenomena may be deemed outside of the normal domain of coverage in
research methods courses, I argue here that they can have a significant influence on both the research process and the outcomes of a study. Because an intuitive response
tends to serve humans well, we can get lulled into a certain complacency even when
doing apparently rigorous research. This may occur for several reasons:
• rigorous adherence to proper research procedures is viewed as paramount, while good decision making receives less attention;
• statistical realities such as randomness, regression to the mean and base rate are banal and provide a much less compelling narrative than representativeness;
• confirmation bias acts as a strong force preventing confounding effects from challenging established beliefs and positions.
Because our cognitive biases typically become entrenched by early adulthood, simple didactic teaching or awareness-raising about them is insufficient. Repeated exposure to, and practice with, various research scenarios similar to the ones outlined throughout this essay are required to reduce our tendency to be attracted to the most obvious explanations and causes without considering less intuitive statistical reasons.
Finally, it should be apparent that many of these cognitive biases have connections with
each other, and collectively can have an impact on research outcomes. However, these
forces do not tend to be included in research methods courses.
They should be.
References
Cohen, Louis, Lawrence Manion, and Keith Morrison. 2011. Research Methods in Education. 7th ed. New York: Routledge.
Corbin, Juliet, and Anselm Strauss. 2015. Basics of Qualitative Research: Techniques and Procedures for Developing Grounded Theory. 4th ed. Thousand Oaks, CA: Sage.
Creswell, John W. 2012. Educational Research: Planning, Conducting, and Evaluating Quantitative and Qualitative Research. 4th ed. Boston, MA: Pearson.
Denzin, Norman K., and Yvonna S. Lincoln. 2003. The Landscape of Qualitative Research. 2nd ed. Thousand Oaks, CA: Sage.
Dörnyei, Zoltán. 2007. Research Methods in Applied Linguistics: Quantitative, Qualitative and Mixed Methodologies. Oxford: Oxford University Press.
Forer, Bertram R. 1949. “The Fallacy of Personal Validation: A Classroom Demonstration of Gullibility.” Journal of Abnormal and Social Psychology 44: 118-123.
Gilbert, Daniel T. 1991. “How Mental Systems Believe.” American Psychologist 46: 107-119.
Gorvin, J. J., and M. S. Roberts. 1994. “Lunar Phases and Psychiatric Hospital Admissions.” Psychological Reports 75: 1435-1440.
Kahneman, Daniel. 2011. Thinking, Fast and Slow. New York: Farrar, Straus and Giroux.
Lewis, Michael. 2016. The Undoing Project: A Friendship That Changed Our Minds. New York: Norton.
Lindstromberg, Seth. 2016. “Inferential Statistics in Language Teaching Research: A Review and Ways Forward.” Language Teaching Research 20 (6): 741-768.
Mackey, Alison, and Susan M. Gass. 2005. Second Language Research: Methodology and Design. Mahwah, NJ: Lawrence Erlbaum Associates.
Miles, Matthew B., and A. Michael Huberman. 1994. Qualitative Data Analysis: An Expanded Sourcebook. Thousand Oaks, CA: Sage.
Richards, Keith, Stephen J. Ross, and Paul Seedhouse. 2011. Research Methods for Applied Language Studies. London: Routledge.
Seliger, Herbert W., and Elana Shohamy. 1989. Second Language Research Methods. Oxford: Oxford University Press.
Stapleton, Paul, and Qing Shao. 2018. “A Worldwide Survey of MATESOL Programs in 2014: Patterns and Perspectives.” Language Teaching Research 22 (1): 10-28.
Tversky, Amos, and Daniel Kahneman. 1974. “Judgment under Uncertainty: Heuristics and Biases.” Science 185 (4157): 1124-1131.