MIXED METHODS IN EDUCATIONAL PSYCHOLOGY INQUIRY
Matthew T. McCrudden1
Gwen Marchand2
Paul Schutz3
1Pennsylvania State University
2University of Nevada, Las Vegas
3University of Texas at San Antonio
Corresponding author:
Matthew T. McCrudden
Pennsylvania State University
College of Education
Educational Psychology, Counseling, and Special Education
333 CEDAR Building
University Park, PA 16802
Office phone: 1-814-863-7536
E-mail: mtm402@psu.edu; mattmccrudden@hotmail.com
Gwen Marchand
University of Nevada, Las Vegas
College of Education
Educational Psychology and Higher Education
4505 S. Maryland Parkway
Box 453003
Las Vegas, NV 89154-3003
Office phone: 1-702-895-4303
E-mail: gwen.marchand@unlv.edu
Paul A. Schutz
UTSA, Department of Educational Psychology
501 César E. Chávez Blvd., DB 4.330
San Antonio, Texas 78207-4415
Email: paul.schutz@utsa.edu
Phone: 1-210-458-2612
Abstract
Mixed methods research has the potential to advance theory and enhance the usefulness of
research findings. However, the success of a mixed methods research inquiry is tied to how well
researchers integrate the quantitative and qualitative strands, and to how well researchers address
the standards for quality in quantitative, qualitative, and mixed methods. In this introduction
article, we define mixed methods research and discuss what mixed methods research can offer to
the field of educational psychology. Then we consider what constitutes integration and rigor in
mixed methods research and describe three core mixed methods research designs. Following this
overview, we briefly introduce each article in this special issue, along with the commentary by
Vicki Plano Clark. We also discuss how the use of mixed methods can help address common
educational problems, including: (a) identifying and exploring socially-situated and
contextualized learning processes; (b) providing insights into differences across individuals with
respect to educational outcomes; and, (c) building instruments that reflect the experiences of
individuals who will be assessed by these instruments. Finally, we close with thoughts on the
future of mixed methods research.
There has been greater acceptance and use of mixed methods research since the turn of
the century (Burch & Heinrich, 2015; Creswell, 2010). One reason for this shift is the need for
different approaches to investigate complex educational and social issues (DeCuir-Gunby &
Schutz, 2017; Ivankova & Kawamura, 2010). Another reason is that major funding agencies,
such as the National Science Foundation and the Institute of Education Sciences, have begun
encouraging researchers to use mixed methods research rather than single-method
approaches (Mixed Methods in Education Research IES Technical Working Group, 2015; Plano
Clark, 2010). A third reason is that research communities have helped to establish mixed
methods approaches as an acceptable and scientifically-legitimate approach to inquiry (Biddle &
Schafft, 2015; Creswell & Plano Clark, 2018; Teddlie & Tashakkori, 2009). Mixed methods
inquiry is a relatively new approach to conducting research in educational psychology compared
to quantitative and even qualitative approaches. Thus, it is important for researchers who conduct
or evaluate mixed methods research (e.g., members of doctoral committees, journal manuscript
reviewers, or funding agencies) to understand what characterizes high-quality mixed methods
inquiry (DeCuir-Gunby & Schutz, 2017).
Our primary goal for this special issue was to support the development of mixed methods
research as an approach to inquiry in educational psychology by showcasing high-quality mixed
methods research studies conducted by educational psychologists across a range of topics. We
asked contributors to this special issue to describe some of the key features of their research
inquiries so the rationale for various decisions related to their respective study designs and the
steps taken to ensure rigor were stated explicitly in the articles. By encouraging authors to
describe what decisions they made and actions they took, as well as why, we aimed to promote
the transparency of the research process for readers. Further, we asked authors to provide
procedural diagrams (Creswell & Plano Clark, 2018; Ivankova, Creswell, & Stick, 2006) to show
when the different methods were used and the points of integration between them.
This special issue introduction article consists of seven main sections. First, we define
mixed methods research. Second, we consider broadly what mixed methods research can offer to
educational psychology. In the third and fourth sections we describe the key foci of the
special issue (i.e., integration and rigor). Fifth, we describe three core mixed methods research
designs and discuss each article with respect to design, rigor, and the value-added contributions.
Sixth, we briefly identify common problem spaces in educational psychology and how mixed
methods may help researchers address these challenges. Finally, we close with observations
about the direction of the field in terms of the use of mixed methods research and a cautionary
note.
Defining Mixed Methods Research
Mixed methods research can be defined as “research in which the investigator collects
and analyzes data, integrates the findings, and draws inferences using both quantitative and
qualitative approaches” (Tashakkori & Creswell, 2007, p. 4). Mixed methods research differs
from multiple methods research, in which an investigator uses two or more methods from the same
methodological tradition (i.e., more than one quantitative approach, or more than one qualitative
approach, in a single study). In mixed methods research, an investigator combines at least one
quantitative method and one qualitative method in a way that potentially maximizes the strengths
and minimizes weaknesses of each respective method. For instance, survey data can be collected
from a large number of participants in a relatively short time frame (potential strength) but may
provide limited insights into reasons underlying individuals’ responses (potential weakness).
Interviews can be conducted with a sample of participants who can provide in-depth descriptions
about a phenomenon of interest (potential strength); however, data collection and analysis can be
time-intensive and involves a smaller number of participants (potential weakness). The intent is
to triangulate the data sets using both research methods traditions to offset potential limitations
or bias introduced within each respective tradition (Creamer, 2018). This, in turn, can enhance
the usefulness and interpretability of the findings.
Two particularly relevant characteristics of mixed methods research are methodological
eclecticism and paradigm pluralism (Teddlie & Tashakkori, 2012). Methodological eclecticism
means that researchers knowledgably select, use, and integrate the most appropriate methods
from a wide variety of quantitative, qualitative, and mixed approaches to thoroughly investigate
the phenomena of interest. This contrasts with the incompatibility thesis, which “posits that
qualitative and quantitative research paradigms, including their associated methods, cannot and
should not be mixed” (Johnson & Onwuegbuzie, 2004, p. 14). Paradigm pluralism refers to “the
belief that a variety of paradigms may serve as the underlying philosophy for the use of mixed
methods” (Teddlie & Tashakkori, 2012, p. 779).
Although there are multiple worldviews available to guide mixed methods research,
pragmatism tends to be the overarching philosophy espoused by the majority of mixed methods
scholars (Lincoln, Lynham, & Guba, 2011; Tashakkori & Teddlie, 2003; Teddlie & Tashakkori,
2012). The focus tends to be on the usefulness or consequences of research, which includes the
role of ethics and values in the context of community and the social good (Maxcy, 2003); the
importance of the research question rather than the method; and the use of a number of methods
to address research topics (Creswell & Plano Clark, 2018).
The Value of Mixed Methods Research in Educational Psychology
There are substantial demands in terms of effort, time, and expertise associated with
designing and conducting a rigorous mixed methods study (Creswell & Plano Clark, 2018;
McKim, 2017). However, there are a number of general benefits to using mixed methods,
including: (a) gaining a deeper and broader understanding of the phenomenon (Hurmerinta-
Peltomäki & Nummela, 2006), (b) providing readers greater confidence in the findings and
conclusions drawn from the study (O’Cathain, Murphy, & Nicholl, 2010), and (c) enabling
readers to more easily comprehend the significance of a study’s findings and grasp the meaning
of complex phenomena (McKim, 2017; Onwuegbuzie & Leech, 2004). Our work on this special
issue was guided in part by a desire to leverage these general benefits and to stimulate
conversation amongst educational psychologists about the value of mixed methods research to
the investigation of phenomena in our field. We approached this special issue with the following
questions: (1) What do mixed methods approaches offer to educational psychology researchers?
(2) Why do we believe that educational psychology training programs and researchers should
invest in the development of expertise in mixed methods research?
In educational psychology, researchers typically adopt a positivist or postpositivist
methodology (general inquiry worldview) and methods (strategies and procedures for conducting
research) that are quantitative in nature (see Lincoln, Lynham, & Guba, 2011, for a discussion). For
instance, researchers have predominantly used quantitative data collection methods and analyses
aligned with the general linear model (Kaplan, Katz, & Flum, 2012; Koopmans, 2014). The
dominance of a single method for conducting research in a field of study can be problematic for
a number of reasons, most notably the potential restriction of knowledge that can result in a less
comprehensive understanding of phenomena under study. Rather than designing studies and
choosing methods on the basis of theoretical propositions and related research questions,
researchers may adhere to preferred designs and methods within their communities of practice or
select methods with which they have the greatest experience and training (Hilpert & Marchand,
2018; Kaplan et al., 2012).
While considering research trends in educational psychology, Dumas and colleagues
(2015) noted that advances in statistical modeling not only allow researchers to test existing
theoretical models more comprehensively, but also enable researchers to develop those
theoretical models. Similarly, mixed methods research can potentially enable researchers to
develop and test theories in educational psychology within a range of contexts and with a variety
of populations to yield new knowledge that is relevant to practitioners, interdisciplinary scholars,
and emerging researchers in our own field. An increase in the use of mixed methods approaches
to inquiry in educational psychology may lead to an expansion of research questions that
address underexplored aspects of theoretical models which have not been easily investigated by
our traditional, more familiar approaches (Rozin, 2001, 2009). For instance, researchers may use
mixed methods approaches to identify if, when, and why the experiences of individuals may
diverge from overall patterns, leading in some cases to new areas for theoretical exploration
(e.g., Butz & Usher, 2015).
Further, engagement in mixed methods research may offer opportunities for scholars to
reflect more critically on the research process itself. Engaging in mixed methods research
requires knowledge of quantitative, qualitative, and mixed methods and careful evaluation of
choices, such as sampling and processes involved in data collection and analysis (e.g., Bergman,
2011; Greene, 2008) at multiple points in the inquiry. This critical reflection may contribute to
rigor in the field as we think more carefully about the assumptions underlying our research
designs and methods and how these align with our theoretical assumptions (DeCuir-Gunby &
Schutz, 2014; Hilpert & Marchand, 2018; Kaplan et al., 2012). Researchers may gain greater
critical awareness of the role that their worldviews play in the research process. An additional
benefit is that increasing and expanding our research methods expertise can allow us to have
more meaningful conversations with colleagues in other domains in education, particularly with
those who tend to use qualitative or mixed methods. Conversations with researchers in other
fields investigating similar topics but with different research methods may spark interdisciplinary
partnerships. These are more likely to be fruitful when common ground with respect to research
methods is identified.
Integration
A high-quality mixed methods research study consists of more than just the use of
quantitative and qualitative strands in the same study. A defining feature of mixed methods
research is that the researcher integrates the quantitative and qualitative strands. Integration
occurs when an investigator intentionally combines quantitative and qualitative approaches in a
study such that their combination provides a more comprehensive understanding of the topic
(Fetters & Molina-Azorin, 2017). Thus, when and how the two strands are integrated plays an
essential role in establishing the quality of the study design, and ultimately of the quality of the
inferences and conclusions drawn from the study.
The term yield refers to the insights that a mixed methods study can provide that would
not be possible from a quantitative study or a qualitative study alone (O’Cathain, Murphy, &
Nicholl, 2007). As such, the integration of the quantitative and qualitative strands in a study
influences the yield. Integration can occur at one stage, or multiple stages. Fully-integrated
mixed methods research is “an approach to mixed methods research where there is the intention
to mix or integrate the qualitative and quantitative strands of study throughout each of the stages
or phases of the research process” (Creamer, 2018, p. 12). Conversely, the use of quantitative
and qualitative strands in a study that are not integrated at all provides limited or no yield. Thus,
integration in mixed methods involves more than just the presence of quantitative and qualitative
data; it matters when and how the quantitative and qualitative strands are integrated. Later in this
article we explain how the authors in this special issue achieved integration in their studies to
provide greater insights about their topics.
Despite consensus about the importance of integration in mixed methods research, there
are a variety of views about how researchers can achieve integration in a mixed methods
research inquiry (Bryman, 2006; Greene & Caracelli, 1997; Creamer, 2018; Fetters, Curry, &
Creswell, 2013; McCrudden & McTigue, 2018; O’Cathain, Murphy, & Nicholl, 2007; Woolley,
2009; Yin, 2006). These discussions reflect the development of mixed methods as a relatively
new research paradigm and the generative nature of the paradigm for addressing questions in
dynamic, complex, and interdisciplinary research contexts. A comparison of the different
approaches to integration is beyond the scope of this introduction (see Creamer, 2018; Creswell
& Plano Clark, 2018; Fetters et al., 2013; Greene, 2007; and Teddlie & Tashakkori, 2009 for
some different approaches). However, in this special issue, we encouraged the contributing
authors to explicitly indicate how they achieved integration in their respective studies.
Rigor
The success of a mixed methods research inquiry is tied to how well researchers meet the
standards for quality in quantitative, qualitative, and mixed methods approaches. Research
methods scholars have developed strategies and guidelines for ensuring rigor in the research
process. Thus, researchers interested in conducting quality mixed methods research must be
familiar with the strategies and guidelines used in all three general approaches.
Issues related to internal and external validity (Benson, 1998; Shadish, Cook, &
Campbell, 2002) play a prominent role in quantitative study designs and inferences about results.
When evaluating the rigor of a quantitative study, we make judgments about the evidence
provided for reliability and for validity. This type of evidence is critical to our interpretation of
the results from a quantitative study. For instance, when using a survey, coefficient alpha can be
used to determine whether survey items in a scale are internally consistent and confirmatory
factor analysis can be used to determine whether the scale actually measures the construct the
researcher claims it measures. The results are problematic if the scale does not measure a
particular construct in a particular context.
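To make the reliability check concrete, the following is a minimal sketch of coefficient (Cronbach’s) alpha computed by hand in Python. The response matrix is hypothetical, and the .70 cutoff noted in the comments is a common rule of thumb rather than a fixed standard; in practice the statistic would usually be obtained from standard statistical software.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a (respondents x items) matrix of scale scores."""
    k = items.shape[1]                         # number of items in the scale
    item_vars = items.var(axis=0, ddof=1)      # sample variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical 5-point Likert responses: 6 respondents x 4 survey items
responses = np.array([
    [4, 5, 4, 5],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 2],
    [4, 4, 5, 4],
    [3, 2, 3, 3],
])

# Roughly .93 for this toy matrix; values at or above .70 are conventionally
# taken as evidence that the items are internally consistent
print(f"alpha = {cronbach_alpha(responses):.2f}")
```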
Similarly, qualitative researchers have developed several ways to establish rigor. One
way is through the researcher positionality statement, which is the researchers’ description of
their views on the topic under study (e.g., how they became interested in the topic, why it is
important to them). This statement is important in both qualitative and quantitative approaches
because researcher perspective plays an influential role in all phases of the investigation (e.g.,
research questions, participant selection, data analysis). This statement helps the reader
understand the findings and highlights the researchers’ reflexivity (i.e., self-awareness of how
one’s thoughts and views can affect the research process). Further, there are a number of ways
researchers can provide evidence for the credibility/trustworthiness of their findings, such as
evidence of prolonged engagement with the participants, whereby the goal is to provide a thick,
rich description that reflects the complexities of an experience that participants describe to the
researcher. The thick description is also enhanced with member checks, where the researcher
revisits the participants with the data to get the participants’ thoughts on what the researcher is
finding (however, see Morse, 1998). Finally, audit trails (i.e., in-depth descriptions of the steps
the researcher has taken), peer debriefing (i.e., dialoguing about results with colleagues and
experts in the field), and triangulation (i.e., the integration of different methods or data sources)
can provide evidence for the credibility/trustworthiness of the findings (Levitt, Bamberg,
Creswell, Frost, Josselson, & Suarez-Orozco, 2018; Morse, 2015).
Mixed methods researchers have also developed criteria for evaluating the quality of
mixed methods research studies (Creamer, 2018; Levitt et al., 2018; O’Cathain, 2010; O'Cathain,
Murphy, & Nicholl, 2008). For instance, researchers synthesize the literature, formulate research
questions, and articulate a clear rationale/justification for implementing a mixed methods design
and the accompanying research methods to address those questions. In addition, researchers
need to implement both the quantitative and qualitative methods with the previously established
rigor for both approaches. As might be expected, the usefulness of the findings from a mixed
method study can be compromised if either the quantitative or qualitative data collection and
analysis lack sufficient rigor. Usefulness may also be compromised if a mixed methods study
lacks integration or a clear explanation of integration approaches. Therefore, it is important that
the integration of the approaches is appropriate and presented with sufficient detail so the reader
can see how the findings fit together and tie back to the research questions. Finally, any
inferences that emerge from the study should be connected back to the purpose of the study, the
research questions, and findings on which the inferences were based.
Research Designs
Researchers can use mixed methods research designs when they pose quantitative and
qualitative research questions in the same study, or when a research question contains elements
of both (DeCuir-Gunby & Schutz, 2017). There are three core mixed methods research designs:
convergent, explanatory sequential, and exploratory sequential (Creswell & Plano Clark, 2018;
Plano Clark & Ivankova, 2016), each of which was utilized by at least one set of authors in this
special issue. Table 1 provides a general overview of the three core mixed methods research designs
and Figure 1 depicts these three core designs. These three core designs differ with respect to the
timing of the data collection and the timing of the data analyses. With respect to data collection,
data for the quantitative and qualitative strands can be collected (nearly) concurrently or
sequentially. With respect to data analysis, the implementation of the second strand is either
independent from or dependent upon the data analysis from the first strand. We organized the
articles in this section based on their use of the three core mixed methods research designs.
Convergent Design
The first core design is a convergent design (also referred to as a concurrent or parallel
design). With this type of design, the data for the quantitative and qualitative strands are
collected in approximately the same timeframe, the data for both strands are analyzed
independently, and then the data from both strands are integrated during interpretation to identify
possible sources of convergence or divergence (Figure 1a). The implementation of neither strand
is contingent upon the data analysis of the other strand. The intent of integrating the two strands
is often to generate interpretations that extend the breadth and range of the inquiry and/or seek
corroboration of results from the two strands (Creswell & Plano Clark, 2018). Three of the
articles in the special issue (Schmidt, Kafkas, Maier, Shumow, & Kackar-Cam, this issue; Usher,
Ford, Lee, & Weidner, this issue; White, DeCuir-Gunby, & Kim, this issue) used a convergent
design, although they achieved integration in different ways.
Schmidt et al. (this issue; Figure 1) used a convergent mixed methods study design to
investigate teachers’ use of instructional strategies that are meant to promote students’ beliefs
about the value or usefulness of course content beyond the immediate instructional context (i.e.,
relevance). In the qualitative strand, they conducted classroom observations over a seven-week
period and teacher interviews at the completion of the observation period to evaluate
teachers’ perceptions and communication of relevance during class, which provided evidence of
prolonged engagement with the participants. For the classroom observations, observers recorded
activities and teacher-student interactions to identify instances in which teachers explicitly talked
about why science content mattered. These qualitative data were then coded and transformed into
frequencies (quantitized; Sandelowski, Voils, & Knafl, 2009). The researchers conducted the
teacher interviews to gain insights into teachers’ perspectives about how they communicate the
value of science in and out of the classroom. In the quantitative strand, they measured students’
perceptions of science utility on an end-of-class questionnaire item periodically over the seven-
week time frame in which they conducted the classroom observations. Student perceptions were
measured following class meetings in which observations took place. Thus, they achieved
integration through data transformation, which refers to converting one data type into the other
type (e.g., quantitizing qualitative data) and integrating it with the data that have not been
transformed; this triangulation provides evidence for the credibility/trustworthiness of the
results.
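As a concrete illustration of this kind of data transformation, the short sketch below quantitizes coded qualitative data into frequencies that can then be set alongside a quantitative measure from the same class meeting. The code labels and ratings are invented for the example and are not drawn from Schmidt et al.’s data.

```python
from collections import Counter

# Hypothetical qualitative codes assigned to classroom-observation excerpts
observation_codes = [
    "relevance_talk", "procedural_talk", "relevance_talk",
    "relevance_talk", "management_talk", "procedural_talk",
]

# Quantitizing: convert the coded qualitative data into frequencies
code_frequencies = Counter(observation_codes)

# The frequencies can then be integrated with quantitative data, e.g., the mean
# of students' utility-value ratings from the same class meeting
utility_ratings = [4, 5, 3, 4, 4]  # hypothetical 5-point questionnaire item
mean_utility = sum(utility_ratings) / len(utility_ratings)

print(f"relevance talk: {code_frequencies['relevance_talk']} instances, "
      f"mean utility rating: {mean_utility:.2f}")
```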
Integrating the two strands enhanced their study by enabling the researchers to evaluate
overlapping yet different facets of relevance. Specifically, the interview data enabled the
researchers to gain insights into teacher beliefs about science utility and their views about how
they communicated science utility to their students, whereas the classroom observation data were
used to describe actual teacher behaviors. Further, these behaviors could be considered in light of
student survey data, developed from existing measures, to evaluate the extent to which teacher
behaviors were related to student perceptions of science utility. Scale reliability was reported as
internal consistency. Thus, the convergent design made it possible to investigate teacher-student
interactions related to the relevance of science content, provide teachers’ views of their practices,
and gather student perceptions of science in light of these teaching practices, thereby providing
mixed methods rigor.
Usher et al. (this issue; Figure 1a) used a convergent mixed methods study design to
investigate sources of self-efficacy beliefs for mathematics and science, and self-efficacy beliefs
regarding mathematics and science for middle and high school students in a rural, high-poverty
area in Appalachia. In the quantitative strand, students completed sources of self-efficacy belief
scales for mathematics and science, and self-efficacy belief scales for mathematics and science.
The authors used confirmatory factor analysis to examine the internal structure of the sources of
self-efficacy belief scales in mathematics and science and reported coefficient alphas. The
quantitative results from structural equation models (SEM) indicated that mastery experience and
physiological state both predicted mathematics self-efficacy, whereas only mastery experience
predicted science self-efficacy.
Somewhat surprisingly, neither vicarious experience nor social persuasion predicted self-
efficacy for mathematics or science. Further, physiological state did not predict science self-
efficacy. Participants’ responses to open-ended questions about what raised and lowered their
self-efficacy in mathematics and science contextualized the quantitative findings. The
researchers used data conversion or transformation with the qualitative data. Specifically, they
converted the qualitative codes to quantitative data (quantitized), which enabled them to identify
convergence and divergence between the two data sets when they interpreted the findings
(Sandelowski et al., 2009). The qualitative findings corroborated the results from the quantitative
strand regarding significant predictors of self-efficacy. However, by including participant voices,
the qualitative findings depicted a broader view of mastery experience in which both direct
experience (i.e., one’s past experiences in performing tasks) and performance evaluation (e.g.,
feedback about one’s performance on a task) were prevalent sources of students’ self-efficacy
beliefs for mathematics and science.
Integrating the two strands enhanced the study by enabling the researchers to identify
convergence and divergence between the data from the closed-ended questionnaire items and the
open-ended interview prompts. Further, by integrating and interpreting the quantitative and
qualitative findings, the researchers were able to identify patterns that were not apparent in the
quantitative data alone. For instance, although neither vicarious experience nor social persuasion
individually predicted self-efficacy in the SEM models, students incorporated these sources with
other sources in their responses to the open-ended questions. Thus, the convergent design made it
possible to investigate what types of experiences support and undermine student self-efficacy,
how students use information to judge what they can do, and to identify divergent results
between each method.
White et al. (this issue; Figure 1) used a convergent mixed methods study design to
investigate the relations among racial identity, science identity, science self-efficacy beliefs, and
science achievement for African American students at historically Black colleges and
universities (HBCUs). In the quantitative strand, 347 students completed questionnaires
pertaining to their racial identity, science identity, and science self-efficacy and self-reported
their science college grade-point average. The quantitative results from the path analysis
indicated a significant positive relation between science identity and science self-efficacy, and
that science self-efficacy mediated the indirect relation between science identity and college
science achievement. Similarly, science self-efficacy mediated a marginal relation between racial
identity and college science achievement.
In the qualitative strand, the researchers used a Critical Race Theory lens and conducted
individual interviews with 14 African American science students who had participated in the
quantitative strand. The interview protocol elicited participants’ views about the influence of the
HBCU environment on the constructs measured in the quantitative strand (achieving integration
through building; Fetters et al., 2013), with a focus on their pre-college experiences with science
and more recent experiences that had shaped their science and racial identities. When the
researchers merged the two data sets, they found the qualitative findings corroborated the results
from the quantitative strand. Students indicated that receiving recognition as African American
scientists was very important to them, and recognition is one of the most salient aspects of an
individual’s science identity. However, science identity is much more domain general, whereas
self-efficacy is more domain or even task specific. Thus, it is possible that students who have a
stronger domain-general science identity may have different levels of science self-efficacy based
on the specific domain, which may be related to achievement in their science classes.
Integrating the two strands enhanced their study by enabling the researchers to identify
convergence between the closed-ended questionnaire items and the open-ended interview
prompts. Further, by integrating the quantitative and qualitative findings, the researchers were
able to corroborate and explain patterns from the path analysis. Thus, the convergent design
made it possible to investigate similarities and differences between students’ identities and their
science self-efficacy, and their relations to science achievement.
Explanatory Sequential Design
The second core design is an explanatory sequential design. With this type of design, the
data for the quantitative strand are collected and analyzed, followed by the collection and
analysis of the data for the qualitative strand (Figure 1b). Importantly, the quantitative strand
informs the sampling procedure for the subsequent qualitative strand because the qualitative data
are used to explain some finding from the quantitative strand. Then, the data for both strands are
brought together and the qualitative strand is used to explain or illuminate a particular finding
from the quantitative strand. Thus, the data collection and analysis for the quantitative strand
precede the data collection for the qualitative strand, and the quantitative and qualitative strands
are dependent; the implementation of the qualitative strand is contingent upon the data analysis
from the quantitative strand. The intent of integrating the two strands is often to use the
qualitative strand to elaborate, enhance, or explain some finding of interest from the quantitative
strand (Creswell & Plano Clark, 2018). One article (Matthews & López, this issue) in the special
issue used an explanatory sequential design.
Matthews and López (this issue; Figure 2) used an explanatory sequential design to
investigate the relations among teacher beliefs, teacher behaviors that affirm students’ ethnicity
and culture, and mathematics achievement for Latino children in primary school. In the
quantitative strand, the teachers completed survey items about their critical awareness (i.e.,
knowledge about teaching historically marginalized students), expectations for student success,
and their use of asset-based pedagogy (i.e., cultural content integration [CCI] and use of the
Spanish language in their classroom instruction). For the reliability and validity of the CCI, the
researchers reported internal consistency and confirmatory factor analysis evidence for the scale.
Next, the researchers used multi-path models to identify predictors of student achievement. The
quantitative results indicated that teacher expectations, but not use of asset-based pedagogy,
directly predicted student mathematics achievement, whereas teachers’ critical awareness
indirectly predicted mathematics achievement via teacher-reported use of asset-based pedagogy.
To further explain these findings, the researchers used the survey data to identify teachers
for follow-up interviews. Specifically, they purposefully sampled and interviewed teachers who
had the highest scores on the survey items that measured critical awareness and expectations for
student success. Thus, they integrated the quantitative and qualitative strands through sampling
(specifically, extreme-case sampling), a form of integration known as connecting (Fetters et al.,
2013). Further, the researchers used the survey items from the quantitative strand to develop the
interview protocol for the qualitative strand. Thus, they integrated the data collection procedures
from both strands, a form of integration known as building (Fetters et al., 2013). The interview
data enabled the researchers to provide an in-depth description of the teachers’ beliefs and how
they affirmed their students’ ethnicity and culture in the classroom and curriculum. In the
qualitative analysis phase, rigor was enhanced by using a constant comparative approach to
ensure that the voices of the participants informed the developing theory. As such, the qualitative findings
indicated a key difference in teacher goals for using asset-based pedagogies. Some teachers
described using asset-based pedagogies to realize socio-engagement goals (i.e., building
community and promoting equity and awareness of cultures), whereas other teachers described
using asset-based pedagogies to realize academic goals (i.e., leveraging students’ funds of
knowledge for academic learning). Further, teachers who espoused academic goals conveyed a
deeper understanding of cultural marginalization.
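The mechanics of connecting through extreme-case sampling can be sketched as follows; the teacher identifiers, scores, and the composite used for ranking are hypothetical and simply illustrate selecting the highest-scoring survey respondents for qualitative follow-up.

```python
# Hypothetical teacher-level scale scores from the quantitative strand
survey_scores = {
    "T01": {"critical_awareness": 4.8, "expectations": 4.9},
    "T02": {"critical_awareness": 3.1, "expectations": 3.4},
    "T03": {"critical_awareness": 4.6, "expectations": 4.7},
    "T04": {"critical_awareness": 2.9, "expectations": 4.0},
}

def composite(scores: dict) -> float:
    """Average the two scale scores into a single ranking criterion."""
    return (scores["critical_awareness"] + scores["expectations"]) / 2

# Connecting: rank teachers by composite score and invite the extreme
# (highest-scoring) cases to the qualitative interviews
ranked = sorted(survey_scores, key=lambda t: composite(survey_scores[t]), reverse=True)
interviewees = ranked[:2]
print(interviewees)  # ['T01', 'T03']
```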
Integrating the two strands enhanced their study in two main ways. First, including the
qualitative strand made it possible to potentially explain why neither teacher critical awareness
nor cultural content integration directly predicted student mathematics achievement. Second, the
interview data indicated that despite having similar quantitative profiles, teachers who had high
self-reported critical awareness and expectations espoused different goals, which influenced their
classroom practices and engagement with students. The researchers concluded that both critical
awareness and high expectations in concert were predictive of the implementation of culturally
responsive teaching that leads to growth in student learning. Thus, the explanatory design enabled
the researchers to provide a more comprehensive understanding of the topic than would have
been possible with just a quantitative approach.
Exploratory Sequential Design
The third core design is an exploratory sequential design. With this type of design, the
data for the qualitative strand are collected and analyzed, followed by the collection and analysis
of the data for the quantitative strand (Figure 1c). Importantly, the qualitative strand informs the
data collection for the subsequent quantitative strand. Then, the data for both strands are brought
together to evaluate the generalizability of the initial qualitative findings. Thus, data collection
for the qualitative strand precedes data collection for the quantitative strand, and data collection
for the quantitative strand is dependent upon the data analysis from the qualitative strand. The
intent of integrating the two strands is often to use the qualitative phase to create or build a
follow-up quantitative instrument or intervention (Creswell & Plano Clark, 2018). One article in
the special issue (Kumar et al., this issue) used an exploratory sequential design.
Kumar et al. (this issue; Figure 1) used an exploratory sequential design to investigate
features of culturally responsive learning environments across 12 middle schools in two
geographically-close school districts. In the qualitative strand, they conducted 57 focus-group
interviews with 333 students from different cultural backgrounds about their interactions with
others and their experiences in their middle schools, thus providing ample evidence of
prolonged engagement with the participants. They identified four general themes from the
interview data about student perceptions of cultural responsiveness in their schools: (a)
perceptions of teachers as respectful/prejudiced and culturally responsive/insensitive, (b)
culturally responsive and inclusive curriculum, (c) intergroup relationships, and (d) school
policies and practices. Based on these data, they developed items for a questionnaire. Thus, the
data collection procedure from the qualitative strand informed the data collection procedure for
the quantitative strand, a form of integration known as building (Fetters et al.,
2013). Specifically, themes from interview data were used to generate survey items to be
evaluated in the quantitative strand.
In the follow-up quantitative strand, a different sample of students (n = 2,894) whose
backgrounds mirrored those of the students who were interviewed completed the questionnaire. To
provide reliability and validity evidence, the researchers reported internal consistency for the
three scales following a confirmatory factor analysis: (a) promoting cultural openness and
positive intergroup relations, (b) providing culturally inclusive and responsive curriculum, and
(c) establishing culturally responsive school practices and policies.
Integration of the two strands enhanced their study by providing evidence of the
generalizability of the qualitative findings to a large sample. The qualitative strand enabled the
researchers to identify features of culturally inclusive and responsive curricular learning
environments (CIRCLEs) based on student focus group interviews and to develop questionnaire
items. In the quantitative strand, they tested the applicability and psychometric generalizability
of the CIRCLE questionnaire to a large sample. Thus, the exploratory sequential design enabled
researchers to develop and provide validity evidence for an instrument to measure the features of
CIRCLEs.
Commentary
We close this special issue with a commentary by Vicki Plano Clark (this issue), a
specialist in mixed methods research. Plano Clark’s work focuses on designs for conducting
mixed methods research, examining procedural aspects of these designs, and examining broader
questions about contexts for the adoption and use of mixed methods. Her work has been at the
forefront of mixed methods research. For example, in 2011, she co-led the development of Best
Practices for Mixed Methods in the Health Sciences for NIH's Office of Behavioral and Social
Sciences Research and she is a founding co-editor of SAGE’s Mixed Methods Research Series.
Plano Clark situates each of the empirical articles in a core research design, describes the
importance of integration to mixed methods research, and discusses four strategies the authors
use to integrate the quantitative and qualitative approaches in their studies.
Addressing Challenges in Educational Psychology
During the submission process, we asked authors to consider the question of what could
be learned about their topics of study by using mixed methods designs that could not be learned
from mono-method approaches. In reading across studies, we also noted the value-added feature
of mixed methods with respect to common problem spaces in research in educational psychology
in three specific areas: (a) to identify and explore socially-situated and contextualized learning
processes; (b) to provide insights into differences across individuals with respect to educational
outcomes such as learning and motivation; and (c) to build instruments that reflect the
experiences of individuals who will be assessed by these instruments. In the following
paragraphs we elaborate on how the use of mixed methods approaches may be helpful for
expanding the research findings in each of these spaces.
Educational psychologists have called for research that expands our understanding of the
socially-situated or contextualized nature of learning processes (e.g., Nolen, Horn, & Ward,
2015; Schutz, 2014). Mixed methods research offers promise to further our knowledge about the
socially-embedded nature of learning. For instance, researchers can use large-scale multilevel
modeling to investigate the influence of individual- and group-level variables (and their
interactions) on different outcomes of interest (Raudenbush & Bryk, 2002). As a complement to
such quantitative approaches, qualitative research can be instrumental in gaining insights into
how individuals, such as students, make meaning of their social context and how the explicit and
implicit messages students receive from relational partners, such as teachers or peers, and school
structures influence their learning experiences (e.g., Gray et al., 2018; Kurtz-Costes & Woods,
2017). Both approaches offer value in considering interactions amongst learners and their
environments. However, researchers who use both approaches in a study may gain a deeper and
broader understanding of the role of individual- and group-level variables on student outcomes
and have greater confidence in their findings and conclusions. Articles in this special issue
integrated general quantitative patterns and in-depth narrative responses from participants to
reveal how specific environmental and social cues and supports created unique experiences for
learners. Importantly, all articles in the special issue investigated and discussed contextual
aspects of learning that were afforded by their use of mixed methods research.
A related issue is that while there are many studies in the educational psychology
literature that conduct analyses to look for differences across groups (e.g., poverty level, gender,
race), there is limited research exploring whether these are appropriate distinctions to make or
reasons why groups of individuals may systematically respond differently to measures of
outcomes (DeCuir-Gunby & Schutz, 2014). In other words, when researchers control for group
differences or test for moderation effects across groups, there is rarely any meaningful
investigation into reasons or rationales for these findings from the perspectives of individuals.
Researchers may draw upon theory to explain their findings, but often these theoretical
explanations remain empirically underexplored. Articles in this special issue illustrate how
mixed methods can be used to explore distinct experiences that contribute to systematic variation
across and within groups, particularly by including the voices of the participants.
Finally, mixed methods also holds promise for addressing measurement issues in
educational psychology, particularly related to capturing subjective experiences (Benson, 1998).
Rather than using existing theory and research experts to generate survey items, mixed methods
approaches can be used to incorporate the experiences of potentially relevant populations during
instrument development. Such an approach may be particularly important when researchers
investigate constructs that are race-focused (DeCuir-Gunby & Schutz, 2014) or when they
administer an instrument to a widely diverse group of participants. Instruments developed and
normed with homogeneous groups reflect the values, experiences, and beliefs of that group. In
educational psychology, many commonly used instruments have been validated using white,
middle-class samples; thus, many of the instruments reflect the values, experiences, and beliefs
of these samples. However, the use of these instruments in research with diverse samples is
susceptible to bias in favor of participants who are similar to the original norming sample.
Researchers in educational psychology have attempted to redress this problem by testing
for measurement invariance across groups to rule out measurement equivalence issues in
survey responses (Schwartz et al., 2014). However, even this approach does not address the issue
that the survey items themselves may not capture the heterogeneous experiences of the individuals
in the participating sample. When mixed methods approaches are used to generate items
inclusive of the experiences of the intended population, the instrument can more authentically
reflect student experiences, increasing the credibility of findings that result from the use of the
instrument. This special issue offers a helpful example of how instruments can be developed or
refined to ensure that measures reflect diverse cultural experiences (Kumar et al., this issue) and
conversely, that conclusions drawn from the use of these measures are reflective of substantive
findings and not an artifact of measurement problems.
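For readers unfamiliar with the invariance testing mentioned above, the sketch below shows the chi-square difference test commonly used to compare a configural multi-group model (loadings free across groups) with a metric model (loadings constrained equal). The fit statistics are invented for illustration; the models themselves would be estimated in dedicated SEM software.

```python
from scipy import stats

# Hypothetical fit statistics from a multi-group confirmatory factor analysis
configural_chi2, configural_df = 412.3, 158  # loadings free across groups
metric_chi2, metric_df = 431.8, 170          # loadings constrained equal

# A non-significant chi-square difference suggests the constrained model does
# not fit significantly worse, supporting metric invariance across groups
delta_chi2 = metric_chi2 - configural_chi2
delta_df = metric_df - configural_df
p_value = stats.chi2.sf(delta_chi2, delta_df)

print(f"delta chi2 = {delta_chi2:.1f}, delta df = {delta_df}, p = {p_value:.3f}")
```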
Conclusion and the Future of Mixed Methods in Educational Psychology
As indicated, our primary goal for this special issue was to support the development of
mixed methods research in educational psychology. To realize this goal, we sought to showcase
high-quality mixed methods research studies conducted by educational psychology researchers
across a range of topics. We believe this special issue has accomplished that goal by presenting
articles that demonstrate rigor, integration of methods, and results that may not have emerged
from single method approaches. As such, we think the use of mixed methods by educational
psychologists has an important future yet acknowledge that mixed methods inquiry is not a
panacea.
However, it is also important to acknowledge that using a mixed methods approach is not
for the “faint of heart.” First, mixed methods research requires knowledge and skills in not just
one area of research methods but in three (i.e., qualitative, quantitative, and mixed methods).
Researchers in these three areas of research methods have developed expectations and standards
for good practice and rigor. This can be particularly challenging for researchers if they have
had little or no education or training in qualitative research methods. This can result in “QUAL-
light” research (Teddlie & Tashakkori, 2012, p. 777), the use of “qualitative data as
‘handmaiden’ or ‘second best’ to the quantitative data”, or the use of mixed methods that leads to
the “‘adding and stirring’ of qualitative methods that often takes the form of sprinkling in some
vignettes to provide narrative examples of the conclusions already reached by means of
quantitative methods” (Hesse-Biber, 2010, p. 457). Therefore, researchers interested in using
mixed methods for their research must also understand and meet those expectations and
standards for good practice and rigor in their own work.
Second, a mixed methods study, by its very nature, tends to require more resources.
Collecting and analyzing data from two approaches can be more resource-demanding (e.g., time,
funding) than using a single method (Bergman, 2011); thus, using a second research method in
most cases will increase the time and cost of the project. To
ensure adequate expertise and resources, mixed methods research is often conducted as a team
approach (Creswell, Klassen, Plano Clark, & Smith, 2011). In fact, all the studies in this special
issue were conducted by teams of multiple authors. Working as a collaborative research team can
entail its own challenges (Fiore, 2008) but may represent a fruitful avenue for researchers
committed to executing mixed methods studies.
Lastly, journal editors may experience difficulties when handling the review and
evaluation of mixed methods manuscripts. In the field of educational psychology, most journals
have a long history of publishing predominantly quantitatively-focused research, although this
trend is changing. Nevertheless, it is important for members of editorial boards or reviewers to
be sufficiently well-versed or trained in mixed methods research to adequately evaluate mixed
methods studies, particularly in terms of standards for rigor or integration. While researchers on
editorial boards may know, understand, and expect to review manuscripts that use quantitative
methods, they may have less understanding of rigor for manuscripts that include qualitative or
mixed methods. In addition, mixed methods manuscripts tend to be longer than mono-methods
manuscripts, and manuscripts that report mixed methods research may approach the page/word
limits imposed by some journals. These challenges may discourage researchers from
conducting mixed methods research studies or from submitting them to educational psychology
journals. As educational psychologists grow more comfortable with producing, consuming, and
reviewing mixed methods research, our ability to offer critical appraisal of the quality of the
work and contributions to the field will be enhanced.
We were encouraged by the quality of work the authors in this special issue produced and
wish to thank them for their efforts in conducting and reporting their respective studies. We look
forward to reading high-quality mixed methods studies that extend into areas not represented by
this special issue. For instance, although none of the articles in the special issue used
interventions or experimental designs, we believe that mixed methods research can be
beneficial for researchers who implement interventions or experimental research to explain
intervention challenges, failures, and successes. Mixed methods can help researchers investigate
participant experiences during an intervention (e.g., Koster, Bouwer, & van den Bergh, 2017),
which can be used to improve or adapt an intervention, or to elaborate or explain between-group
and within-group differences (e.g., McCrudden, Magliano, & Schraw, 2010). Ultimately the
promise for mixed methods research in educational psychology will emerge from needs within
our field and be realized by the creativity of our community of scholars.
Acknowledgements: We would like to thank Patricia Alexander, Jessica DeCuir-Gunby,
Revathy Kumar, Francesca López, Jamaal Matthews, Jennifer Schmidt, and Ellen Usher for their
feedback on earlier drafts of this article.
References
Benson, J. (1998). Developing a strong program of construct validation: A test anxiety example.
Educational Measurement: Issues and Practice, 17(1), 10-17.
Bergman, M. M. (2011). The good, the bad, and the ugly in mixed methods research and design.
Journal of Mixed Methods Research, 5(4), 271-275.
Biddle, C., & Schafft, K. A. (2015). Axiology and anomaly in the practice of mixed methods
work: Pragmatism, valuation, and the transformative paradigm. Journal of Mixed
Methods Research, 9(4), 320-334.
Bryman, A. (2006). Integrating quantitative and qualitative research: How is it done? Qualitative
Research, 6(1), 97-114.
Burch, P., & Heinrich, C. J. (2015). Mixed methods for policy research and program evaluation.
Sage Publications.
Butz, A. R., & Usher, E. L. (2015). Salient sources of early adolescents' self-efficacy in two
domains. Contemporary Educational Psychology, 42, 49-61.
Creamer, E. G. (2018). An introduction to fully integrated mixed methods research. Thousand
Oaks, CA: SAGE Publications.
Creswell, J. W. (2010). Mapping the developing landscape of mixed methods research. In A.
Tashakkori & C. Teddlie (Eds.), SAGE handbook of mixed methods in social & behavioral
research (2nd ed., pp. 45-68). Thousand Oaks, CA: SAGE.
Creswell, J. W., Klassen, A. C., Plano Clark, V. L., & Smith, K. C. (2011). Best practices for
mixed methods research in the health sciences. Bethesda, MD: National Institutes of
Health, Office of Behavioral and Social Sciences Research.
Creswell, J. W., & Plano Clark, V. (2018). Designing and conducting mixed methods research.
Thousand Oaks, CA: Sage Publications, Inc.
DeCuir-Gunby, J. T., & Schutz, P. A. (2014). Researching race within educational psychology
contexts. Educational Psychologist, 49(4), 244-260.
DeCuir-Gunby, J. T., & Schutz, P. A. (2017). Developing a mixed methods proposal: A practical
guide for beginning researchers (Vol. 5). SAGE Publications.
Dumas, D., Alexander, P. A., & Singer, L. M. (2015). Analyzing historical patterns, examining
current trends, and forecasting change in the field of Educational Psychology: A cross-
cultural perspective. Knowledge Cultures, 3(2), 7-23.
Fetters, M. D., Curry, L. A., & Creswell, J. W. (2013). Achieving integration in mixed methods
designs—principles and practices. Health Services Research, 48(6pt2), 2134-2156.
Fetters, M. D., & Molina-Azorin, J. F. (2017). The Journal of Mixed Methods Research starts a
new decade: The mixed methods research integration trilogy and its dimensions. Journal
of Mixed Methods Research, 11(3), 291-307.
Fiore, S. M. (2008). Interdisciplinarity as teamwork: How the science of teams can inform team
science. Small Group Research, 39(3), 251-277.
Gray, D. L., Hope, E. C., & Matthews, J. S. (2018). Black and belonging at school: A case for
interpersonal, instructional, and institutional opportunity structures. Educational
Psychologist, 53(2), 97-113.
Greene, J. C. (2007). Mixed methods in social inquiry. San Francisco: Jossey-Bass.
Greene, J. C., & Caracelli, V. J. (1997). Advances in mixed-method evaluation: The challenges
and benefits of integrating diverse paradigms. San Francisco: Jossey-Bass.
Greene, J. C., & Caracelli, V. J. (1997). Defining and describing the paradigm issue in mixed-
method evaluation. New Directions for Evaluation, 1997(74), 5-17.
Hesse-Biber, S. (2010). Qualitative approaches to mixed methods practice. Qualitative Inquiry,
16, 455-468.
Hilpert, J. C., & Marchand, G. C. (2018). Complex systems research in educational psychology:
Aligning theory and method. Educational Psychologist, 53(3), 185-202.
Hurmerinta-Peltomäki, L., & Nummela, N. (2006). Mixed methods in international business
research: A value-added perspective. Management International Review, 46(4), 439-459.
Ivankova, N. V., Creswell, J. W., & Stick, S. (2006). Using mixed methods sequential
explanatory design: From theory to practice. Field Methods, 18(1), 3–20.
Ivankova, N. V., & Kawamura, Y. (2010). Emerging trends in the utilization of integrated
designs in the social, behavioral, and health sciences. Sage handbook of mixed methods in
social and behavioral research, 2, 581-611.
Kaplan, A., Katz, I., & Flum, H. (2012). Motivation theory in educational practice: Knowledge
claims, challenges, and future directions. In APA educational psychology handbook
(Vol. 2, pp. 165-194). American Psychological Association.
http://dx.doi.org/10.1037/13274-007
Koopmans, M. (2014). Nonlinear change and the black box problem in educational
research. Nonlinear Dynamics, Psychology and Life Sciences, 18, 5-22.
Koster, M., Bouwer, R., & van den Bergh, H. (2017). Professional development of teachers in
the implementation of a strategy-focused writing intervention program for elementary
students. Contemporary Educational Psychology, 49, 1-20.
Kumar, R., Karabenick, S. A., Warnke, J. H., Hany, S., & Seay, N. (this issue). Culturally inclusive and responsive curricular learning environments (CIRCLEs): An exploratory sequential mixed-methods approach. Contemporary Educational Psychology, doi: https://doi.org/10.1016/j.cedpsych.2018.10.005
Kurtz-Costes, B., & Woods, T. (2017). Race and ethnicity in the study of competence motivation. In A. J. Elliot, C. S. Dweck, & D. S. Yeager (Eds.), Handbook of competence and motivation: Theory and application (2nd ed., pp. 529-546). New York, NY: Guilford Press.
Levitt, H. M., Bamberg, M., Creswell, J. W., Frost, D. M., Josselson, R., & Suarez-Orozco, C.
(2018). Journal article reporting standards for qualitative primary, qualitative meta-
analytic, and mixed methods research in psychology: The APA Publications and
Communications Board task force report. American Psychologist, 73, 26-46.
Lincoln, Y. S., Lynham, S. A., & Guba, E. G. (2011). Paradigmatic controversies, contradictions, and emerging confluences, revisited. In N. K. Denzin & Y. S. Lincoln (Eds.), The Sage handbook of qualitative research (4th ed., pp. 97-128). Thousand Oaks, CA: Sage.
Matthews, J. S., & López, F. (this issue). Speaking their language: The role of cultural content
integration and heritage language for academic achievement among Latino children.
Contemporary Educational Psychology, doi:
https://doi.org/10.1016/j.cedpsych.2018.01.005
Maxcy, S. (2003). Pragmatic threads in mixed methods research in the social sciences: The
search for multiple modes of inquiry and the end of the philosophy of formalism. In A.
Tashakkori & C. Teddlie (Eds.), Handbook of mixed methods in social and behavioral
research (pp. 51-90). Thousand Oaks, CA: Sage.
McCrudden, M. T., Magliano, J. P., & Schraw, G. (2010). Exploring how relevance instructions affect personal reading intentions, reading goals and text processing: A mixed methods study. Contemporary Educational Psychology, 35(4), 229-241.
McCrudden, M. T., & McTigue, E. M. (2018). Implementing integration in an explanatory sequential mixed methods study of belief bias about climate change with high school students. Journal of Mixed Methods Research, doi: https://doi.org/10.1177/1558689818762576
McKim, C. A. (2017). The value of mixed methods research: A mixed methods study. Journal of
Mixed Methods Research, 11(2), 202-222.
Mixed Methods in Education Research IES Technical Working Group Meeting Summary. (2015,
May 29). Retrieved from https://ies.ed.gov/ncer/whatsnew/techworkinggroup/.
Morse, J. M. (1998). Validity by committee. Qualitative Health Research, 8(4), 443-445.
Morse, J. M. (2015). Critical analysis of strategies for determining rigor in qualitative inquiry. Qualitative Health Research, 25, 1212-1222.
Nolen, S. B., Horn, I. S., & Ward, C. J. (2015). Situating motivation. Educational Psychologist, 50(3), 234-247.
O’Cathain, A. (2010). Assessing the quality of mixed methods research: Toward a
comprehensive framework. In A. Tashakkori & C. Teddlie (Eds.), Handbook of mixed
methods in social & behavioral research (2nd ed., pp. 531-555). Thousand Oaks, CA:
SAGE Publications.
O’Cathain, A., Murphy, E., & Nicholl, J. (2007). Integration and publications as indicators of
“yield” from mixed methods studies. Journal of Mixed Methods Research, 1(2), 147-163.
O'Cathain, A., Murphy, E., & Nicholl, J. (2008). The quality of mixed methods studies in health
services research. Journal of Health Services Research & Policy, 13(2), 92-98.
O'Cathain, A., Murphy, E., & Nicholl, J. (2010). Three techniques for integrating data in mixed methods studies. British Medical Journal, 341, 1147-1150.
Onwuegbuzie, A. J., & Leech, N. L. (2004). Enhancing the interpretation of significant findings:
The role of mixed methods research. The Qualitative Report, 9(4), 770-792.
Plano Clark, V. L. (2010). The adoption and practice of mixed methods: US trends in federally
funded health-related research. Qualitative Inquiry, 16(6), 428-440.
Plano Clark, V. L. (this issue). Meaningful integration within mixed methods studies: Identifying why, what, when, and how. Contemporary Educational Psychology.
Plano Clark, V. L., & Ivankova, N. V. (2016). Mixed methods research: A guide to the field. Thousand Oaks, CA: Sage.
Raudenbush, S. W., & Bryk, A. S. (2002). Hierarchical linear models: Applications and data
analysis methods (Vol. 1). Thousand Oaks, CA: Sage.
Rozin, P. (2001). Social psychology and science: Some lessons from Solomon Asch. Personality
and Social Psychology Review, 5(1), 2-14.
Rozin, P. (2009). What kind of empirical research should we publish, fund, and reward? A
different perspective. Perspectives on Psychological Science, 4(4), 435-439.
Sandelowski, M., Voils, C. I., & Knafl, G. (2009). On quantitizing. Journal of Mixed Methods Research, 3, 208-222.
Schmidt, J. A., Kafkas, S. S., Maier, K. S., Shumow, L., & Kackar-Cam, H. Z. (this issue). Why
are we learning this? Using mixed methods to understand teachers’ relevance statements
and how they shape middle school students’ perceptions of science utility. Contemporary
Educational Psychology, doi: https://doi.org/10.1016/j.cedpsych.2018.08.005
Schutz, P. A. (2014). Inquiry on teachers’ emotion. Educational Psychologist, 49(1), 1-12.
Schwartz, S. J., Syed, M., Yip, T., Knight, G. P., Umaña‐Taylor, A. J., Rivas‐Drake, D., ... &
Ethnic and Racial Identity in the 21st Century Study Group. (2014). Methodological
issues in ethnic and racial identity research with ethnic minority populations: Theoretical
precision, measurement issues, and research designs. Child Development, 85(1), 58-76.
Shadish, W. R., Cook, T. D., & Campbell, D. T. (2002). Experimental and quasi-experimental
designs for generalized causal inference. Boston, MA: Houghton-Mifflin.
Tashakkori, A., & Creswell, J.W. (2007). The new era of mixed methods. Journal of Mixed
Methods Research, 1, 3-7.
Tashakkori, A., & Teddlie, C. (2003). Handbook of mixed methods in social and behavioral
research. Thousand Oaks, CA: Sage.
Teddlie, C., & Tashakkori, A. (2009). Foundations of mixed methods research: Integrating
quantitative and qualitative approaches in the social and behavioral sciences. Thousand
Oaks, CA: Sage.
Teddlie, C., & Tashakkori, A. (2012). Common “core” characteristics of mixed methods research: A review of critical issues and call for greater convergence. American Behavioral Scientist, 56(6), 774-788.
Usher, E. L., Ford, C. J., Lee, C. R., & Weidner, B. L. (this issue). Sources of math and science
self-efficacy in rural Appalachia: A convergent mixed methods study. Contemporary
Educational Psychology, doi: https://doi.org/10.1016/j.cedpsych.2018.10.003.
White, A., DeCuir-Gunby, J., & Kim, S. (this issue). A mixed methods exploration of the relationships between the racial identity, science identity, science self-efficacy, and science achievement of African American students at HBCUs. Contemporary Educational Psychology, doi: https://doi.org/10.1016/j.cedpsych.2018.11.006
Woolley, C.M. (2009). Meeting the mixed methods challenge of integration in a sociological
study of structure and agency. Journal of Mixed Methods Research, 3(1), 7-25.
Yin, R.K. (2006). Mixed methods research: Are the methods genuinely integrated or merely
parallel? Research in the Schools, 13(1), 41-47.
TABLE 1
Types of Core Mixed Methods Research Designs

Convergent
  Timing of data collection: Concurrent
  Timing of data analysis: Independent
  Intent of integration: Generate interpretations that extend the breadth and range of the inquiry and/or seek corroboration
  Examples from special issue: Schmidt et al. (this issue); Usher et al. (this issue); White et al. (this issue)

Explanatory sequential
  Timing of data collection: Sequential
  Timing of data analysis: Dependent
  Intent of integration: Use the qualitative strand to elaborate, enhance, or explain some finding of interest from the quantitative strand
  Examples from special issue: Matthews & López (this issue)

Exploratory sequential
  Timing of data collection: Sequential
  Timing of data analysis: Dependent
  Intent of integration: Use the qualitative phase to create or build a follow-up quantitative instrument or intervention
  Examples from special issue: Kumar et al. (this issue)
FIGURE 1
Core mixed methods research designs

(a) Convergent design: Quantitative data collection → Quantitative data analysis, in parallel with Qualitative data collection → Qualitative data analysis → Merge findings → Interpret findings

(b) Explanatory sequential design: Quantitative data collection → Quantitative data analysis → Sampling (connecting) → Qualitative data collection → Qualitative data analysis → Interpret findings

(c) Exploratory sequential design: Qualitative data collection → Qualitative data analysis → Development (building) → Quantitative data collection → Quantitative data analysis → Interpret findings

Sources: Adapted from Creswell and Plano Clark (2018) and Plano Clark and Ivankova (2016).