The Teacher
Process-Tracing Research Designs:
A Practical Guide
Jacob I. Ricks, Singapore Management University
Amy H. Liu, University of Texas at Austin
ABSTRACT  Process-tracing has grown in popularity among qualitative researchers. However,
unlike statistical models and estimators—or even other topics in qualitative methods—process-
tracing is largely bereft of guidelines, especially when it comes to teaching. We address this
shortcoming by providing a step-by-step checklist for developing a research design to use
process-tracing as a valid and substantial tool for hypothesis testing. This practical guide
should be of interest for both research application and instructional purposes. An online
appendix containing multiple examples facilitates teaching of the method.
How does one develop a research design based on
process-tracing? This question highlights a major
challenge in teaching and adopting process-tracing
methods. Although there is an expanding body
of work on the approach (Beach and Pedersen
2013; Bennett and Checkel 2015; Humphreys and Jacobs 2015;
Mahoney 2012; Rohlfing 2014), we are still faced with Collier’s
(2011, 823) lamentation: “Too often this tool is neither well
understood nor rigorously applied” (see also Blatter and Blume
2008, 318; Zaks 2017). One central concern is that there are few
instructional materials in the qualitative-methods canon (Elman,
Kapiszewski, and Kirilova 2015; Kapiszewski, MacLean, and Read
2014). This article provides a short, practical guide for developing a process-tracing research design. The corresponding online
appendix applies this guide to four examples, thereby offering
a tool for researchers seeking to employ and instructors planning
to teach this method.
The material is organized in the form of a checklist
that provides introductory guideposts to help researchers
structure their research designs. This article is not a comprehensive literature review (Kay and Baker 2015), and neither
is it the final word on what constitutes good process-tracing
(Waldner 2015). There remains much work to be done in defining, delineating, and developing process-tracing methods,
and we advise graduate students and advanced researchers to
become familiar with these debates (Beach and Pedersen 2013;
Bennett and Checkel 2015). Instead, our contribution is to
make process-tracing accessible and more readily applicable to
beginners without being distracted by ongoing methodological
discussions.
The discussion is limited to one type of process-tracing: theory
testing (Beach and Pedersen 2013). Specifically, we focus on the
systematic study of the link between an outcome of interest and
an explanation based on the rigorous assessment and weighting of
evidence for and against causal inference. By defining process-
tracing in these terms, we emphasize the role of theory and the
empirical testing of hypotheses. The challenge is to assemble a
research design equipped to do so.
THE CHECKLIST
To craft a research design based on process-tracing, we suggest
that researchers must (1) define their theoretical expectations,
(2) give direction to their research, and (3) identify the types of
data necessary for testing a theory. Stated differently, the steps
outlined in figure 1 set the stage for implementing best practices
(Bennett and Checkel 2015). In the online appendix provided to
assist with teaching, we show how this checklist can be applied in
four different examples: the rise of the Japanese developmental
state; the electoral success of the Thai Rak Thai party in Thailand;
the standardization of English in Singapore; and the bureaucratic
reforms of the Philippines' irrigation agency. We recommend that
instructors start with the checklist before having students read
the appendix; these materials should be paired with Collier (2011).
Alternatively, instructors can present both the checklist and the
appendix simultaneously and then assign students to use the
checklist to evaluate a separate article based on process-tracing
methods (e.g., Fairfield 2013 and Tannenwald 1999). The goal is to
ingrain in students’ minds what process-tracing is and how it can
be used. In the following discussion, we reference the example of
Slater and Wong’s (2013) process-tracing analysis of why strong
authoritarian parties sometimes embrace democratization.
Step 1: Identify Hypotheses
We adopt the maxim "Theory saves us all." Research designs and empirical analyses aimed at causal inference should be theoretically guided.
Therefore, establishing testable hypotheses based on our theories
is the first step in good process-tracing. In this sense, building
a research design for process-tracing is the same as in any other
attempt at causal inference. There is, however, one important
distinction. In process-tracing, we are concerned not only with
our theory of interest; we also must juxtapose rival explanations
that we intend to test (Hall 2013; Rohlfing 2014; Zaks 2017). It is
important that the concerned hypothesis is evaluated against
alternative(s) in a Lakatosian sense, creating a “three-cornered
fight” that pits our observations against both our primary theory
and at least one alternative (Lakatos 1970).
The checklist is structured to allow for the testing of multiple—
that is, as many as required—rival hypotheses. In an oft-used
comparison, detectives in criminal cases begin their investigation
by focusing on those closest to the victim and then eliminating
suspects (i.e., hypotheses) along the way. Social scientists should
act similarly, remembering Ockham’s razor: seek first hypotheses
that are clearly related to the outcome, simple, and testable before
employing more complex explanations. These theoretical expectations should be plainly established before moving to step 2.
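For teaching purposes, it can help to have students write the competing claims down in a form they can carry through the later steps. The short Python sketch below is one illustrative way to do so; the outcome statement paraphrases Slater and Wong (2013), while the rival hypothesis shown is our own hypothetical example rather than one drawn from their article.

```python
# A minimal registry for step 1: the outcome of interest, the primary
# hypothesis, and at least one rival to pit against it in a "three-cornered
# fight." Wording paraphrases Slater and Wong (2013); the rival is illustrative.
outcome = "dominant authoritarian party concedes democratization"

hypotheses = {
    "primary": "antecedent strength + ominous signals + legitimation strategy",
    "rival_1": "mass protest forces the party to concede from weakness",
}

# Each hypothesis will later be attached to relevant moments, expected
# evidence types, and counterfactuals in steps 2 through 7.
for name, claim in hypotheses.items():
    print(f"{name}: {claim} -> {outcome}")
```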
Step 2: Establish Timelines
The second step is to sequence events. Timelines should be
bookended according to the theoretical expectations. The conclusion
of the timeline will be at or shortly after the outcome of interest—that
is, the dependent variable. The challenge is to identify how far back
in time we must go to seek out our cause. A good timeline begins
with the emergence of the theorized causal variable. For instance,
we hypothesize that the compounded effect of antecedent party
strength, ominous signals, and legitimization strategies causes
strong authoritarian parties to embrace democratization (Slater and
Wong 2013). Therefore, we begin our timeline with the foundations
of the vital components of the theory—namely, the antecedent
strength of the party—and end it with the democratic transition.
The timeline has several purposes. First, it clarifies the researcher's thought process. Second, it establishes temporal precedence. Third, it provides what can be considered a "face-validity" test for the argument. Fourth, it helps to identify major events that
could have shaped the outcome of interest. Doing so allows us to revisit our hypotheses and to ascertain whether we might be
missing an obvious probable cause for the concerned outcome.
In essence, we give ourselves the opportunity to verify whether
the events in question fit the hypotheses. Analogously, criminal
investigators also use timelines to establish the victims’ histories
and points where they may have met foul play. Although these
timelines rarely find their way into published works, they are an
imperative step in the research process. Researchers should keep their timelines readily available and update them as they progress through the many stages of fieldwork. Timeline development is a critical exercise before initiating evidence collection.
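As an illustration of how a working timeline might be kept in structured form while in the field, the following Python sketch records bookended events and checks that the hypothesized causes precede the outcome. The dates and labels are invented placeholders loosely patterned on the Slater and Wong example, not archival facts.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Event:
    when: date   # date (or approximate date) of the event
    label: str   # short description
    role: str    # "cause", "outcome", or "context"

# Hypothetical timeline entries; dates and labels are placeholders.
timeline = [
    Event(date(1949, 1, 1), "Ruling party consolidates antecedent strength", "cause"),
    Event(date(1986, 1, 1), "Ominous electoral and protest signals emerge", "context"),
    Event(date(1987, 1, 1), "Leadership adopts democratic legitimation strategy", "cause"),
    Event(date(1992, 1, 1), "Democratic transition (outcome of interest)", "outcome"),
]

def check_timeline(events):
    """Sort events and flag hypothesized causes that do not precede the outcome."""
    ordered = sorted(events, key=lambda e: e.when)
    outcome_date = min(e.when for e in ordered if e.role == "outcome")
    late_causes = [e for e in ordered if e.role == "cause" and e.when >= outcome_date]
    return ordered, late_causes

ordered, late_causes = check_timeline(timeline)
for e in ordered:
    print(e.when.year, e.label)
if late_causes:
    print("Warning: some hypothesized causes do not precede the outcome.")
```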
Step 3: Construct Causal Graph
After sequencing the timeline, the next step is to construct a causal graph (Waldner 2015). This type of graph identifies the independent variable(s) of interest. It also provides structure, allowing us to focus on the link between the explanation and the concerned outcome. In other words, a causal graph visually depicts the causal process through which X causes Y. With a causal graph, we can identify all moments when the concerned actor (e.g., individual, government, party, or group) made a choice that could have affected the result. This endogenous choice need not be contentious, but it does need to be theoretically relevant.

We depart slightly from Waldner (2015), however, in two ways. First, we contend that just as not all choices are relevant moments, not all relevant moments are choices. They also can be exogenous events—that is, critical junctures that emerge from events such as the discovery of oil or a natural disaster. What matters is that these moments are "collectively sufficient to generate the outcome" (Waldner 2015, 131). Second, our use of causal graphs potentially includes events that may not fit clearly into the causal process being identified. We distinguish these events with dashed lines. In contrast, the causal process remains outlined with solid lines. This highlights and clarifies—especially for students—that not all interesting events are variables of interest.

Figure 1. Process-Tracing: The Checklist
Figure 2. Causal Graph of Slater and Wong (2013)
For an example, we offer a simple causal graph of Slater and Wong's (2013) theory about why strong authoritarian-party states democratize (figure 2). Slater and Wong began by presenting their scope condition: democratic transitions under the watch of dominant authoritarian ruling parties. Given this situation, our theoretical expectation would be a low likelihood of democratization. Yet, Slater and Wong (2013, 719) claimed that "dominant parties can be incentivized to concede democratization from a position of exceptional strength" under a set of three specific conditions. First, they must enjoy a high degree of antecedent strengths—that is, confidence that the party can still dominate post-transition politics. Second, this strength, however, must have been challenged by ominous signals that the party is past its authoritarian prime. Third, leaders must strategically choose to adopt democratic legitimation strategies.
Causal graphs follow the initial timeline; they build on the
series of events that are identified in the timeline. In other
words, we can pinpoint the hypothesized explanation and
the outcome in a temporal chain. We can specify where and
which types of empirical information are necessary for the
analysis. The timeline and the causal graph can be developed
together iteratively. Whereas the sequence of events will not
change, the creation of the causal graph might cause us to revisit
the timeline to clarify links or highlight important missing
information.
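One way to make the causal graph explicit for students is to encode it as a small data structure that distinguishes solid-line causal links from dashed-line contextual events. The sketch below does this in Python; the node labels paraphrase Slater and Wong's argument, and the exogenous-shock node is purely illustrative rather than part of their figure.

```python
# Nodes are the theoretically relevant moments; edges carry a style flag:
# "solid" for links in the hypothesized causal process, "dashed" for
# contextual events that sit outside it. Labels are paraphrases.
causal_graph = {
    "antecedent strength": [("ominous signals", "solid")],
    "ominous signals": [("legitimation strategy chosen", "solid")],
    "exogenous shock (e.g., economic crisis)": [("ominous signals", "dashed")],
    "legitimation strategy chosen": [("democratic transition", "solid")],
    "democratic transition": [],
}

def causal_chain(graph):
    """Return only the solid-line links, i.e., the hypothesized X -> Y process."""
    return [(src, dst) for src, edges in graph.items()
            for dst, style in edges if style == "solid"]

for src, dst in causal_chain(causal_graph):
    print(f"{src} -> {dst}")
```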
Step 4: Identify Alternative Choice or Event
At each relevant moment in the causal graph, a different choice
could have been made or another event could have happened. For
each distinct moment, we identify these alternative(s). It is important, however, that these alternatives are theoretically grounded.
There must be a reason that the choice could have been made or
that the event could have manifested differently.
Step 5: Identify Counterfactual Outcomes
Next, for each moment, we identify the counterfactual outcome
that would have happened if the alternative choice had been
taken or the alternative event had transpired. Counterfactuals
are vital to process-tracing, especially when no alternative cases
are considered (Fearon 1991). When treating hypothetical predictions, it is imperative that another outcome was possible. If there
is no plausible theory-informed alternative outcome, then no real
choice or event has taken place. Thus, the link between the input
and the outcome was predetermined; hence, process-tracing provides little value added. Note that steps 4 and 5 are closely linked.
An approach in lieu of counterfactuals is the use of controlled
comparisons, wherein the case of interest is compared with empirical alternatives rather than a hypothetical counterfactual (Slater
and Ziblatt 2013). However, if a researcher is primarily focused
on one single case—or perhaps multiple cases that are not explicitly comparable via the research design—then this counterfactual
exercise is important. Even if a researcher does use controlled
comparisons, we still recommend considering counterfactuals.
Note, however, that counterfactuals are heuristic devices that allow
us to identify hypothesized outcomes and thus potential data
to collect; they are not evidence per se.
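Steps 4 and 5 can likewise be recorded alongside the causal graph. The Python sketch below pairs each relevant moment with a theoretically grounded alternative and the counterfactual outcome we would expect had that alternative occurred; the entries are illustrative paraphrases of the Slater and Wong example, not claims about the historical record.

```python
# For each relevant moment, record the observed choice or event, a
# theoretically grounded alternative (step 4), and the counterfactual
# outcome expected had that alternative occurred (step 5). Illustrative only.
relevant_moments = [
    {
        "moment": "ominous signals received",
        "observed": "leadership reads signals as threat to long-run dominance",
        "alternative": "leadership dismisses signals as noise",
        "counterfactual_outcome": "continued authoritarian rule, rising repression",
    },
    {
        "moment": "legitimation strategy chosen",
        "observed": "party concedes competitive elections from strength",
        "alternative": "party relies on coercion to retain power",
        "counterfactual_outcome": "no democratic transition at this juncture",
    },
]

for m in relevant_moments:
    # A moment only does analytic work if a plausible alternative existed.
    assert m["alternative"] != m["observed"]
    print(f"{m['moment']}: if '{m['alternative']}', then '{m['counterfactual_outcome']}'")
```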
It is important that steps 1 through 5 be conducted before data collection. These activities are part of the background work that must be accomplished before engaging in any type of fieldwork—from visiting archives to conducting interviews, from administering surveys to observing participants. They are essential to the process of theory testing because they establish expectations about what researchers should encounter during their data-collection process. Because process-tracing often is iterative, researchers likely will revisit these steps throughout the research project—especially in light of new data. However, an initial plan for data collection should be designed based on these five steps.
Step 6: Finding Evidence for Primary Hypothesis
After we have established a timeline, outlined our causal graphs,
and identified our theoretical expectations, we can design the
data-collection portion of our research project. At each identified relevant moment, we must plan to systematically find
evidence that the variable germane to the primary hypothesis
was the reason the concerned actor pursued the timeline path.
As we design our data collection, it is important to recognize that not all evidence types are the same (Bennett
2014; Collier 2011; Mahoney 2012; Rohlfing 2014). Some data are
necessary to establish causation; others sufficient—and then there
are data that are neither or both. We suggest drawing on Van
Evera’s (1997) four types of evidence, summarized in table 1:
straw-in-the-wind, hoops, smoking gun, and doubly decisive.
Due to space constraints, we do not explain these evidence types
in detail (see Collier 2011 for an extensive discussion). Figure 1
utilizes these evidence types and the appendix demonstrates
their application.
Table 1. Types of Evidence for Process-Tracing

1. Straw-in-the-Wind (neither necessary nor sufficient for affirming causal inference)
   a. Passing: Affirms relevance of hypothesis but does not confirm it.
   b. Failing: Hypothesis is not eliminated but is slightly weakened.
   c. Implications for rival hypotheses: Passing slightly weakens them; failing slightly strengthens them.

2. Hoops (necessary but not sufficient for affirming causal inference)
   a. Passing: Affirms relevance of hypothesis but does not confirm it.
   b. Failing: Eliminates hypothesis.
   c. Implications for rival hypotheses: Passing somewhat weakens them; failing somewhat strengthens them.

3. Smoking Gun (sufficient but not necessary for affirming causal inference)
   a. Passing: Confirms hypothesis.
   b. Failing: Hypothesis is not eliminated but is somewhat weakened.
   c. Implications for rival hypotheses: Passing substantially weakens them; failing somewhat strengthens them.

4. Doubly Decisive (both necessary and sufficient for affirming causal inference)
   a. Passing: Confirms hypothesis and eliminates others.
   b. Failing: Eliminates hypothesis.
   c. Implications for rival hypotheses: Passing eliminates them; failing substantially strengthens them.

Source: Collier (2011, 825)

When creating a data-collection plan, it is common for researchers—especially those who spend time in the field—to accumulate data in a "soak-and-poke" fashion. We do not condemn such efforts; however, we encourage researchers to think carefully about the evidence types they are collecting because most information gathered will be of the straw-in-the-wind type. Stated differently, whereas much data gathered may offer weak support for—or at least not negate—the primary hypothesis, it is not the most useful for testing purposes. When designing research, it is absolutely vital to remain cognizant of the evidence type collected and its ability to support or negate the larger claims (Fairfield 2013). The causal graph is particularly useful at this point because it identifies the links that must be made between our variables of interest to establish causation. For instance, certain evidence types simultaneously can support our proposed theory and eliminate a rival one. Van Evera (1997) called this doubly decisive evidence. If such a datum is found, then we can exclude all other hypotheses and step 6 becomes the final one in our process-tracing efforts. Unfortunately, these cases are rare. Therefore, we must increase our evidence pool to demonstrate that our hypothesis is the best fit from a set of possible explanations. This is outlined in step 7.

For step 6, we exhort researchers to make clear their expectations about the evidence types needed to (1) further support their argument, and (2) negate the rival hypotheses. For instance, consider Slater and Wong's (2013) assertion that democratization can emerge from strategic decisions by a ruling party. Here, we want smoking-gun evidence that links antecedent strength, ominous signals, and legitimation strategies directly to the decision to democratize. This type of evidence can be found in interviews—for example, with military advisers from the authoritarian period who relayed growing disloyalty among the armed forces and recommended the leadership to concede. It also can be ascertained from archival documents—for example, minutes from cabinet meetings discussing different electoral rules for the party to adopt on transition. Conversely, evidence describing the personalities active in alternative rival parties might be considered straw-in-the-wind. Although interesting, these data are not vital to establishing the strength of the theory; more important is information on the level of threat they posed to the ruling party. When we design data collection, we must be careful to focus on the evidence types that matter lest we be left building our evidentiary house with a pile of straw.
Step 7: Find Evidence for Rival Hypothesis
Our final step is to repeat step 6; at each choice node, the focus
now should be on alternative explanations. This step may
require multiple iterations depending on the number of rival
hypotheses. The objective is to dismiss as many explanations as
possible, leaving only one hypothesis as the most likely. Here,
the most important evidence type is the exclusionary or—per
Van Evera (1997)—the hoops test. Hoops evidence, if absent, can
eliminate a hypothesis from consideration. If the hypothesized
variable was not present when the event happened, then we can
dismiss the rival hypothesis.
If the rival explanation is not easily discarded, we must move
on to other data types. Wherever possible, we look for opportunities to dismiss the hypothesis. However, if at some point we find
evidence to the contrary, we cannot reject it. Instead, we must
consider that a rival hypothesis could explain the phenomenon of
interest better than the primary one.
Because political phenomena are complex, it is possible
that the different explanations may not be mutually exclusive
(Zaks 2017). Therefore, pitting competing hypotheses against
one another can result in instances in which multiple hypotheses
all seem to have explanatory leverage. When these conditions
manifest, we must rely on a deep understanding of our cases to
weigh the evidence and judge which hypothesis best explains the
outcome. As in a criminal investigation, we must discern which
theory of the crime has the strongest evidence and proceed as
best we can to trial.
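For those who wish to formalize this weighing exercise, Bayesian treatments of process-tracing (Bennett 2014; Humphreys and Jacobs 2015) offer one route. The toy Python sketch below updates the relative credence of a primary and a rival hypothesis as evidence accumulates; the priors and likelihood numbers are invented for illustration and, in practice, would encode the researcher's judgments about how probable each piece of evidence is under each hypothesis.

```python
# A toy Bayesian update over two hypotheses, in the spirit of Bennett (2014).
# Priors and likelihoods are illustrative placeholders.
priors = {"primary": 0.5, "rival": 0.5}

# P(evidence | hypothesis) for two observed pieces of evidence.
likelihoods = [
    {"primary": 0.8, "rival": 0.3},   # e.g., a passed hoop test for the primary hypothesis
    {"primary": 0.6, "rival": 0.1},   # e.g., smoking-gun-like interview evidence
]

posterior = dict(priors)
for lk in likelihoods:
    # Multiply current credence by the likelihood of the evidence, then renormalize.
    unnorm = {h: posterior[h] * lk[h] for h in posterior}
    total = sum(unnorm.values())
    posterior = {h: p / total for h, p in unnorm.items()}

print(posterior)  # relative credence after seeing both pieces of evidence
```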
CONCLUSION
Despite the popularity of process-tracing in empirical research,
discussions on how to develop effective research designs based
on the method are largely absent in political science—especially
when we consider teaching materials. Frequently, there is a
disjuncture between theoretically driven research designs and
rigorously evaluated empirics. Beyond this, the method can be poorly understood by those who do not regularly engage in process-tracing. The prime advocates of process-tracing continue
to make strides in pushing methodological understanding and
boundaries. This work, however, does not necessarily lend itself
to introducing the tool to the uninitiated. As a result, critics have
dismissed process-tracing as being ineffective in explaining political phenomena beyond a singular case—if even that. We understand but do not agree with these positions.
Process-tracing involves rigor and attention to details and
logic of causal inference similar to that of a detective or a medical
examiner. It requires establishing a sequence of events and identifying a suspect pool. With each piece of evidence, we can eliminate a variable and/or strengthen one hypothesis against another.
We conduct this iterative process until we are ready for trial.
In this spirit, we offer our checklist to help researchers develop
a causal research design and then evaluate pieces of evidence systematically against it. Such practical guidance is largely missing
in the process-tracing literature. This guide and the applications
in the online appendix attempt to address this shortcoming and
to demonstrate how process-tracing can be done rigorously.
We challenge advocates to adopt these standards in their own work
and skeptics to conceptualize process-tracing as more than glorified storytelling. We also hope that the method can be integrated
easily and clearly as a component of political science courses—not
only in methods classes but also in substantive courses. Indeed,
through careful application, process-tracing can serve as a strong
tool for hypothesis testing.
SUPPLEMENTARY MATERIAL
To view supplementary material for this article, please visit
https://doi.org/10.1017/S1049096518000975
ACKNOWLEDGMENTS
We thank Marissa Brookes, Jason Brownlee, José Cheibub,
Travis Curtice, Jennifer Cyr, John Donaldson, Richard Doner,
Zach Elkins, Michael Giles, Anna Gunderson, Nicholas Harrigan,
Abigail Heller, Allen Hicken, Laura Huber, Kendra Koivu, James
Mahoney, Eddy Malesky, Joel Moore, Ijlal Naqvi, Sari Niedzwiecki,
Rachel Schoner, Dan Slater, Hillel Soifer, Kurt Weyland, and
the anonymous reviewers for helpful comments on this article.
An earlier version was presented at the 2016 Southwest Mixed
Methods Research Workshop at University of Arizona. We were
able to convert this project from an idea to an article under the
auspices of the Short-Term Research Collaboration Program at
Singapore Management University. Any errors belong to the
authors.
REFERENCES
Beach, Derek, and Rasmus Brun Pedersen. 2013. Process-Tracing Methods. Ann Arbor:
University of Michigan Press.
Bennett, Andrew. 2014. “Process-Tracing with Bayes.” Qualitative and Multi-Method
Research Spring: 46–51.
Bennett, Andrew, and Jeffrey T. Checkel. 2015. “Process-Tracing: From Philosophical
Roots to Best Practices.” In Process-Tracing, ed. Andrew Bennett and Jeffrey T.
Checkel, 3–37. New York: Cambridge University Press.
Blatter, Joachim, and Till Blume. 2008. "In Search of Co-Variance, Causal Mechanisms or Congruence? Towards a Plural Understanding of Case Studies." Swiss Political Science Review 14 (2): 315–56.
Collier, David. 2011. “Understanding Process-Tracing.” PS: Political Science & Politics
44 (4): 823–30.
Elman, Colin, Diana Kapiszewski, and Dessislava Kirilova. 2015. “Learning
through Research: Using Data to Train Undergraduates in Qualitative Methods.”
PS: Political Science & Politics 48 (1): 39–43.
Fairfield, Tasha. 2013. "Going Where the Money Is: Strategies for Taxing Economic Elites in Unequal Democracies." World Development 47: 42–57.
Fearon, James D. 1991. “Counterfactuals and Hypothesis Testing in Political Science.”
World Politics 43 (2): 169–95.
Hall, Peter A. 2013. “Tracing the Progress of Process-Tracing.” European Political
Science 12 (1): 20–30.
Humphreys, Macartan, and Alan M. Jacobs. 2015. “Mixing Methods: A Bayesian
Approach.” American Political Science Review 109 (4): 653–73.
Kapiszewski, Diana, Lauren MacLean, and Benjamin Read. 2014. Field Research
in Political Science. New York: Cambridge University Press.
Kay, Adrian, and Phillip Baker. 2015. "What Can Causal Process-Tracing Offer to Policy Studies? A Review of the Literature." Policy Studies Journal 43 (1): 1–21.
Lakatos, Imre. 1970. “Falsification and the Methodology of Scientific Research
Programmes.” In Criticism and the Growth of Knowledge, ed. Imre Lakatos and
Alan Musgrave, 91–196. New York: Cambridge University Press.
Mahoney, James. 2012. “The Logic of Process-Tracing Tests in the Social Sciences.”
Sociological Methods & Research 41 (4): 570–97.
Rohlfing, Ingo. 2014. “Comparative Hypothesis Testing Via Process-Tracing.”
Sociological Methods & Research 43 (4): 606–42.
Slater, Dan, and Joseph Wong. 2013. “The Strength to Concede: Ruling Parties and
Democratization in Developmental Asia.” Perspectives on Politics 11 (3): 717–33.
Slater, Dan, and Daniel Ziblatt. 2013. “The Enduring Indispensability of the Controlled
Comparison.” Comparative Political Studies 46 (10): 1301–27.
Tannenwald, Nina. 1999. “The Nuclear Taboo: The United States and the Normative
Basis of Nuclear Non-Use.” International Organization 53 (3): 433–68.
Van Evera, Stephen. 1997. Guide to Methods for Students of Political Science. Ithaca, NY:
Cornell University Press.
Waldner, David. 2015. “What Makes Process-Tracing Good? Causal Mechanisms,
Causal Inference, and the Completeness Standard in Comparative Politics.”
In Process-Tracing, ed. Andrew Bennett and Jeffrey T. Checkel, 126–52. New York:
Cambridge University Press.
Zaks, Sherry. 2017. “Relationships Among Rivals (RAR): A Framework for
Analyzing Contending Hypotheses in Process-Tracing.” Political Analysis
25 (3): 344–62.