Positioning and Fit in
Designing and Executing
Qualitative Research
Tammar B. Zilber¹ and Renate E. Meyer²
Abstract
In this paper, we aim to help researchers think, design, and execute their empirical
journey by mapping the terrain of choices common in qualitative research. We
offer a matrix that relates to various dimensions—the level at which to study the phe-
nomenon (level of analysis), types of field materials, time orientation of research and
data, the analytic approach, and the unit of data (unit of analysis). This matrix is
intended to support making informed decisions that result in specific research designs
and the continuous process of reflection as to how these choices open and limit the
ability to answer the research question and offer an analytic generalization based on
the findings.
Keywords
qualitative research, level of analysis, analytic approach, data sources, time orientation,
unit of data
Introduction
Doing empirical research is all about making choices—what research questions to ask,
what kinds of field material to collect, how to analyze them, how to position the project
within a paradigmatic stand and a theoretical conversation. And on it goes, as each of
these big choices is actualized through numerous smaller ones. This is all the more so in qualitative research that adopts a constructionist epistemology and political pluralism
(Amis & Silk, 2008; Denzin & Lincoln, 1994). On a methodological level, such
¹ Hebrew University Business School, Jerusalem, Israel
² Vienna University of Economics and Business & Copenhagen Business School, Vienna, Austria
Corresponding Author:
Tammar B. Zilber, Hebrew University Business School, Mount Scopus, Jerusalem, 9190501, Israel.
Email: TZilber@huji.ac.il
Methodology Corner
The Journal of Applied Behavioral Science, 1–16, © The Author(s) 2022
DOI: 10.1177/00218863221095332
research has to consider that the researchers always enter an already meaningful world
and have to immerse themselves in this meaning while being reflective of their own meaning-
making and potential interferences in the field. Qualitative research—either inductive,
bottom-up, or abductive, coming to novel insights through “mental leaps” (Reichertz, 2004)—requires sensitivity and flexibility to allow the empirical phenomenon and its many contexts to direct the project. No matter how much thought was put into designing a
research strategy before entering the field, qualitative researchers have to adapt deci-
sions on the fly and off-script once in the field.
The challenge is even more daunting as there are no “right”choices, and each meth-
odological option opens a space of possibilities and limits others. Thus, quality and
rigor in qualitative research are not about making specific decisions but about how
one makes choices, reasons, and justifies them (Grodal et al., 2021; Harley &
Cornelissen, 2022; Pratt et al., 2022; Schwandt, 2000). It is about using reflexivity
to identify choices to be made, map the possibilities, assess their implications, make
a choice that will fit well with past choices and the goal of the project (e.g., future
choices), and eventually take responsibility for these choices, and account for their
consequences.
In this paper, we aim to help researchers in this journey by mapping the terrain of
choices common in qualitative research. Based on our own experience as qualitative
researchers, teaching qualitative methods courses, and our roles as editors of qualitative
papers (Renate as Editor in Chief of Organization Studies, Tammar as Associate Editor
at Academy of Management Journal), we know how confusing these choices may be.
We know how easily we retreat to readymade options and templates that seem common
and legitimate without examining their fit to our specific project. We know how easy it
is to get distracted and consider each choice in isolation, losing sight of their interde-
pendencies and consequences. Thus, we offer a matrix that relates to various dimen-
sions—the level at which to study the phenomenon (level of analysis), types of field
materials (data sources), time orientation of research project and data, the analytic
approach, and the unit of data (unit of analysis). Each of these dimensions offers an
array of possibilities that fit more or less with each other. We are aware that there
are more dimensions than these but believe that working with them provides a solid
ground from which to dive deeper.
The level of phenomenon spans from the individual, through group, organization,
and field, to society or world polity. Data may be comprised of interviews, observa-
tions, or archival texts (including images, artifacts, and other types of non-verbal
texts). The field materials may be natural (existing without the research) or artificial
(provoked through the study), and may or may not match the time orientation of the research project (time-authentic). The analytic approach may be categorical, comparative, or
processual. Finally, the unit of data may be any recurring happening that serves as
focus for the analysis, like paragraphs, reports, meetings, events, or an entire interview
or organization. Juxtaposing the level of the phenomenon with the data source creates a
matrix that can enhance our thinking about the position of our study and help us reflect on it
through the time orientation, analytic approach, and unit of data, each adding a layer
of richness and complexity to our thinking. We offer this matrix as a guiding tool to
help researchers visualize the position of their research project within the terrain of pos-
sibilities in qualitative research and assess the degree of fit between the research ques-
tion and their choices of the case(s), data, and analysis.
Given current debates and critiques of templates in qualitative research (see, for
example, a recent special feature topic at Organizational Research Methods, Kohler
et al., 2022), let us state our position at the outset. The matrix we offer is intended
to help researchers think, design, and execute their research. It offers no clear-cut
recipes as to what to choose, as much as it highlights the choices that need to be
made and the various options and considerations. It is intended to support making
informed decisions that result in specific research designs and the continuous
process of reflection as to how these choices open and limit the ability to answer the
research question and offer an analytic generalization (Schwandt, 2015; Tsoukas,
2009) based on the findings. We focus on the early stages of the research process
when the researcher develops the general contours of the project and the focus of
attention.
Making Informed Decisions: Mapping the Terrain
To make informed decisions about one’s study, we suggest that one needs to situate it
within a matrix that relates to five dimensions: Level of the phenomenon, data sources,
time orientation, analytic approach, and unit of data. We will present each of these
dimensions separately, mapping it through classifications, and then move to examine
the fit between them.
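By way of illustration only, the five dimensions can be written down as a simple checklist. The short Python sketch below, using hypothetical labels and a hypothetical example study, records a project's position on each dimension and flags those still left undecided; it is a minimal aid for positioning, not a prescription.

```python
# Illustrative sketch: recording a project's position along the five dimensions
# discussed in this paper. All labels and the example study are hypothetical.

DIMENSIONS = {
    "level_of_phenomenon": ["individual", "group", "organization", "field", "society/world polity"],
    "data_source": ["interviews", "observations", "archival texts"],
    "time_orientation": ["present", "past", "future", "processual/longitudinal"],
    "analytic_approach": ["categorical", "comparative", "processual"],
    "unit_of_data": "any recurring happening (e.g., story, statement, meeting, event)",
}

def position_study(**choices):
    """Return the stated choices, flagging dimensions left undecided."""
    positioned = {}
    for dim in DIMENSIONS:
        positioned[dim] = choices.get(dim, "UNDECIDED -- reflect and choose")
    return positioned

# Hypothetical example: a field-level study based on archival media texts.
example = position_study(
    level_of_phenomenon="field",
    data_source="archival texts",
    analytic_approach="processual",
)
for dimension, choice in example.items():
    print(f"{dimension}: {choice}")
```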
Level of Phenomenon
The level at which a phenomenon is studied (usually referred to as the level of analysis
or level of phenomenon) is a familiar notion. It is also tricky, as it may convey stability
or independence of different levels, whereas they are changing and interdependent
(Dansereau et al., 1999; Scott, 2014). Still, for its usefulness, we will distinguish
among five different levels related to the scope of the studied phenomena (Scott,
2014). A study may focus on individual-level phenomena, like individual motivation,
preferences, or emotions. It may focus on group-level phenomena (including dyads),
like groupthink or self-managing teams. The focus may be on the organization as a
whole, looking at organizational culture or organizational identity. Some theories or
phenomena direct us to look at the inter-organizational space, called field, inter-
organizational network, industry, or ecosystem. In other cases, we take an even
broader approach and look at the societal level or world polity. Some phenomena
can be studied on different levels. The workings of institutions, for instance, can be
studied on all levels, from the individual level (such as emotions) or dyads (e.g., doctor–patient relationships), through organizations (e.g., adoption of rationalized myths) and fields (e.g., the emergence or change of a network of organizations that partake in a conversation that matters to them), to society or the world polity (e.g., societal or
global meanings that cross boundaries and are translated in local contexts).
Levels of the phenomenon are relative, and each level is interlinked with the others.
Moreover, one may bridge different levels in one study by conducting multilevel
research (e.g., Haack et al., 2020). For simplicity, though, we will relate here to
studies that focus on one seemingly distinct level of the studied phenomenon.
Data Source
Generally, there are three primary sources of field materials in qualitative research:
archival data, observations, and interviews. Although becoming data implies being
touched by the researcher, there are varying degrees of researcher involvement in pro-
ducing data: Naturally occurring data exists without the research, whereas non-
naturally occurring data has been contrived through the researcher and would not
exist without the research project (Potter & Wetherell, 1987; Silverman, 2019).
Archival data is the study of texts. “Text” is defined broadly, including written (books,
legal documents, meeting minutes), visual (images such as photos, diagrams, or cartoons,
and moving images such as TV programs or movies), and material (artifacts such as fur-
niture, buildings, objects, or prototypes). The researcher may collect the texts to create the
archive or use an existing one. Archival data may seem unproblematic and easy to gather,
yet it may involve much effort to access and ensure it is sufficiently complete (Stanley,
2017). In most instances, archival data are natural data, meaning that the texts were pro-
duced without any connection to the investigation, and the researcher was not involved in
creating them. But research may also elicit texts, for instance, by asking field actors for
written statements (such as open questions in surveys) or images (Meyer et al., 2013).
Observations may be another source of field material. In observations (Bernstein,
2017; Locke, 2011), what is observed is an activity that would have been going on
with no connection to the study. However, it would not have turned into data that
could be analyzed later without the researcher observing and documenting this activity.
There are different kinds of observations (overt or covert, Roulet et al., 2017; participa-
tory or non-participatory, Tedlock, 1991) involving various aspects and degrees of
immersion (Dumont, 2022). The degree to which the researcher’s presence has influenced
the observed activities is difficult to assess but nonetheless has to be considered when ana-
lyzing the data. Observations have to be documented in writing (field journal), audio or
video recording (or a mix thereof, see, e.g., Jarrett & Liu, 2018; Thompson & Byrne,
2022), and they eventually become a text to be analyzed. Finally, interviews conducted
for the research project are the most proactive source of field material. The interview
would not have existed had the researchers not created the interview situation, selected
interviewees, and posed questions. Interviews are social encounters in their own right, which means that the questions and utterances of the interviewers are part of the data
and the social situation of the interview needs to be taken into account when the data
is analyzed (Potter & Wetherell, 1987). There are, of course, different kinds of interviews
(e.g., open, semi-structured, and structured; life story interviews; narrative interviews)
grounded within different paradigmatic approaches (Alvesson, 2003; Froschauer &
Lueger, 2020; Langley & Meziani, 2020). It is common to record and transcribe (verba-
tim or non-verbatim) interviews and, hence, turn them into texts.
No one data source is principally superior to another—it depends on the research
question, the availability and quality of the data itself, and the method of analysis.
For all non-natural data, one needs to consider to what degree the researcher has
impacted the data and what the implications are. Therefore, some traditions of interpre-
tative research (e.g., hermeneutics) give, when available, preference to natural data.
The quality of qualitative research is not dependent on the amount of data collected
or analyzed. While there is no ‘natural’ end to data collection (Potter & Wetherell,
1987), the collection of data in qualitative research, in general, follows the principle
of saturation. This means that data collection ends when additional data only confirm what we already know but do not add any new insights (Glaser & Strauss, 1967). Regrettably, in our discipline, many conflate the quality of data with
its quantity. Falling into the fallacy of big numbers, even within qualitative research,
we see trends towards large data sets, with dozens of interviews, hundreds of hours
of observations, and archival texts by the thousands. It is nearly impossible to
handle such an extensive dataset with sufficient qualitative sensitivity. Thus, we often see authors who either use the data selectively and focus only on parts
thereof without transparent choices concerning the units of data that are relevant for
their research endeavor or—trying to capture its totality—float above the data, offering
a bird’s-eye overview or descriptive summary rather than in-depth analysis
(Howard-Grenville et al., 2021). Our discussion here reflects our preference for the
quality of data and the ability to approach it in-depth over the quantity of data.
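To illustrate the logic of saturation (not as a mechanical stopping rule, but simply to make the principle concrete), the following minimal Python sketch assumes a hypothetical workflow in which each interview has already been coded, and signals when consecutive interviews stop contributing new codes.

```python
# Illustrative sketch of the saturation principle: stop collecting when new
# material only confirms what is already known. Codes and interviews are hypothetical.

def reached_saturation(coded_interviews, patience=2):
    """Return the index after which `patience` consecutive interviews added no new codes."""
    seen = set()
    without_new = 0
    for i, codes in enumerate(coded_interviews, start=1):
        new_codes = set(codes) - seen
        seen.update(new_codes)
        without_new = 0 if new_codes else without_new + 1
        if without_new >= patience:
            return i  # saturation plausibly reached here
    return None  # keep collecting

interviews = [
    {"identity work", "role conflict"},
    {"role conflict", "peer support"},
    {"peer support"},
    {"identity work"},
]
print(reached_saturation(interviews))  # -> 4 in this toy example
```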
The distinction between these three types of data sources is not always as clear-cut
as it may seem. Interviews may be an opportunity for observation that may include the
interactions before and after the formal interview, or take note of the place and
mise-en-scene of the interview, and many observations involve informal interviews.
The distinction between archival and observational field material is blurry when it
comes to, for example, PowerPoint presentations during an observed meeting or field diaries. It has lately been complicated by data from social media. Should we treat Facebook posts as archival data or observations? It depends on the time frame and on whether one collects the data in one step or follows a Facebook group as it develops. Moreover, one may use more than one data source in each study and triangulate various data sources (Hammersley & Atkinson, 2019). Finally, the relevant
distinctions in the social sciences lie on the ontological and epistemological level,
and not on the level of methods and procedures (Hitzler & Eberle, 2004). Therefore,
interpretative research may also cross boundaries and use qualitative and quantitative
data collection techniques and mixed methods of analysis (Jick, 1979; Hannigan et al.,
2019; Meyer & Höllerer, 2010; Mohr & Neely, 2009; Molina-Azorin et al., 2017). For
simplicity, though, we will assume hereafter the use of one primary qualitative data
source.
Time Orientation
Qualitative research aims at understanding meaning. However, meaning is not stable or
fixed but changes over time and is in permanent flux (Schütz, 1962). The meaning
people give to occurrences they anticipate is different from their lived experience when
they actually occur, and again different from the meaning they give them when they
look back in time and recapitulate past events. Here, we reflect on the temporalities
inherent in the data, on the time orientation of the research project, and, finally, on
the fit between them.
All data, whether natural or artificial, is produced at a certain point in time, and the
meaning it contains is tied to this time of production. In addition, data reports on things
that, at the time of data production, lie in the present, the past, or in the future. Interview
data capture meanings at the time the interview was conducted. Still, the interviewee
may talk about events in the past or future or try to recollect what events meant for
them ‘then.’ Observations collect data in real-time, in vivo, even when they observe
the planning of future happenings. All archival materials bear the imprint of the
time they were produced, even if they give futuristic visions or are themselves historical analyses. Qualitative research accounts for this time orientation inherent in the data
by being mindful of the prospective and retrospective sensemaking (Brown et al.,
2015; Huber & Power, 1985; Sandberg & Tsoukas, 2015; Weick, 1995) of the partic-
ipants in the study. Finally, most data sources (exceptions are often images or objects)
have an inherent temporal flow (the sequence of activities in observations or the
sequence of what is said in interviews or written in verbal texts) that may be relevant
for the analysis. Sequential analysis, an analytic technique in hermeneutic analysis
(Lueger et al., 2005), for example, strictly follows this temporal flow when analyzing
the material.
The research project also has a specific time orientation. It can focus on the mean-
ings assigned to the phenomenon by different types of actors in the present, or focus on
the variety of views that existed in the past or are projected into the future. Process
studies or longitudinal analysis, by definition, cover a specific period of time and
are interested in dynamics and how meaning changes.
Ideally, the temporality of the data matches the time orientation of the project and is,
in this sense, time-authentic. Time-authentic data has the advantage of being less subject
to retro- and prospective sensemaking (Barley & Tolbert, 1997). For instance, research-
ers may want to reconstruct futuristic visions that existed in the 1960s by studying Star Trek episodes from that time or show the different frames that have been used in the antivaxx movement in 2021 by studying social media posts from that year. For research
that examines processes or more extended periods, using time-authentic data implies
collecting data that have been produced over the entire period covered.
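One practical implication for archival work is checking that every document in the corpus was actually produced within the period the study claims to cover. A minimal sketch (with a hypothetical corpus and hypothetical field names) might look as follows:

```python
# Illustrative sketch: keep only documents produced within the study period,
# so the corpus stays time-authentic. Dates and documents are hypothetical.
from datetime import date

corpus = [
    {"id": "doc-01", "produced": date(1966, 9, 8), "source": "TV episode"},
    {"id": "doc-02", "produced": date(1994, 5, 2), "source": "retrospective review"},
]

study_period = (date(1960, 1, 1), date(1969, 12, 31))

time_authentic = [
    doc for doc in corpus
    if study_period[0] <= doc["produced"] <= study_period[1]
]
excluded = [doc["id"] for doc in corpus if doc not in time_authentic]
print([d["id"] for d in time_authentic], "excluded:", excluded)
```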
Analytic Approach
Here we take a “Big ‘A’” approach to analysis.¹ Rather than discussing specific techniques of analysis (“small ‘a’” analysis), we relate to a broad approach to handling the
data, which goes hand in hand with specific research designs (how to study the phe-
nomenon, what data to collect, etc.). Most generally, there are three “big tent”
approaches to handling data within qualitative research in our discipline—categorical,
comparative, and process analysis.
A categorical analysis is focused on meanings, practices, or mechanisms and is based on a “parts and whole” understanding, according to which we can partition the whole into thematic parts. We can understand the whole picture through the move between first- and second-order thematic units (Lieblich, 1998). The Gioia method (Gioia et al., 2013) may be the best known in our discipline but is not the only one.
Some varieties of discourse analysis (Titscher et al., 2000), qualitative content analysis
(Mayring, 2000) or hermeneutic analysis (Lueger et al., 2005) that focus on meanings
(rather than their dynamics) are principally similar in their logic. This analytic approach
is usually based on the collection of interviews or archival material that covers a specific
organizational drama grounded within a particular time and place.
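By way of illustration, the “parts and whole” logic can be pictured as a nested structure in which first-order codes, close to the informants' language, are grouped into second-order themes; the codes and themes in the sketch below are hypothetical.

```python
# Illustrative sketch of a categorical (first-order / second-order) data structure.
# All codes and themes are hypothetical, not taken from any actual study.

second_order_themes = {
    "sensegiving by managers": [
        "explaining the change in town-hall meetings",
        "framing layoffs as renewal",
    ],
    "resistance on the shop floor": [
        "joking about the new slogan",
        "ignoring the revised procedures",
    ],
}

def whole_from_parts(themes):
    """Move between parts (first-order codes) and the whole (themes)."""
    for theme, codes in themes.items():
        print(f"{theme}:")
        for code in codes:
            print(f"  - {code}")

whole_from_parts(second_order_themes)
```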
Comparative analysis is based on the use of multiple cases, similar on some dimen-
sions and different from each other on aspects that can then be compared to understand
the impact of these differences on specific outcomes. Eisenhardt’s multiple case study approach represents this cross-sectional or variance approach (Eisenhardt, 1989, 2021;
Eisenhardt & Graebner, 2007). Yet, the same logic may apply when we collect data
within one organization (single case study) yet across multiple divisions, branches,
events, team meetings, or decision-making instances (e.g., McPherson & Sauder,
2013). Theoretical sampling (Glaser & Strauss, 1967) distinguishes between four strat-
egies for sampling comparative cases: maximizing or minimizing differences between
cases and/or between concepts. The choice has implications for the kind and scope of
the theory that can be generated. One way or the other, we need to ensure the comparability of the various case studies—analytically, in terms of the data collected, the time frame covered, and the depth of analysis it allows.
Process studies (Langley, 1999; Langley et al., 2013) focus on the temporal dynam-
ics of meanings and actions. Rather than taking a categorization or comparative
approach, they focus on dissecting a long process into its phases and understanding
the mechanisms that account for moving from one state of affairs to another, either
in a linear, parallel, recursive or conjunctive style (Cloutier & Langley, 2020).
Langley’s (1999) process method is a good representation of this approach. Recent
studies under the newish “historical studies” banner that delve more deeply into the
past are another example (Decker et al., 2021; Godfrey et al., 2016; Vaara &
Lamberg, 2016; Wadhwani et al., 2018).
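One common first analytic move in process studies is temporal bracketing: ordering incidents in time and cutting the flow into phases around turning points. The following minimal sketch, with hypothetical events and phase boundaries, illustrates the idea:

```python
# Illustrative sketch of temporal bracketing for a process analysis.
# Events and phase boundaries are hypothetical.
from datetime import date

events = [
    ("first public protest", date(2019, 3, 4)),
    ("task force appointed", date(2019, 9, 12)),
    ("new policy announced", date(2020, 6, 1)),
    ("policy quietly revised", date(2021, 2, 15)),
]

phase_boundaries = [("emergence", date(2019, 12, 31)), ("settlement", date(2021, 12, 31))]

def bracket(events, boundaries):
    """Assign each event to the first phase whose end date it precedes."""
    phases = {name: [] for name, _ in boundaries}
    for label, when in sorted(events, key=lambda e: e[1]):
        for name, end in boundaries:
            if when <= end:
                phases[name].append(label)
                break
    return phases

print(bracket(events, phase_boundaries))
```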
None of these approaches is better than the others, and the distinction between the three approaches is analytic, like all other distinctions in this paper. In practice, they
may be integrated. For example, one may use multiple cases, comparing processes
of change or mechanisms within and between them; one may study how differences
in meanings or mechanisms change across different periods (e.g., by using dynamic
topic modeling techniques). Again, we will simplify by assuming these approaches
are employed in distinct research projects.
Unit of Data
Beyond deciding about the level of the phenomenon (level of analysis), and once one
encounters the kind of field material, one may reflect on the unit of data (unit of
analysis). These terms are closely related yet not synonymous. The unit of data is the ele-
mentary chunk of the text, interview, or observation that serves as the basis for
the analysis, e.g., development of codes, themes, or comparisons. Depending on the
research question, the appropriate unit of data can comprise a small (such as a sentence, a paragraph, an argument, an interaction) or a large (such as the entire interview, meeting, website, or document) amount of data. The study may use indi-
vidual interviews as the data source yet focus on stories about the organization told
within interviews. The unit of data is not the interview but the story (Zilber, 2009).
The study may involve collecting various field materials on the organizational level
and then focusing on specific recurring events such as procedures (Barley, 1990a) as
units of data to be analyzed. Another study was based on organizational websites as the data source, but instead of using the text as a whole, the authors used “semantic triplets” of organization–verb–recipient text passages (“With this campaign, we react
to complaints by local residents”;“Today, we involve citizens early on into the plan-
ning process”) as the unit of data that allowed them to identify the multiple role iden-
tities enacted by the organization and, eventually, how institutional logics are
instantiated on the organizational level (Jancsary et al., 2017).
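To make the distinction between data source and unit of data concrete, the sketch below uses hypothetical transcripts and a deliberately crude marker heuristic to pull story-like passages out of interviews, so that the unit of data becomes the story rather than the whole interview.

```python
# Illustrative sketch: the interview is the data source, but the unit of data is
# the story told within it. Transcripts and the story heuristic are hypothetical.

interviews = {
    "informant_A": [
        "We mostly follow the handbook.",
        "When I started here, the founder told us a story about the first big order...",
    ],
    "informant_B": [
        "Back in 2003 there was this crisis, and everyone remembers how we pulled through.",
        "Budgets are approved twice a year.",
    ],
}

STORY_MARKERS = ("when i started", "back in", "told us a story", "remembers how")

def extract_story_units(interviews):
    """Return (informant, passage) pairs that look like stories, the chosen unit of data."""
    units = []
    for informant, passages in interviews.items():
        for passage in passages:
            if any(marker in passage.lower() for marker in STORY_MARKERS):
                units.append((informant, passage))
    return units

for informant, story in extract_story_units(interviews):
    print(informant, "->", story)
```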
One cannot analyze the data without deciding on the unit of data on which to focus (though sometimes this decision is made in practice, intuitively and not necessarily
reflected upon). But in reality, the space of possibilities for deciding the unit of data
is determined during data collection. As we encounter the data, we may identify poten-
tial units of data that seem interesting and productive to follow and then collect them
systematically while in the field. We may later change focus or find that this data unit is
more or less relevant or fruitful given where the study has moved. All the more reason to identify possible data units early on and ensure their systematic collection. As
Spradley (1980) recommends, one needs to have a clear understanding of the
various elements of the social world one studies (Spradley distinguished between
nine such elements: space, object, act, activity, event, time, actor, goal, and feeling)
yet still strive to collect materials holistically. In the same vein, one had better gather as much and as varied data as possible (Barley, 1990b) while also giving attention to
viable units of data that may later be used as the basis of rigorous analysis.
Using the Matrix to Contemplate Fit
Fit is essential for qualitative research (e.g., Bansal et al., 2018; Gehman et al., 2018;
Howard-Grenville et al., 2021). There is no perfect method, as each is flawed and
partial in its unique way. Instead, we need to ensure a process of informed decision-
making, through which we assess the various options, their pros and cons, and their
implications for our overall project. We need to ensure the fit between philosophy of
science (ontology, epistemology, methodology), theory, and method, and more con-
cretely—between our research question, the level of the phenomenon, data to be col-
lected, its time orientation, the overall analytic approach, and unit of data. In this
process, we may build on traditions in qualitative organizational research. Still, we
cannot do that automatically, as we need to check the fit and adapt to the specific
circumstances of our concrete study. What our matrix adds, we hope, is a concrete tool
that maps the central aspects that need to be fitted together within a research project.
The matrix, presented in Figure 1, allows researchers to position their research and
check that their choices fit together coherently and justifiably. Of course, qualitative
research is a non-linear process, and the matrix can be used at any stage of the research
process. Further, the matrix can be “read” from different angles, starting from each
dimension it covers and creating any sequence of fitness-checks. In the very early
stage of designing a new research project, the first step would be to identify the
level of the phenomenon and analytic approach and then move to other dimensions.
In later stages of the research project, one may start with the givens and preferences
and adapt the not-yet-decided aspects to ensure fit. For example, if one is to use inter-
view texts already collected, or if the researcher has a clear preference for conducting
interviews, one can start from these as given and make sure all other dimensions are
adapted accordingly. Below, we exemplify one possible use that seemed most productive in presenting the tool.

Figure 1. Mapping the terrain of choices in qualitative research.
The inner part of the matrix juxtaposes levels of the phenomenon under study with
the potential sources of data collection. The innermost space also relates to the analytic
approach, and the outer layers relate to time orientation and unit of data. Thus, a first
step may be to identify the position of one’s project and research question within the
grid in terms of the level of the phenomenon and the accessibility of sources of field
material. Note that there are no one-to-one relations between these two dimensions.
For example, interviewing individuals does not necessarily mean that the study
focuses on individual-level phenomena. Suppose we ask people about their life
stories, experiences, and emotions to identify trajectories in constructing personal iden-
tity (e.g., Lieblich, 1998). In that case, the individual-level interviews match the level of the phenomenon. Yet, if we use interviews to ask organizational members about the
organization’s strategy work and its effects, we may use individual interviews for an
organizational-level inquiry (Kornberger et al., 2021).
On the other hand, if one focuses on the societal (e.g., nation state) or world polity
level, one may wonder whether interviews are a good data source for a qualitative
inquiry. The fit between the level of the phenomenon and data source is not trivial
—both in terms of quantity (how many interviews will allow capturing society-level
understandings and sentiments?) and the possibility of representation (though
usually not necessary for analytic, rather than statistical, generalization, see Small,
2009; on a debate over the use of interviews for the study of macro-level phenomena
like culture, see Lamont & Swidler, 2014). Observations of official state ceremonies or
archival data (official and counter official news media, for example) may be more
easily justified sources of data. In historical studies, the research depends on the
very ability to find or create the archive—as the researcher depends on existing data.
Thus, adaptations usually involve fitting the research question, level of the phenome-
non, and analytic approach to the available archive (rather than the other way around).
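As a rough illustration of how such fit considerations might be turned into explicit reflection prompts (the pairings and wording below are hypothetical examples, not rules), one could keep a simple lookup of combinations that call for extra justification:

```python
# Illustrative sketch: flag level-of-phenomenon / data-source combinations that
# call for explicit justification. Pairings and prompts are hypothetical examples, not rules.

REFLECTION_PROMPTS = {
    ("society/world polity", "interviews"):
        "How many interviews could capture society-level understandings? Consider archival or observational data.",
    ("individual", "archival texts"):
        "Do the texts give access to individual-level meanings, or only to public accounts?",
}

def fit_prompts(level, data_source):
    """Return a reflection prompt if the combination warrants one, else a reminder to justify anyway."""
    return REFLECTION_PROMPTS.get(
        (level, data_source),
        "No standard caveat recorded; still spell out why this combination answers the research question.",
    )

print(fit_prompts("society/world polity", "interviews"))
```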
At this stage, we also recommend thinking about one’s analytic approach and
whether it fits the level of the phenomenon and available data sources. For example,
suppose the researcher strives to compare two organizations, teams, or fields. In that
case, she needs to make sure she chooses the two units wisely, for their similarities
and differences, and can collect the same kinds of data, covering the relevant period
of time, in both (see, for example, Kellogg, 2009, 2014; see also Kellogg, 2011, for an innovative design in which she compares in vivo data collected in the 2000s with two previous studies from the 1970s and 1990s).
It is also worth thinking about the fit of the time orientation of the data with the analytic
approach and the overall goal (level of phenomenon and research questions). For
example, longitudinal data is required if one is interested in an organizational or
field-level phenomenon and wants to follow a process unfolding. Interviews are
always in vivo data, and if they relate to the past, they involve, as we have outlined
above, problems and limitations connected to retrospective bias. However, it is not
always possible to conduct interviews with the same people at different times. Most
importantly, the time span that can be covered is limited. To cover more extended
periods, archival data (e.g., on the field and societal levels, see Zilber, 2006) may be
better, as it is time-authentic and grants access to interpretations at the time. A mix
of in vivo observations, retrospective interviews, and time-authentic archival data
(e.g., Zilber, 2002) may also be used, as these data sources complement each other
—especially if one is interested in how people construct the past (rather than how
they constructed the present then).
Once in the field and encountering the data, one may start thinking about the units of
data to be later used for the analysis. Though the focus of the study may still change,
thinking about the unit of data early on may help researchers make sure they have col-
lected enough relevant data. For example, in their study of the Occupy movement in
London, a social movement (inter-organizational level of phenomenon), Reinecke
and Ansari (2021) focused on specific instances of interactions as their unit of data,
which allowed them to delve deep into each instance and compare patterns across
them. Zilber (2007), who was also interested in the field level, studied field-level
events and later focused on stories in the data. While she only studied one field, and
a limited number of events, she identified many stories and delved deeply into them
and their impact on the dynamics of the events (see also Gross & Zilber, 2020).
Meyer and Höllerer (2010) were interested in the dynamics of field-level issue
frames. They used a decade of media reports as a time-authentic data source and
focused on speaker statements as the unit of data to identify types of actors, their
accounts, storylines and eventually frames, and their dynamics over the period
observed.
Concluding Remarks
The tool we offer is grounded in classifications of various dimensions of qualitative
research and fine distinctions in each of these dimensions. Such classifications are nec-
essarily simplistic and reductionist and do not reflect the genres and subgenres of qual-
itative research (Bansal et al., 2018) or innovative methods in various subdisciplines of
qualitative research, from ethnography (Seligmann & Estes, 2020) to computational
methods of analyses that aim to measure culture and meaning (Mohr et al., 2020).
Also, they may seem at odds with our understanding of qualitative research as non-
linear and flexible. Positivistic studies adhere to linear development, which ensures
the ethical basis of the statistical techniques used. Qualitative studies are better con-
ceived as spiral endeavors. Researchers may go back and forth between the literature,
data, and emerging insights and even adapt the research question or their focus as they allow the data to direct them, in line with the tradition of discovery (Locke, 2011).
Still, we hold that laying open one’s core assumptions, thinking ahead, and planning are essential. The same goes for periodic check-ups of the ongoing fit between
moving parts. Even if the plans may change, it is necessary to depict the possibilities
and choices and reflect on how they open up yet limit future options. We hope that the
crude analytic distinctions and mapping we are offering will help researchers think about method-
ological nuances and allow for methodological creativity, all in the service of better
interpretation (or construction) of the world around us.
Acknowledgments
Tammar Zilber thanks the Recanati Center at the Hebrew University for its continuous support.
Declaration of Conflicting Interests
The author(s) declared no potential conflicts of interest with respect to the research, authorship,
and/or publication of this article.
Funding
The author(s) received no financial support for the research, authorship, and/or publication of this
article.
ORCID iD
Tammar B. Zilber https://orcid.org/0000-0002-6409-1269
Note
1. Following Gee’s (2015) distinction between “Big ‘D’” and “small ‘d’” discourse, or
Bamberg’s (2006; see also Bamberg & Georgakopoulou, 2008) distinction between Big
and small stories.
References
Alvesson, M. (2003). Beyond neopositivists, romantics, and localists: A reflexive approach to
interviews in organizational research. Academy of Management Review,28(1), 13–33.
https://doi.org/10.5465/amr.2003.8925191
Amis, J. M., & Silk, M. L. (2008). The philosophy and politics of quality in qualitative organi-
zational research. Organizational Research Methods,11(3), 456–480. https://doi.org/10.
1177/1094428107300341
Bamberg, M. (2006). Stories: Big or small—why do we care? Narrative Inquiry,16(1), 139–
147. https://doi.org/10.1075/ni.16.1.18bam
Bamberg, M., & Georgakopoulou, A. (2008). Small stories as a new perspective in narrative and
identity analysis. TEXT & TALK,28(3), 377–396. https://doi.org/10.1515/TEXT.2008.018
Bansal, P., Smith, W. K., & Vaara, E. (2018). New ways of seeing through qualitative research.
Academy of Management Journal,61(4), 1189–1195. https://doi.org/10.5465/amj.2018.
4004
Barley, S. R. (1990a). The alignment of technology and structure through roles and networks.
Administrative Science Quarterly,35(1), 61–103. https://doi.org/10.2307/2393551
Barley, S. R. (1990b). Images of imaging: Notes on doing longitudinal field work. Organization
Science,1(3), 220–247. https://doi.org/10.1287/orsc.1.3.220
Barley, S. R., & Tolbert, P. S. (1997). Institutionalization and structuration: Studying the links
between action and institution. Organization Studies,18(1), 93–117. https://doi.org/10.
1177/017084069701800106
Bernstein, E. S. (2017). Making transparency transparent: The evolution of observation in man-
agement theory. Academy of Management Annals,11(1), 217–266. https://doi.org/10.5465/
annals.2014.0076
Brown, A. D., Colville, I., & Pye, A. (2015). Making sense of sensemaking in organization
studies. Organization Studies,36(2), 265–277. https://doi.org/10.1177/0170840614559259
Cloutier, C., & Langley, A. (2020). What makes a process theoretical contribution? Organization
Theory,1,1–32. https://doi.org/10.1177/2631787720902473.
Dansereau, F., Yammarino, F. J., & Kohles, J. C. (1999). Multiple levels of analysis from a lon-
gitudinal perspective: Some implications for theory building. Academy of Management
Review,24(2), 346–357. https://doi.org/10.2307/259086
Decker, S., Hassard, J., & Rowlinson, M. (2021). Rethinking history and memory in organiza-
tion studies: The case for historiographical reflexivity. Human Relations,74(8), 1123–1155.
https://doi.org/10.1177/0018726720927443
Denzin, N. K., & Lincoln, Y. S. (1994). Handbook of qualitative research (1st ed.). Sage.
Dumont, G. (2022). Immersion in organizational ethnography: Four methodological require-
ments to immerse oneself in the field. Organizational Research Methods, https://doi.org/
10.1177/10944281221075365.
Eisenhardt, K. M. (1989). Building theories from case-study research. Academy of Management
Review,14(4), 532–550. https://doi.org/10.2307/258557
Eisenhardt, K. M. (2021). What is the Eisenhardt method, really? Strategic Organization,19(1),
147–160. https://doi.org/10.1177/1476127020982866
Eisenhardt, K. M., & Graebner, M. E. (2007). Theory building from cases: Opportunities and
challenges. Academy of Management Journal,50(1), 25–32. https://doi.org/10.5465/amj.
2007.24160888
Froschauer, U., & Lueger, M. (2020). Das qualitative Interview: Zur Praxis interpretativer Analyse sozialer Systeme. Facultas.
Gee, J. P. (2015). Discourse, small d, big D. In K. Tracy, C. Ilie, & T. Sandel (Eds.), The inter-
national encyclopedia of language and social interaction. John Wiley & Sons.
Gehman, J., Glaser, V. L., Eisenhardt, K. M., Gioia, D., Langley, A., & Corley, K. G. (2018).
Finding theory-method fit: A comparison of three qualitative approaches to theory build-
ing. Journal of Management Inquiry,27(3), 284–300. https://doi.org/10.1177/
1056492617706029
Gioia, D. A., Corley, K. G., & Hamilton, A. L. (2013). Seeking qualitative rigor in inductive
research: Notes on the Gioia methodology. Organizational Research Methods,16(1),
15–31. https://doi.org/10.1177/1094428112452151
Glaser, B., & Strauss, A. (1967). The discovery of grounded theory: Strategies for qualitative
research. Aldine.
Godfrey, P. C., Hassard, J., O’Connor, E. S., Rowlinson, M., & Ruef, M. (2016). What is orga-
nizational history? Toward a creative synthesis of history and organization studies. Academy
of Management Review,41(4), 590–608. https://doi.org/10.5465/amr.2016.0040
Grodal, S., Anteby, M., & Holm, A. L. (2021). Achieving rigor in qualitative analysis: The role
of active categorization in theory building. Academy of Management Review,46(3), 591–
612. https://doi.org/10.5465/amr.2018.0482
Gross, T., & Zilber, T. B. (2020). Power dynamics in field-level events: A narrative approach.
Organization Studies,41(10), 1369–1390. https://doi.org/10.1177/0170840620907197
Haack, P., Sieweke, J., & Wessel, L. (2020). Microfoundations and multilevel research on insti-
tutions. In Research in the sociology of organizations: Microfoundations of institutions
(Vol. 65A, pp. 12–40). Emerald Publishing.
Hammersley, M., & Atkinson, P. (2019). Ethnography: Principles in practice (4th ed.).
Routledge.
Hannigan, T. R., Haans, R. F. J., Vakili, K., Tchalian, H., Glaser, V. L., Wang, M. S., Kaplan, S., & Jennings, P. D. (2019). Topic modeling in management research: Rendering new
theory from textual data. Academy of Management Annals,13(2), 586–632. https://doi.
org/10.5465/annals.2017.0099
Harley, B., & Cornelissen, J. (2022). Rigor with or without templates? The pursuit of method-
ological rigor in qualitative research. Organizational Research Methods,25(2), 239–271.
https://doi.org/10.1177/1094428120937786
Hitzler, R., & Eberle, T. S. (2004). Phenomenological life-world analysis. In U. Flick, E. von
Kardoff, & I. Steinke (Eds.), A companion to qualitative research (pp. 67–71). Sage.
Howard-Grenville, J., Nelson, A., Vough, H., & Zilber, T. B. (2021). Achieving fit and avoiding
misfit in qualitative research. Academy of Management Journal,64(5), 1313–1323. https://
doi.org/10.5465/amj.2021.4005
Huber, G. P., & Power, D. J. (1985). Retrospective reports of strategic-level managers:
Guidelines for increasing their accuracy. Strategic Management Journal,6(2), 171–180.
https://doi.org/10.1002/smj.4250060206
Jancsary, D., Meyer, R. E., Höllerer, M. A., & Barberio, V. (2017). Towards a structural model
of organizational-level institutional pluralism and logic interconnectedness. Organization
Science,28(6), 1150–1167. https://doi.org/10.1287/orsc.2017.1160
Jarrett, M., & Liu, F. (2018). “Zooming with”: A participatory approach to the use of video eth-
nography in organizational studies. Organizational Research Methods,21(2), 366–385.
https://doi.org/10.1177/1094428116656238
Jick, T. D. (1979). Mixing qualitative and quantitative methods: Triangulation in action.
Administrative Science Quarterly,24(4), 602–611. https://doi.org/10.2307/2392366
Kellogg, K. C. (2009). Operating room: Relational spaces and microinstitutional change in
surgery. American Journal of Sociology,115(3), 657–711. https://doi.org/10.1086/603535
Kellogg, K. C. (2011). Hot lights and cold steel: Cultural and political toolkits for practice
change in surgery. Organization Science,22(2), 482–502. https://doi.org/10.1287/orsc.
1100.0539
Kellogg, K. C. (2014). Brokerage professions and implementing reform in an age of experts.
American Sociological Review,79(5), 912–941. https://doi.org/10.1177/0003122414
544734
Kohler, T., Smith, A., & Bhakoo, V. (2022). Templates in qualitative research methods: Origins,
limitations, and new directions. Organizational Research Methods,25(2), 183–210. https://
doi.org/10.1177/10944281211060710
Kornberger, M., Meyer, R. E., & Höllerer, M. A. (2021). Exploring the long-term effect of strat-
egy work: The case of sustainable Sydney 2030. Urban Studies,58(16), 3316–3334. https://
doi.org/10.1177/0042098020979546
Lamont, M., & Swidler, A. (2014). Methodological pluralism and the possibilities and limits of
interviewing. Qualitative Sociology,37, 153–171. https://doi.org/10.1007/s11133-014-
9274-z
Langley, A. (1999). Strategies for theorizing from process data. Academy of Management
Review,24(4), 691–710. https://doi.org/10.5465/amr.1999.2553248
Langley, A., & Meziani, N. (2020). Making interviews meaningful. Journal of Applied
Behavioral Science,56(3), 370–391. https://doi.org/10.1177/0021886320937818
Langley, A., Smallman, C., Tsoukas, H., & Van de Ven, A. H. (2013). Process studies of change
in organization and management: Unveiling temporality, activity, and flow. Academy of
Management Journal,56(1), 1–13. https://doi.org/10.5465/amj.2013.4001
Lieblich, A. (1998). Categorical-content perspective. In A. Lieblich, R. Tuval-Mashiach, & T. Zilber
(Eds.), Narrative research: Reading, analysis and interpretation (pp. 112–126). Sage.
Locke, K. (2011). Field research practice in management and organization studies: Reclaiming
its tradition of discovery. Academy of Management Annals,5(1), 613–652. https://doi.org/
10.5465/19416520.2011.593319
Lueger, M., Sandner, K., Meyer, R., & Hammerschmid, G. (2005). Contextualizing influence
activities. An objective hermeneutical approach. Organization Studies,26(8), 1145–1168.
https://doi.org/10.1177/0170840605055265
Mayring, P. (2000). Qualitative Inhaltsanalyse: Grundlagen und Techniken (7th ed.). Deutscher
Studien Verlag.
McPherson, C. M., & Sauder, M. (2013). Logics in action: Managing institutional complexity in
a drug court. Administrative Science Quarterly,58(2), 165–196. https://doi.org/10.1177/
0001839213486447
Meyer, R. E., & Höllerer, M. A. (2010). Meaning structures in a contested issue field: A topo-
graphic map of shareholder value in Austria. Academy of Management Journal,53(6),
1241–1262. https://doi.org/10.5465/amj.2010.57317829
Meyer, R. E., Höllerer, M. A., Jancsary, D., & van Leeuwen, T. (2013). The visual dimension in
organizing, organization, and organization research. Academy of Management Annals,7(1),
487–553. https://doi.org/10.5465/19416520.2013.781867
Mohr, J. W., Bail, C. A., Frye, M., Lena, J. C., Lizardo, O., McDonnell, T. E., Mische, A.,
Tavory, I., & Wherry, F. F. (2020). Measuring culture. Columbia University Press.
Mohr, J. W., & Neely, B. (2009). Modeling Foucault: Dualities of power in institutional fields.
In Research in the sociology of organizations: Institutions and ideology (Vol. 27,
pp. 203–255). Emerald Publishing.
Molina-Azorin, J. F., Bergh, D. D., Corley, K. G., & Ketchen, D. J. (2017). Mixed methods in
the organizational sciences: Taking stock and moving forward. Organizational Research
Methods,20(2), 179–192. https://doi.org/10.1177/1094428116687026
Potter, J., & Wetherell, M. (1987). Discourse and social psychology. Beyond attitudes and
behaviour. Sage.
Pratt, M. G., Sonenshein, S., & Feldman, M. S. (2022). Moving beyond templates: A bricolage
approach to conducting trustworthy qualitative research. Organizational Research
Methods,25(2), 211–238. https://doi.org/10.1177/1094428120927466
Reichertz, J. (2004). Abduction, deduction, induction in qualitative research. In U. Flick, E. von
Kardoff, & I. Steinke (Eds.), A companion to qualitative research (pp. 159–164). Sage.
Reinecke, J., & Ansari, S. (2021). Microfoundations of framing: The interactional production of
collective action frames in the Occupy movement. Academy of Management Journal,64(2),
378–408. https://doi.org/10.5465/amj.2018.1063
Roulet, T. J., Gill, M., Stenger, S., & Gill, D. (2017). Reconsidering the value of covert research:
The role of ambiguous consent in participant observation. Organizational Research
Methods, 20(3), 487–517. https://doi.org/10.1177/1094428117698745
Sandberg, J., & Tsoukas, H. (2015). Making sense of the sensemaking perspective: Its constit-
uents, limitations, and opportunities for further development. Journal of Organizational
Behavior,36(S1), 6–32. https://doi.org/10.1002/job.1937
Schütz, A. (1962). Collected papers I—the problem of social reality. Nijhoff.
Schwandt, T. A. (2000). Three epistemological stances for qualitative inquiry. In N. K. Denzin,
& Y. Lincoln (Eds.), Handbook of qualitative research (pp. 189–214). Sage.
Schwandt, T. A. (2015). The sage dictionary of qualitative inquiry (4th ed.). Sage.
Scott, W. R. (2014). Institutions and organizations: Ideas, interests, and identities. Sage.
Seligmann, L. J., & Estes, B. P. (2020). Innovations in ethnographic methods. American
Behavioral Scientist,64(2), 176–197. https://doi.org/10.1177/0002764219859640
Silverman, D. (2019). Interpreting qualitative data (6th ed.). Sage.
Small, M. L. (2009). ‘How many cases do I need?’ On science and the logic of case selection in
field-based research. Ethnography,10(1), 5–38. https://doi.org/10.1177/1466138108099586
Spradley, J. P. (1980). Participant observation. Holt, Rinehart and Winston.
Stanley, L. (2017). Archival methodology inside the black box: Noise in the archive. In
N. Moore, A. Salter, L. Stanley, & M. Tamboukou (Eds.), The archive project: Archival
research in the social sciences (pp. 33–67). Routledge.
Tedlock, B. (1991). From participant observation to the observation of participation: The emer-
gence of narrative ethnography. Journal of Anthropological Research,47(1), 69–94. https://
doi.org/10.1086/jar.47.1.3630581
Thompson, N. A., & Byrne, O. (2022). Imagining futures: Theorizing the practical knowledge of
future-making. Organization Studies,43(2), 247–268. https://doi.org/10.1177/
01708406211053222
Titscher, S., Meyer, M., Wodak, R., & Vetter, E. (2000). Methods of text and discourse analysis.
Sage.
Tsoukas, H. (2009). Craving for generality and small-N studies: A Wittgensteinian approach
towards the epistemology of the particular in organization and management studies. In
D. A. Buchanan, & A. Bryman (Eds.), The SAGE handbook of organizational research
methods (pp. 285–301). Sage.
Vaara, E., & Lamberg, J. A. (2016). Taking historical embeddedness seriously: Three
approaches to advance strategy process and practice research. Academy of Management
Review,41(4) 633–657. https://doi.org/10.5465/amr.2014.0172
Wadhwani, R. D., Suddaby, R., Mordhorst, M., & Popp, A. (2018). History as organizing: Uses
of the past in organization studies. Organization Studies,39(12), 1663–1683. https://doi.org/
10.1177/0170840618814867
Weick, K. E. (1995). Sensemaking in organizations. Sage.
Zilber, T. B. (2002). Institutionalization as an interplay between actions, meanings, and actors:
The case of a rape crisis center in Israel. Academy of Management Journal,45(1), 234–254.
Zilber, T. B. (2006). The work of the symbolic in institutional processes: Translations of rational
myths in Israeli high tech. Academy of Management Journal,49(2), 281–303. https://doi.
org/10.5465/amj.2006.20786073
Zilber, T. B. (2007). Stories and the discursive dynamics of institutional entrepreneurship: The
case of Israeli high-tech after the bubble. Organization Studies,28(7), 1035–1054. https://
doi.org/10.1177/0170840607078113
Zilber, T. B. (2009). Institutional maintenance as narrative acts. In T. Lawrence, R. Suddaby, &
B. Leca (Eds.), Institutional work (pp. 205–235). Cambridge University Press.