The Qualitative Report 2012 Volume 17, Article 43, 1-11
Using Computer-assisted Qualitative Data Analysis Software
(CAQDAS) to Re-examine Traditionally Analyzed Data:
Expanding our Understanding of the Data and of Ourselves as Scholars
Linnea L. Rademaker, Elizabeth J. Grace, and Stephen K. Curda
National Louis University, Chicago, Illinois, USA
As diverse members of a college of education evaluation committee, one of
our charges is to support faculty as we document and improve our
teaching. Our committee asked faculty to respond to three qualitative
questions, documenting ways in which interdepartmental and cross-
department conversations are used to promote reflective thinking about
our practice. Three of us investigated the use of CAQDAS to provide an
additional level of analysis and how we learned more about ourselves as
scholars through this collaboration. Our findings include
recommendations regarding the use of CAQDAS to support collaborative
efforts by diverse scholars. Key Words: Multi-disciplinary Research,
Multiple Perspectives, CAQDAS, Qualitative Research.
In this paper we discuss our work on a college-wide evaluation committee, in
which we investigated faculty’s use of conversation and reflection to demonstrate
professionalism in teaching. The three of us, as members of the committee, are all members of different college departments, and had very different perspectives towards
research and the use of computers. Linnea is a professor of research methods in a college
of education, and primarily teaches qualitative research and action research. Linnea had
the most experience with “Computer-Assisted Qualitative Data Analysis Software” or
CAQDAS, and wanted to bring together Elizabeth and Steve to collaborate on this
project. Elizabeth is a professor in the department of special education and primarily
teaches math methods, assessment, and disability studies. Elizabeth has a diagnosis of
autism and had minimal experience with CAQDAS before this study. Steve is the
director of assessment for the college and a professor in the technology department. Steve
has extensive experience with quantitative research and with computers, using computers
in his quantitative research, but had little experience with qualitative research or the use
of computers in qualitative research before this study. Significantly, we wanted to make
visible the ability of CAQDAS to assist diverse scholars in collaborating in research and
data analysis. We initially conducted the study to highlight the importance of authentic
assessment of faculty teaching in an era of accountability that often tends to examine only
quantitative assessments, specifically student course evaluation data. But our further
work with this data enlightened us to the collaborative benefits of CAQDAS. After
completing the data collection (described below), completing a traditional group hand-
analysis of the data, and presenting our findings to our college, Linnea wanted to expand
the project by inputting the data into CAQDAS. She asked Elizabeth and Steve to join
her, because of their unique backgrounds. She knew that Steve had extensive computer
experience, and that Elizabeth had extensive qualitative research experience. But, Linnea
wondered how the three of us would be able to work together on a project, and how the
use of computers might change our understanding of the data. In essence, we wanted to
expand upon the work of the committee’s group qualitative analysis; but, more
importantly, we wanted to see if three diverse researchers could work together using
computer software to assist our collaboration and to see if the three of us would find
additional insights into one another’s ways of thinking about qualitative data analysis.
Research Question and Rationale
We began our inquiry with the following guiding question:
When analyzing qualitative data, how will the use of CAQDAS change
our analysis and interpretations, compared to our first round of coding and analysis within our larger committee?
However, this question changed over the time of our work together. We naïvely expected that using a computer software program to help us analyze might help us uncover hidden themes and issues, but Elizabeth was skeptical, doubting that CAQDAS had any value in aiding analysis. Of course, data analysis in qualitative research can be daunting, due to the sheer volume of language, video, audio, and pictorial data that is available. When we investigated the literature on computer software in qualitative data analysis, we changed our question (the revised question follows the rationale below), and we provide the following rationale as a snapshot of that literature search.
The use of computers to aid in the analysis challenge is fairly new, when
compared with the history of qualitative research and traditional analysis through hand-
coding, sorting, and memoing. Computer data analysis software for qualitative research
was available widely to the public in the late 1990s (compared with the traditional
analysis of qualitative data, which began in the anthropological literature and dates back
to the work of Boas, 1858-1942; Malinowski, 1884-1942; Mead, 1901-1978 and others.
It was adopted by other disciplines seeking to document ethnographies of culture in
education in the 1960s and 1970s with the work of Wolcott, and Geertz, followed by
Lincoln and Guba, Miles and Huberman, and Corbin and Strauss and others in the
1980s). Interestingly, although NUD*IST was “born” in 1981 (Richards, 2002), version 2 in 1987 was still only available on mainframes, and not commercially. It wasn’t until 1993 that it was available, on Macs only, then later on PCs. ATLAS.ti released its first commercial version in 1993, and the first English version of MAXQDA appeared in 1995. Literature on
CAQDAS began to appear in the late 1990s, with the majority of literature appearing
after 2000 (In a search of EBSCO host databases, we found two references for 1996, one
for 1999, and the rest were post-2000). Most of this literature addressed the capabilities
of the software and caveats for novice users. Yet we wondered what such programs could offer us, as three diverse researchers, to support us as we sought to work together to present an analysis and interpretation of our study. How might the results be
expanded? How would the software help us collaborate? What could we learn that will
facilitate our cohesive interpretations?
Theoretical Framework
In conducting a literature search on the usage and issues of CAQDAS, we
discovered that with the improvement of software over the last 15 years, authors have
written increasingly more about the topic in qualitative research journals. Some offered
caveats or implications for continued use of such software (e.g., Bazeley, 2006; Johnston,
2006; Lee & Esterhuizen, 2000; Smith & Short, 2001). Lage and Godoy (2008) presented a review of literature about the benefits and issues of using CAQDAS. Specifically, the authors discuss the literature via a categorization of four themes: (a) “the effectiveness of software data management”; (b) “the increase of researcher’s closeness with the data”; (c) the ease of researchers’ communication about the data; and (d) the possibility of bias in the analytical process (p. 77, abstract). We were cognizant of these issues as the three of
us undertook our analysis with computer software following the committee’s previous
analysis of the data. After reviewing the literature, we confirmed to each other our
suspicions that we wouldn’t find changes in our analysis (e.g., new themes, differing
results), as Linnea knew and Elizabeth and Steve soon understood that computer-assisted
data analysis is largely dependent upon our work in entering and coding the data. We
wondered, however, after continued searches into the literature, how the software might improve the trustworthiness and confirmability (Lincoln & Guba, 1985) of our findings, thereby allowing us to perhaps go deeper into the data, and expand our findings.
Ryan (2009) wrote about “trustworthiness” in her own study, concluding that while the
software did not increase rigor for her work, nor change the analysis, it did allow her to
improve her organizational abilities, and she felt the trustworthiness of her work
increased, due to the linking capabilities of the software in being able to represent her
analyses and connect various sources of data to interpretations and themes. Ryan
concluded: “Software programs . . . enable researchers to make visible their
methodological processes for a more 'trustworthy' study” (p. 158). Ryan’s work
continued to influence our thinking throughout our analysis, as we felt our diverse
perspectives made such “trustworthiness” more complex via our different lenses and our
different approaches to qualitative data understanding and analysis. We expand on this
issue in our discussion section.
Davis and Meyer (2009) examined the procedures of analysis, comparing hand
coding with computer-assisted coding, detailing advantages, disadvantages and common
assumptions held in the use of computer-assisted qualitative data analysis in sports
psychology research. Particular caveats included the unpredictable malfunctions of
computers and the misperception that the program will analyze for the researcher (as
quantitative software might). They suggested that it is essential that the researcher learn
the software before using it for a project, and that you cannot separate the researcher from the program; that is, the researcher decides which features and codes to use and when; the program will simplify organization, recall of data, and visual representation. As
mentioned earlier, Linnea had extensive previous experience with the software, and Steve
was computer-proficient in many quantitative analysis programs, so we felt that we had
sufficient experience to undertake this project.
Gilbert (2002) further investigated the issue of coding and balancing “closeness” and “distance” from the data. We concur that the use of the software left us feeling close, or (for some) overly familiar, with the data, sometimes making analysis difficult. Her emphasis on the “tactile-digital divide” seemed a bit dated for our usage, however. None of us had difficulty working with data in a digital manner from a computer-usage standpoint.
Method of Data Collection/Sampling
Since our college-level (college of education) committee was charged with
assisting faculty in the documentation and reporting of the quality of faculty teaching
(college-level constitutional charge), we chose to survey our faculty (using internet-based
survey software) about their perceptions of their
collaborations in reflecting about teaching. We believed that providing faculty with an
alternative to student-course evaluations, via the documentation of their professional
activities, would provide faculty with additional materials with which to make their case
for tenure and promotion. Our tenure and promotion process requires strong
documentation and persuasive materials to show how we’ve improved our teaching in the
years leading up to tenure application. We sent out a link to an internet-based survey
which contained three questions:
1. What is something you have changed in your teaching recently and why have you changed it?
2. Do you have conversations within your program about effective teaching? Please
provide sample topics.
3. Do you have conversations with others outside of your program about effective
teaching? Please provide sample topics.
We received 39 responses (out of approximately 148 full-time education faculty).
Data were copied and transcribed from the internet survey software, and initially
organized by each of the three questions.
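As a rough sketch of that organization step (the export layout and column names here are hypothetical; the paper does not describe the survey tool's output format):

```python
import csv
from collections import defaultdict
from io import StringIO

# Hypothetical export from the internet survey tool: one row per respondent,
# one column per question.
raw = StringIO(
    "q1,q2,q3\n"
    "Added more discussion,Yes - curriculum talks,Yes - with PK-12 teachers\n"
    "Use of new technology,Yes - assessment issues,No\n"
)

# Re-organize responses by question, as the committee did with the 39 surveys.
by_question = defaultdict(list)
for row in csv.DictReader(raw):
    for question, answer in row.items():
        if answer:  # skip blank answers
            by_question[question].append(answer)
```

Each list in `by_question` then holds all answers to one question, ready for hand- or computer-assisted coding.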
Methods of Data Analysis
Initially, each of the six members of the original committee (representing six
different departments in the college of education) was asked to reflect on the responses.
At our May, 2010 meeting, we chose to break into three groups, each group further
examining one of the three questions. We came back together as a group to share our
themes from our group coding, and reviewed all three questions as a group. We agreed
upon themes from each of the questions, and wrote about it in an executive summary to
the entire faculty. We chose the themes (Table 1) for each question based upon our small
group analysis.
Table 1. Themes from Hand-Analysis (questions and themes from small-group hand-analysis; n = 39)

Question 1: What have you changed in your teaching and why have you changed it?
Technology: faculty are weaving technology into their teaching in a wide variety of formats.
Diverse learners: faculty are incorporating universal design, differentiating instruction, and using literature specific to diverse learners.
General things: faculty are incorporating theoretical frameworks, such as Darling-Hammond; changing expectations; changing texts; using topics outside of texts; altering assignments; using current events; incorporating more discussion.

Question 2: Have you had conversations in your department or program about effective teaching? (83% said yes)
Curriculum: conversations surrounding curriculum and curriculum issues dominated this category.
Pedagogy: faculty also discussed issues of pedagogy and meeting the needs of all learners.
Assessment: some faculty found issues with assessment in their courses, such as how to accurately and adequately assess.

Question 3: Have you had conversations outside of your department or program about effective teaching? (80% said yes; participants indicated they have conversations with PK-12 school personnel, friends/family, people outside of work, and colleagues in national organizations)
Our teaching: this topic dominated the findings for this question, indicating that many faculty discuss teaching with other faculty or others outside of their department or program.
The nature of teaching: some faculty engage in philosophical discussions about what it means to teach in higher education and to teach future teachers.
Research, and other tasks related to teaching: some faculty discuss other tasks of being a higher education professor, specifically their research and issues related to it.
Policy/issues of education: some (but few) indicated conversations about education policy and other issues in education.
For Elizabeth, this was a new adventure in learning and doing simultaneously, as
she was learning to use and trust the software, and learning to work with Linnea and
Steve as collaborators. Not only was Elizabeth skeptical that the software would be
capable of providing added value, but she had nagging doubts from her traditionalist
ethnographic upbringing that using technology in any way might “taint” the nature of the
analysis in ways that “doing it by hand” (tantamount, in her mind, then, to “organically”)
would not. Elizabeth knew that Linnea was aware of these doubts but also believed that
talking about them would have the possibility of tainting the data in another direction, so
for her part, she proceeded with the study in the spirit of seeing what would happen,
shelving, for the moment, her extremely binary-minded nature.
We initially broke out the data into the three questions (similar to the original
committee’s work, as indicated above). However, each of us looked at the responses to
each question, and began to create “tree nodes” as we noted them in each response. In this way, initial coding on the computer is very similar to the work we did as a larger committee to analyze the data, in that we each assigned meaning to parts of responses. At first the meanings, or “tree nodes,” were descriptive in nature. For instance, if a respondent said that they talked with other colleagues at national conferences about teaching methods within a similar course, we would assign that a tree node of “conversations about teaching.”
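The software's actual node objects are not shown in the paper; purely as an illustration of the idea, a minimal hierarchical “tree node” structure (all class and node names are ours, not the software's) might look like this:

```python
from dataclasses import dataclass, field

@dataclass
class TreeNode:
    """A hierarchical code ("tree node") holding coded excerpts and sub-codes."""
    name: str
    excerpts: list = field(default_factory=list)   # response fragments coded here
    children: dict = field(default_factory=dict)   # sub-code name -> TreeNode

    def child(self, name: str) -> "TreeNode":
        # Return the named sub-code, creating it on first use.
        return self.children.setdefault(name, TreeNode(name))

# One root per survey question; descriptive nodes hang beneath it.
q2 = TreeNode("Question 2: conversations within your program")
excerpt = "I talk with colleagues at national conferences about teaching methods."
q2.child("conversations about teaching").excerpts.append(excerpt)
```

Because each excerpt is attached under a named node, re-reading everything coded to one idea is a single lookup rather than a shuffle through paper piles.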
We thought back to Gilbert’s (2002) study. She identified a third stage of coding
awareness that she calls a “metacognitive shift” (p. 220). During our analysis and coding
sessions, we found it helpful to include reflection on our own biases—especially in our
choosing of “tree nodes”. While we felt our “node” choices were descriptive, they still
represented our “choices”, seen through our lenses. Significantly, all three of us include
our lens as “education” faculty. As faculty in a college of education, we are part of
teacher preparation programs that strive to cultivate reflective thinking in our graduates.
We would also hope to see the same qualities, such as reflective thinking, modeled by our
faculty. Gilbert cautions novice software users to reflect “about how and why one works
in a particular way” (p. 220). It was important for increasing trustworthiness for us to
acknowledge and reflect upon our biases and the ways that we incorporated our own
situational awareness as part of the college of education into our analysis of themes and
issues.

Discussion
Interpretations and Findings
In this section we discuss what we consider to be significant findings for us as
faculty from diverse departments working on a collaborative project. First, we found that
a unique aspect of using CAQDAS vs. hand-coding/analysis was the ability of the program to create visual representations of data (charts, graphs, trees) that helped us in
our analysis (we elaborate below) and potentially can help readers to more deeply
understand our interpretations. Part of this aspect is the ease and agility with which such
graphics can be made. Second, we discussed how, when several of us had coded by hand in the past, it was often difficult to justify the work involved in changing and re-arranging codings and categories, or to try out new arrangements, due to the nature of paper and pencil, and the messiness that such changes often created, as described below. In CAQDAS, it is easy to move things around and try different coding categories and themes.
A third and final issue we discuss here is the ability of the software to enable three diverse researchers (from different departments, different teaching areas, and with different learning styles) to work together to come to a group
understanding of the issues found within our data. Steve felt that it would be important
for the three of us to be as diverse as our survey respondents (in his quantitative world of
reliability and validity), as we were also members of the group (college of education)
being studied. That we were as diverse in our areas of expertise as in our working paradigms helped us to represent ourselves and our participants, but also caused divergent
thinking about analysis and themes. However, the use of CAQDAS helped us to
understand each other’s viewpoints, and work to represent multiple perspectives in our
analysis. We elaborate on this aspect below.
In addressing the first issue and thinking about the visual aspects of CAQDAS,
we offer this example from our analysis sessions. When we were analyzing Question 2
(“Have you had conversations about teaching within your department or program?”), we
decided to try a few graphs. We created a tree-cluster by word similarity, and the words
“curriculum,” “pedagogy,” and “assessment” came up frequently, as we expected based
upon our initial, group hand-coding. However, the tree node allowed us to visualize
subcategories by grouping similar words together under the umbrella words. The visual
offered a striking presentation that illustrated the wealth and depth of the conversations.
Linnea found the visual helped her to think about all of the themes (nodes) as part of a larger collective, almost like a family. Grouped together were phrases from participants
that are part of teaching or pedagogy, such as:
Collaboration with colleagues about curriculum
Sharing classroom materials and resources
Conversations about program contexts
Strategies for teaching
Incorporating philosophy and theory into teaching
Blending theory into practice
Using technology in pedagogy
Meeting diverse student learning styles and needs
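As a loose sketch of the grouping idea only (the software's word-similarity clustering is considerably more sophisticated than this), such phrases can be bucketed under umbrella words when they contain them:

```python
# Illustrative only: group phrases under "umbrella" words by crude word matching.
def words(text: str) -> set:
    return set(text.lower().split())

umbrellas = ["curriculum", "pedagogy", "assessment"]
phrases = [
    "Collaboration with colleagues about curriculum",
    "Using technology in pedagogy",
    "Sharing classroom materials and resources",
]

clusters = {u: [] for u in umbrellas}
unclustered = []
for phrase in phrases:
    # Attach the phrase to the first umbrella word it actually contains.
    matches = [u for u in umbrellas if u in words(phrase)]
    (clusters[matches[0]] if matches else unclustered).append(phrase)
```

Phrases sharing an umbrella word land in the same subcategory; everything else stays aside for another coding pass.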
These phrases, and more, demonstrated the vast array of topics about which our faculty converse, think, and reflect with each other. While we saw these across the responses when we hand-coded on several pieces of paper, our intent at that time was more one of data ordering (Charmaz, as cited in Seidel, 1998, p. 4). However, as we were able to return to the mass of data responses in a visual way, via the tree graph, we were able to concentrate on the depth of responses around a single theme or issue: that of conversations about our teaching or pedagogy. The striking visual display inspired our thinking about the reflective opportunity we had given faculty through the tool of this survey.

A second issue in coding is that of “trying out” changes in coding categories.
While hand-coding, we were hesitant to make many changes, as the volume of data causes a “chaotic moment,” when the data are re-connected randomly, as in pre-coding.
For Elizabeth the process of creating a graph via the push of a button initially caused a
“fear” response. She asked, “Wait, can you get the other idea back?” When she realized
that we could “try out” endless possibilities of visual connections and representations, she
was excited. We were all impressed with the versatility of the graphics, being able to “try
out” various graphics, yet still have the group’s foundational analysis from which to
work. Thinking back to our original work analyzing the data with the use of a group of
six committee members, we found that most committee members were hesitant to
question the initial analysis of the pairs of the committee members that we set up. When
each committee pair brought back their coding categories to the whole group, the group
accepted them without question, and moved towards a broader picture of the data through
thematic categorization. However, with CAQDAS we were able to try out different
themes and key coding categories, without the worry of losing our initial categories. This “trying out” is actually a technique supported by several data analysis theorists, as a means of rigorous attention to the data as the foundation (our own thinking and experiences)
from which we chose our themes (Corbin & Strauss, 2008; Mertens, 2005; Miles &
Huberman, 1994). CAQDAS allowed us to re-arrange categories, while still keeping our
initial thematic categories. In fact, we were able to look at them side-by-side in a visual
representation. We also created visual representations of all of the codes used in a
question (Figure 1).
Figure 1. Coding Categories for “What have you changed in your teaching?”
Graphics like that in Figure 1 assisted those of us who tend to be more visual, and helped
us to work at making meaning and understanding within visual learning styles, which
could then be translated back to language understanding such as is represented within this
paper. As mentioned earlier, Gilbert (2002) cautioned CAQDAS users to be wary of the
digital learning curve when using software, and we felt that this particular caveat did not
apply to us. However, a more applicable caveat related to the differences among users, as mentioned by Gilbert, is that of differences in learning styles: learning that is represented by the way we, as researchers, read and understand our data. In writing more
recently about his seminal Multiple Intelligences (Gardner, 1983) theory, Gardner and
Moran (2006) posited that “the intelligences can be grouped together for various
purposes” (p. 228). This is, perhaps, what happened with the use of CAQDAS and the
visual representations—we, as separate and distinct researchers, with unique learning
styles, were able to come together and find understanding of each other and the data, as
well as represent understanding through the software.
A third finding is that of being able to easily utilize multiple researchers in a
single project. The ease with which we were able to share suggestions and coding
categories with each other capitalized on the ability of the software to offer data
presentation in multiple ways. We discovered that we could incorporate all three
researchers within one file, share it electronically (via an electronic server), and continue
coding. However, this was somewhat hampered by the way our university has purchased the product—we only have 10 licenses for faculty. The remaining licenses are on the server, and can only be accessed in the university computer labs. While one of us had the full software on her own work computer, the remaining two researchers had to access the software via the computer labs, limiting work time to office time. Since we are a university that serves primarily working adult commuters, with multiple campuses, and since faculty often teach at multiple locations throughout a metropolitan area in addition to working from home, these constraints reduced our ability to make use of the computer labs.
We realized as we began this project that we probably wouldn’t find “different”
themes, and this realization was confirmed as we began to work together. We knew that
the hard work of coding qualitative data still comes from within the mind of the
researcher and is intricately tied to the perspective of each researcher (Coffey &
Atkinson, 1996), which in our case proved to be vast, as we each represented vastly
different learning styles and different perspectives within education. However, we were
pleasantly surprised at how CAQDAS allowed easy storage and easy access to large
amounts of language (and other types of) data. We also appreciated the ease with which
we could manipulate and change around categories, themes and codes. Both of these
points confirmed Ryan’s (2009) findings of the software’s ability to increase the
researcher’s organizational abilities.
Still, using any new software, even for those who commonly work in computer-
mediated environments, requires time. We found it difficult to negotiate enough time to
teach the novice members of our group how to work with the software. It was
enlightening, however, to try different features of CAQDAS and find that these “trial and
error” efforts resulted in helping us view the data in new and different ways (i.e., the
graphs mentioned above) and to understand each other as diverse scholars. The “trying
out” resulted in unique moments for each of us that expanded our understanding of what
another member was trying to express related to data analysis or interpretation. Our
suggestion is for researchers to allow sufficient time to “play around” with the software,
and to try to look at their data from multiple perspectives. Additionally, we suggest
diverse scholars can work together through the medium of CAQDAS to provide more
opportunity for collaboration in data analysis and interpretation.
References

Bazeley, P. (2006). The contribution of computer software to integrating qualitative and quantitative data and analyses. Research in the Schools, 13(1), 64-74.
Coffey, A., & Atkinson, P. (1996). Making sense of qualitative data: Complementary
research strategies. Thousand Oaks, CA: Sage.
Corbin, J. M., & Strauss, A. C. (2008). Basics of qualitative research: Techniques and
procedures for developing grounded theory (3rd ed.). Thousand Oaks, CA: Sage.
Davis, N. W., & Meyer, B. B. (2009). Qualitative data analysis: A procedural
comparison. Journal of Applied Sport Psychology, 21(1), 116-124.
Gardner, H. (1983). Frames of mind: The theory of multiple intelligences. New York,
NY: Basic Books.
Gardner, H., & Moran, S. (2006). The science of multiple intelligences theory: A
response to Lynn Waterhouse. Educational Psychologist, 41(4), 227-232.
Gilbert, L. S. (2002). Going the distance: ‘Closeness’ in qualitative data analysis
software. International Journal of Social Research Methodology, 5(3), 215-228.
Johnston, L. (2006). Software and method: Reflections on teaching and using QSR
NVivo in doctoral research. International Journal of Social Research
Methodology, 9(5), 379-391.
Lage, M. C., & Godoy, A. S. (2008, special edition). Computer-aided qualitative data analysis: Emerging questions. RAM: Revista de Administração Mackenzie, 9(4).
Lee, R. M., & Esterhuizen, L. (2000). Computer software and qualitative analysis:
Trends, issues and resources. International Journal of Social Research
Methodology, 3(3), 231-243.
Lincoln, Y. S., & Guba, E. G. (1985). Naturalistic inquiry. Newbury Park, CA: Sage.
Lu, C.-J., & Shulman, S. W. (2008). Rigor and flexibility in computer-based qualitative
research: Introducing the coding analysis toolkit. International Journal of
Multiple Research Approaches, 2, 105-117.
Mertens, D. M. (2005). Research and evaluation in education and psychology:
Integrating diversity with quantitative, qualitative, and mixed methods (2nd ed.).
Thousand Oaks, CA: Sage.
Miles, M. B., & Huberman, A. M. (1994). Qualitative data analysis. Thousand Oaks, CA: Sage.
Ryan, M. (2009). Making visible the coding process: Using qualitative software in a post-
structural study. Issues in Educational Research, 19(2), 142-159.
Seidel, J. V. (1998). Appendix E: Qualitative data analysis. The Ethnograph, 4, 1-15.
Retrieved from
Smith, C., & Short, P. M. (2001). Integrating technology to improve the efficiency of qualitative data analysis: A note on methods. Qualitative Sociology, 24(3), 401-

Author Note
Dr. Linnea L. Rademaker is an assistant professor in the department of
Educational Foundations & Inquiry in the National College of Education (NCE) at
National Louis University (NLU), where she teaches research methods and action
research courses to master's and doctoral students. Linnea's interests include action
research to empower teachers, narrative methods, and case study evaluation. Linnea also
is a founding partner of FuturEd, LLC, a private educational evaluation and consulting
firm located in Indiana. Correspondence regarding this article can be addressed to Dr.
Linnea L. Rademaker, phone: 847-947-5043; e-mail:
Dr. Elizabeth J. Grace is an assistant professor with the Diversity in Learning and
Teaching department in NCE at NLU. She teaches foundations, assessment and
mathematics, history, and research at the master's and doctoral levels. Her interest areas
include disability studies in education, community/inclusion, methods and uses of
research, applied theatre, and philosophy. Correspondence regarding this article can also
be addressed to Dr. Elizabeth J. Grace, phone: 847-947-5138; e-mail:
Dr. Stephen K. Curda is Director of Assessment and Innovative Technology in
NCE at NLU. His responsibilities include planning, developing, and implementing
program assessment and faculty development activities across the College with respect to
the mission and goals of the NCE. He also teaches and advises master's and doctoral
students. His research focus areas include technology integration, instructional systems
design, and assessment and evaluation. Correspondence regarding this article can also be
addressed to Dr. Stephen K. Curda, phone: 847-947-5255; e-mail:
Copyright 2012: Linnea L. Rademaker, Elizabeth J. Grace, Stephen K. Curda, and
Nova Southeastern University
Article Citation
Rademaker, L. L., Grace, E. J., & Curda, S. K. (2012). Using computer-assisted
qualitative data analysis software (CAQDAS) to re-examine traditionally analyzed
data: Expanding our understanding of the data and of ourselves as scholars. The
Qualitative Report, 17(Art. 43), 1-11. Retrieved from