
Applying mixed methods under the framework of theory-driven evaluations

University of Akron
New Directions for Evaluation, 1997(74): 61-72. DOI: 10.1002/ev.1072

ABSTRACT The application of mixed methods under the framework of theory-driven evaluations can minimize the potential tension and conflict of mixing qualitative and quantitative methods, as well as compensate for each method's weaknesses. Mixed methods should not be applied indiscriminately, however, but rather contingently, under the particular conditions described in this chapter.

  • ABSTRACT: The purpose of this article is to discuss the conceptual framework and strategies used in theory-driven evaluations in relation to mixed methods research and to explore the opportunities and challenges emerging from theory-driven applications. Theory-driven evaluations have frequently applied mixed methods in the past, and these experiences provide insightful information for the future development of mixed methods. In theory-driven evaluations, the application of mixed methods is justified and carried out under a conceptual framework called program theory. The conceptual framework of program theory provides a plan and agenda for mixed methods to work collaboratively and de-emphasizes their differences and incompatibilities. Based upon this framework, the article provides several strategies for combining qualitative and quantitative methods in theory-driven evaluations, and procedures for applying these strategies are systematically illustrated. Finally, the article discusses challenging issues related to the future development of mixed methods, such as the implications of using pure versus modified forms of mixed methods and the advocacy of mixed methods research as a "method" paradigm versus a "method use" paradigm.

    Mixed methods research is the systematic combination of qualitative and quantitative methods in research or evaluation, and there has been growing interest in the topic (Johnson & Onwuegbuzie, 2004). Advocates have argued that mixed methods can overcome the weaknesses of a single (qualitative or quantitative) method (Greene & Caracelli, 1997; Howe, 1988; Johnson & Onwuegbuzie, 2004; Sechrest & Sidani, 1995). Greene and Caracelli (1997) provided the following major justifications for mixed methods: (a) triangulation, combining qualitative and quantitative methods to study the same phenomenon in order to gain convergence and increase validity (Denzin, 1970); (b) compensation, using the strengths of each method to overcome the weaknesses of the other and thereby enrich the study of a phenomenon; and (c) expansion, using each method to obtain a fuller picture of a phenomenon. Quantitative and qualitative purists, however, view the two approaches as resting on incompatible premises and techniques, and argue that mixing methods is neither meaningful nor valuable to pursue (Guba, 1990). Johnson and Onwuegbuzie (2004) have argued that there are commonalities between quantitative and qualitative methods, and that mixed methods research can narrow the divide between quantitative and qualitative researchers and enhance the quality of a study. So far, many discussions and debates about mixed methods have concentrated on philosophical or methodological issues. The discussion and development of mixed methods can also benefit from experiences of applying mixed methods in the field: practical feedback can provide insightful information about the strategies used in combining different methods and about the opportunities and challenges faced in such applications, and this type of information could energize the future development of mixed methods. Theory-driven evaluations have frequently applied mixed methods in the past (Chen, 1990, 1997, 2005). The purpose of this article is to discuss some practical experiences of using mixed methods in theory-driven evaluations. More specifically, I will discuss the conceptual framework and strategies used in theory-driven evaluations that apply mixed methods, and the opportunities and challenges emerging from such applications.
    Research in the Schools (Mid-South Educational Research Association), 2006; 13: 75-83.
  • ABSTRACT: In this paper, we describe the use of a mixed methods design to evaluate the impact of an integrated assessment system on science teachers’ assessment perceptions and practice. This design was selected to address the methodological challenges of the multilevel, multisite evaluation presented by a complex innovation. We provide an example of a cross-level exploratory analysis, facilitated by the integration of quantitative and qualitative methods; this offers one solution for interpreting multilevel, multisite evaluation data when a purposive sample is too small to support sophisticated quantitative techniques. The findings indicate that aggregated results would be misleading and that discrepant cases would be masked.

    Evaluating the Effects of an Integrated Assessment System on Science Teachers’ Assessment Perceptions and Practice.