Article
To read the full-text of this research, you can request a copy directly from the authors.

Abstract

This study conceptualizes the adoption process for new technology-based research methodologies. Using the case of “qualitative comparative analysis” (QCA) we apply several theoretical frameworks and identify champions of the adoption of the new methodology. The paper draws upon 216 articles across 36 A*- and A-ranked journals listed in the Scopus database. The study conceptualizes the adoption process as follows: inception (inventor) → domain-specific multi-level elaboration (innovators) → diffusion (champions; domain-specific advocates) → production (developers) → mass acceptance (majority) and adds the impact of various role-players to existing models. Additionally, this study shows how seven scholars acted as early innovators to champion the acceptance of QCA. The study recommends a model for full idea adoption with four tipping points. The paper extends both methodology and QCA research and helps inform improvements in research and practice by identifying gaps in the idea adoption journey not yet covered by the extant literature.


... A proselytizer will tell other people to go to a place he likes (Decrop & Snelders, 2004), and proselytizers legitimate the tastes of newcomers in favor of their restaurant (Hyde, 2014). An objective of proselytization is to induce adoption from self to many (Villiers et al., 2021). Members whose beliefs are shaken by disconfirmation proselytize as a means to restore their confidence (Gal & Rucker, 2010). ...
Article
A foodie’s review of a restaurant will be persuasive because it is an extension of the self, its internal consistency emanating from the reviewer’s lifestyle. Narcissism acts as a psychological driver and moderates the effect of lifestyle. The purpose of this study is to empirically test the mediating role of proselytization and the moderating roles of dietary plan and narcissism in the relationship between lifestyle and affective commitment. A snowball sampling technique from an informal sampling frame was used for data collection. The effect on proselytization is strong for low narcissism under the vegetarian dietary plan and for high narcissism under the non-vegetarian dietary plan. The marketing implication is that restaurants and eateries in the hospitality sector should offer foodie reviewers consumption opportunities at the debut of new dishes.
Article
Full-text available
One view of heuristics is that they are imperfect versions of optimal statistical procedures considered too complicated for ordinary minds to carry out. In contrast, the authors consider heuristics to be adaptive strategies that evolved in tandem with fundamental psychological mechanisms. The recognition heuristic, arguably the most frugal of all heuristics, makes inferences from patterns of missing knowledge. This heuristic exploits a fundamental adaptation of many organisms: the vast, sensitive, and reliable capacity for recognition. The authors specify the conditions under which the recognition heuristic is successful and when it leads to the counterintuitive less-is-more effect in which less knowledge is better than more for making accurate inferences.
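The recognition heuristic described above is simple enough to state as code. The following is a minimal, hypothetical sketch (the city names in the usage example are illustrative and not drawn from the article): when exactly one of two objects is recognized, infer that the recognized object has the higher criterion value; otherwise the heuristic makes no inference.

```python
def recognition_heuristic(a, b, recognized):
    """Infer which of two objects scores higher on a criterion.

    Applies only when exactly one object is recognized; returns the
    recognized object in that case, and None when the heuristic is silent
    (both or neither recognized).
    """
    if (a in recognized) != (b in recognized):  # exactly one is recognized
        return a if a in recognized else b
    return None
```

For example, a person who recognizes "Munich" but not "Herne" would infer via `recognition_heuristic("Munich", "Herne", {"Munich"})` that Munich is the larger city; someone who recognizes both cities cannot use the heuristic and must fall back on other knowledge, which is the setting in which the counterintuitive less-is-more effect can arise.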
Article
Full-text available
This study explains how to disentangle the relationships between outcomes and the configurations of marketing brand tactics and consumer attributes for a particular marketing phenomenon. We demonstrate that qualitative comparative analysis (QCA) can be implemented in marketing contexts, and that it can explain marketing phenomena to the standards of rigour, generality and complexity demanded by scientific research. Fuzzy set QCA (fsQCA) need not be feared; it can be a very useful case-based method for marketing theorists. The "thought experiment" featuring the hypothetical Dorah Explorah brand demonstrates fsQCA's value and its similitude with real markets, and confirms that a single attribute, marketing tactic or condition can affect the examined outcome differently when it is part of a different configuration, although it may not be necessary or sufficient for the outcome by itself. We extend the literature on marketing theory creation by drawing on social psychology and management disciplines (for methodology) and Heider's (1958) balance theory to propose a specific hypothesis. We then test this hypothesis via an experimental manipulation. We present the theoretical background supporting the study's hypothesis and make a strong plea for marketing scholars to develop theories using truly useful, highly predictive asymmetrical logic. We hope that this paper will act as a tutorial for marketing researchers, novices and experts, making the application of fsQCA as a methodology and as a set of techniques easier and more transparent. We explicitly highlight the configurationally important aspects of qualitative research in empirical marketing studies and comparative scientific enquiry.
Article
Full-text available
Even though several scholars describe the telling weaknesses in such procedures, the dominant logic in research across the management sub-disciplines continues to rely on symmetric modeling using continuous variables and null hypothesis statistical testing (NHST). Though the term of reference is new, somewhat precise outcome testing (SPOT) procedures are available now and, along with asymmetric modeling, enable researchers to better match data analytics with their theories than the current pervasive theory–analysis mismatch. The majority (70%+) of articles in the leading journals of general management, marketing, finance, and the additional management sub-disciplines are examples of the mismatch. The mismatch may be a principal cause of the scant impact of the majority of articles. Asymmetric modeling and SPOT rest on the principal tenets of complexity theory rather than on overly shallow and simplistic symmetric modeling and reporting of NHST findings. Though relatively rare, examples of asymmetric modeling and SPOT are available now in the management literature. The current lack of instructor knowledge and of student training in asymmetric modeling and SPOT in MBA and PhD programs is the likely principal reason for this scarcity.
Article
Full-text available
Purpose: This article describes and explains the arrival of the current tipping-point in shifting from bad-to-good science. This article identifies Hubbard (2016a) as a major troublemaker in attacking the current dominant logic supporting bad science, that is, null hypothesis statistical testing (NHST) in management science. Focus: Crossing the tipping-point is occurring now (2016–2017) from bad-to-good in the behavioral and management sciences. Because of the occurrence of supporting conditions, continuing to ignore the vast amount of criticism of bad science is no longer possible. Bad science includes the pervasive use of directional predictions (i.e., “as X increases, Y increases”) and NHST. The time is at hand to replace both with a good science: computing with words (CWW) to predict outcomes accurately and consistently, constructing models on a foundation of complexity theory, and testing models using somewhat precise outcome testing. Recommendations: Achieving parsimony does not equate with achieving simplicity and the shallow analysis of only predicting directional relationships. Researchers need to build the requisite variety into models and predict precise outcomes. Doing so requires recognizing the relevancy of equifinality (alternative configurations occur that indicate the same outcome), sign reversals, and causal asymmetry tenets of complexity theory. Testing for predictive validity of models using additional samples is a must. Managerial implications: Executives benefit from learning that accurate CWW is possible; do not accept models of only directional predictions; demand substance in research findings; look at the XY plots of CWW complex antecedent statements and both simple and complex outcomes that you need to know about (e.g., purchase behavior).
Article
Full-text available
Purpose – Individuals showing high consumer ethnocentrism (CE) prefer domestic over foreign-made products and their preferences may contribute to barriers to international market entry. Therefore, how to identify such consumers is an important question. Shankarmahesh’s (2006) review reveals inconsistencies in the literature with regard to CE and its antecedents. To shed theoretical and empirical light on these inconsistencies, the purpose of this paper is to contribute two new perspectives on CE: first, a typology that classifies ethnocentric consumers by the extent to which they support government-controlled protectionism and consumer-controlled protectionism; and second, a configurational (recipe) perspective on the antecedents. Design/methodology/approach – The study applies fuzzy-set qualitative comparative analysis of survey data from 3,859 consumers. The study contrasts the findings with findings using traditional statistical hypotheses testing via multiple regression analysis. Findings – The results reveal several configurations of antecedents that are sufficient for consistently explaining three distinct types of CE. No single antecedent condition is necessary for high CE to occur. Practical implications – The findings help global business strategists in their market entry decisions and in their targeting and segmentation efforts. Originality/value – The authors show the value of asymmetrical thinking about the relationship between CE and its antecedents. The results expand understanding of CE and challenge conventional net-effects thinking about its antecedents.
Article
Full-text available
The study of scientific discovery—where do new ideas come from?—has long been denigrated by philosophers as irrelevant to analyzing the growth of scientific knowledge. In particular, little is known about how cognitive theories are discovered, and neither the classical accounts of discovery as either probabilistic induction (e.g., H. Reichenbach, 1938) or lucky guesses (e.g., K. Popper, 1959), nor the stock anecdotes about sudden "eureka" moments deepen the insight into discovery. A heuristics approach is taken in this review, where heuristics are understood as strategies of discovery less general than a supposed unique logic discovery but more general than lucky guesses. This article deals with how scientists' tools shape theories of mind, in particular with how methods of statistical inference have turned into metaphors of mind. The tools-to-theories heuristic explains the emergence of a broad range of cognitive theories, from the cognitive revolution of the 1960s up to the present, and it can be used to detect both limitations and new lines of development in current cognitive theories that investigate the mind as an "intuitive statistician." (PsycINFO Database Record (c) 2012 APA, all rights reserved)
Article
Full-text available
Interest has burgeoned, in recent years, in how social networks influence individual creativity and innovation. From both the theoretical and empirical points of view, this increased attention has generated many inconsistencies. In this article we propose that a conceptualization of the idea journey encompassing phases that the literature has so far overlooked can help solve existing tensions. We conceptualize four phases of the journey of an idea, from conception to completion: idea generation, idea elaboration, idea championing, and idea implementation. We propose that a creator has distinct primary needs in each phase: cognitive flexibility, support, influence, and shared vision, respectively. Individual creators successfully move through a phase when the relational and structural elements of their networks match the distinct needs of the phase. The relational and structural elements that are beneficial for one phase, however, are detrimental for another. We propose that in order to solve this seeming contradiction and the associated paradoxes, individual creators have to change interpretations and frames throughout the different phases. This, in turn, allows them to activate different network characteristics at the appropriate moment and successfully complete the idea journey from novel concept to a tangible outcome that changes the field.
Article
Full-text available
Rather than taking a variable-oriented approach, the study here extends Ragin's (1999) perspective on studying conjunctive paths or “causal recipes” for a limited number of cases (usually n < 300; in the study here, n > 30,000). The aim here is to provide a primer in the theory and practice of qualitative comparative analysis (QCA) — a method that goes beyond considering the net contributions of individual variables in influencing a dependent variable. The study aims to describe alternative conjunctive paths (estimated algorithms using Boolean algebra and set theory) that associate with a given outcome. The “outcome condition” being modeled (predicted) in the study is cases (travelers) who complete more than 24 airline trips for personal reasons annually; the label “extreme” or “X” air travelers identifies these cases. Though small in share of numbers (0.2%) of Americans, X air travelers are mighty in share (7%) of air trips for personal reasons. The study finds that completing a university/college degree is a necessary, but not sufficient, simple antecedent condition that identifies X air travelers; the conjunction of completing college and very frequent vacation travel is a sufficient, but not necessary, complex antecedent condition for identifying X air travelers. The approach can be useful in estimating conjunctive conditions leading to specific actions by consumers and executives.
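The necessary-versus-sufficient distinction at the heart of the abstract above can be illustrated with a minimal crisp-set sketch. The code below is not the authors' analysis; the case data and variable names (`degree`, `vac`, `x`) are invented for illustration. A condition is sufficient in the data when every case meeting it shows the outcome, and necessary when every outcome case meets it.

```python
def consistency_sufficiency(cases, antecedent, outcome):
    """Share of cases meeting the antecedent that also show the outcome
    (1.0 means the antecedent is sufficient in these data)."""
    sel = [c for c in cases if antecedent(c)]
    return sum(outcome(c) for c in sel) / len(sel) if sel else 0.0

def consistency_necessity(cases, antecedent, outcome):
    """Share of outcome cases that also meet the antecedent
    (1.0 means the antecedent is necessary in these data)."""
    sel = [c for c in cases if outcome(c)]
    return sum(antecedent(c) for c in sel) / len(sel) if sel else 0.0

# Hypothetical cases: degree = completed a college degree,
# vac = very frequent vacation traveler, x = extreme ("X") air traveler.
cases = [
    {"degree": True,  "vac": True,  "x": True},
    {"degree": True,  "vac": True,  "x": True},
    {"degree": True,  "vac": False, "x": False},
    {"degree": False, "vac": True,  "x": False},
]

degree = lambda c: c["degree"]
recipe = lambda c: c["degree"] and c["vac"]  # conjunctive path: degree AND vacation
x = lambda c: c["x"]
```

In these invented data, `consistency_necessity(cases, degree, x)` is 1.0 (every X traveler holds a degree) while `consistency_sufficiency(cases, degree, x)` is only 2/3, so the degree alone is necessary but not sufficient; the conjunction `recipe` reaches a sufficiency consistency of 1.0, matching the kind of causal recipe the study describes.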
Conference Paper
Full-text available
We present the basic concepts of experimental design, the types of goals it can address, and why it is such an important and useful tool for simulation. A well-designed experiment allows the analyst to examine many more factors than would otherwise be possible, while providing insights that could not be gleaned from trial-and-error approaches or by sampling factors one at a time. We focus on experiments that can cut down the sampling requirements of some classic designs by orders of magnitude, yet make it possible and practical to develop an understanding of a complex simulation model and gain insights into its behavior. Designs that we have found particularly useful for simulation experiments are illustrated using simple simulation models, and we provide links to other resources for those wishing to learn more. Ideally, this tutorial leaves you excited about experimental designs - and prepared to use them - in your upcoming simulation studies.
Article
The present analysis is a reframing of an earlier study conducted by the author to compensate for perceived deficiencies in previous studies on police decisions in sexual assault complaints. Specifically, qualitative comparative analysis was employed at the micro-social level to reveal justification scenarios, employed by investigating officers, which resulted in attrition at the police level. It was found that police employed the legal model in justifying “unfounded” designations while police employed both legal and extralegal models in justifying designations of “departmental discretion.” Further research, expanding the database through interviews and participant observation, is necessary to fully explore justification scenarios for police designations of sexual assault complaints.
Chapter
Comparative research is exploding with alternative methodological and theoretical approaches. In this book, experts in each one of these methods provide a comprehensive explanation and application of time-series, pooled, event history and Boolean methods to substantive problems of the welfare state. Each section of the book focuses on a different method with a general introduction to the methods and then two papers using the method to deal with analysis concerning welfare state problems in a political economy perspective. Scholars concerned with methodology in this area cannot afford to overlook this book because it will help them keep up on proliferating methodologies. Graduate students in political science and sociology will find this book extremely useful in their careers.
Article
The analysis of data from market research has, until fairly recently, been reliant upon statistical techniques that were developed during the nineteenth and early twentieth centuries for uses entirely other than the analysis of survey and other types of observational, non-experimental data. Such techniques rely on reviewing and relating the frequency distributions of variables that have been concocted and measured by researchers. This paper argues that key features of such ‘frequentist’ statistics are also limitations that need to be recognised by academics, market research practitioners and the managers to whom they report findings. By focusing on variable distributions across cases, they overlook patterns of within-case configuration; they seek out only symmetrical, linear patterns by reviewing the ‘net effects’ of individual variables; they rely on a very circumscribed view of statistical inference from samples to populations; they are not good at demonstrating causal connections between variables or at handling system complexity. A follow-up paper, ‘Rethinking data analysis – part two: some alternatives to frequentist approaches’, examines ways of approaching data sets that can be seen as viable alternatives.
Article
In ‘Rethinking data analysis – part one: the limitations of frequentist approaches’ (Kent 2009) it was argued that standard, frequentist statistics were developed for purposes entirely other than for the analysis of survey data; when applied in this context, the assumptions being made and the limitations of the statistical procedures are commonly ignored. This paper examines ways of approaching the analysis of data sets that can be seen as viable alternatives. It reviews Bayesian statistics, configurational and fuzzy set analysis, association rules in data mining, neural network analysis, chaos theory and the theory of the tipping point. Each of these approaches has its own limitations and not one of them can or should be seen as a total replacement for frequentist approaches. Rather, they are alternatives that should be considered when frequentist approaches are not appropriate or when they do not seem adequate to the task of finding patterns in a data set.
Article
Products vary in the degree to which they are members of product categories. Understanding how product attributes influence product categorization is important for marketing decisions about product development, brand extension, and product positioning. Existing attribute-level measures of product attitude emphasize evaluation rather than categorization. Measures of category membership either are at the overall product level or emphasize feature-based attributes, whereas product attributes are often continuous in nature. A comprehensive approach for conceptualizing product categorization is provided by fuzzy set theory, with its basic postulate that no clear boundaries exist between members and nonmembers of a fuzzy set. The authors conceptualize product categories as fuzzy sets in which products have degrees of membership along specific attributes, argued to arise as a result of the degree to which a product possesses an attribute. Two fuzzy set-based measures of category membership are developed that assess gradedness of category membership at the attribute level, which is then combined across attributes to reach overall measures of gradedness for a product. The results of a study demonstrate the validity of these measures and illustrate important attribute-level analyses that are not possible with existing approaches.
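The idea of graded, attribute-level category membership described above can be sketched in a few lines. This is an illustrative toy, not the authors' measures: the “sports car” category, the attributes, and the calibration anchors below are invented. Each attribute gets a membership score between 0 and 1, and the attribute scores are combined with a fuzzy AND (the minimum) into an overall degree of category membership.

```python
def attribute_membership(value, full_non_member, full_member):
    """Linear fuzzy membership along one continuous attribute:
    0 at/below the non-member anchor, 1 at/above the full-member anchor,
    graded in between."""
    if value <= full_non_member:
        return 0.0
    if value >= full_member:
        return 1.0
    return (value - full_non_member) / (full_member - full_non_member)

def category_membership(attributes, calibration):
    """Overall gradedness of a product in a fuzzy category:
    fuzzy AND (minimum) across its attribute-level memberships."""
    return min(attribute_membership(attributes[a], *calibration[a])
               for a in calibration)

# Hypothetical calibration for a "sports car" category:
# horsepower graded from 100 to 300, top speed (km/h) from 150 to 250.
sports_car = {"hp": (100, 300), "top_speed": (150, 250)}
```

Under this invented calibration, a car with 250 hp and a 240 km/h top speed has attribute memberships of 0.75 and 0.9, so its overall membership in the category is 0.75 — a partial member, with no sharp boundary between sports cars and non-sports cars.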
Article
This study identifies how complex real-world marketing phenomena have been reported since the introduction of Qualitative Comparative Analysis (QCA) as a methodology and set of data analysis and theory development tools in 1987. We provide insights into the idea adoption journey as reflected by marketing researchers’ published studies in 216 marketing-related articles across A- and A*-ranked marketing journals from 1988-2018. We find that QCA as a methodology and a set of tools is not yet firmly established in academic marketing literature. The paper identifies gaps, not covered in the extant literature, in the adoption stages of QCA as a research methodology. The paper suggests some research areas that need to be addressed to help inform improvements in marketing research practice. The paper extends both methodology and marketing research by reconciling insights from a fragmented emerging literature, and highlights the dynamics and interrelationships in the extant literature.
Book
Principles of Forecasting: A Handbook for Researchers and Practitioners summarizes knowledge from experts and from empirical studies. It provides guidelines that can be applied in fields such as economics, sociology, and psychology. It applies to problems such as those in finance (How much is this company worth?), marketing (Will a new product be successful?), personnel (How can we identify the best job candidates?), and production (What level of inventories should be kept?). The book is edited by Professor J. Scott Armstrong of the Wharton School, University of Pennsylvania. Contributions were written by 40 leading experts in forecasting, and the 30 chapters cover all types of forecasting methods. There are judgmental methods such as Delphi, role-playing, and intentions studies. Quantitative methods include econometric methods, expert systems, and extrapolation. Some methods, such as conjoint analysis, analogies, and rule-based forecasting, integrate quantitative and judgmental procedures. In each area, the authors identify what is known in the form of `if-then principles', and they summarize evidence on these principles. The project, developed over a four-year period, represents the first book to summarize all that is known about forecasting and to present it so that it can be used by researchers and practitioners. To ensure that the principles are correct, the authors reviewed one another's papers. In addition, external reviews were provided by more than 120 experts, some of whom reviewed many of the papers. The book includes the first comprehensive forecasting dictionary.
Article
A successful career as an academic researcher is generally driven by intrinsic interest, good taste in problem selection, careful execution (effort), and good communication. Different approaches have proven to be successful and different researchers are suited to different styles. Nonetheless, some general characteristics underlie much successful research. Importantly, unless a person is interested in a problem (and others are also), there is little chance the work will be completed and, if it is, it will have impact. Further, most successful research addresses a relevant problem and is analyzed and communicated in a straightforward way. Importantly, however, no single approach is best. Rather, each researcher is best off tailoring their approach to their own skills and interests.
Article
Building upon the national innovation system perspective and using a fuzzy set qualitative comparative analysis approach (fsQCA), we propose an integrating framework to determine the conditions that lead to high levels of national innovation capability outcomes. We discriminate between five conditions, viz., building national institutions, developing human capital and research systems, improving infrastructures, and facilitating business and market conditions. We do so by analyzing data collected from the Global Innovation Index database containing 74 indicators and 133 countries between 2012 and 2015. The results show no singular path leading to high levels of innovation capability but there are three configurations of conditions. Two configurations highlight that the combination of three distinct conditions is sufficient for a country to reach a high innovation capability (one in which market conditions ‘are not necessary’ and one in which institutions ‘are not necessary’ conditions). The third configuration highlights that the combination of all five conditions is necessary for a country to reach a very high innovation capability. Some crucial implications of these findings for theory and practice are discussed.
Article
In the travel industry, electronic word of mouth (eWOM) elicits a major influence on consumers’ decision making. Travel retailers are facing the new challenges derived from the different nature of their competitors—big hypermarkets, for instance, are extending their brands to travel services—and the challenges derived from online comments that consumers have access to. With a sample of 263 tourists, and using a fuzzy-set qualitative comparative analysis data analysis, this paper shows how the selection between a specialized travel agency and a private label (PL) agency is influenced by five factors: the usefulness attached to online reviews by users and the valence of those online reviews, attitude and experience with PL, and the individual's value consciousness. The contribution of this paper not only comes from the novelty of considering PL in the context of travel agencies, but also from using a relatively novel data analysis approach useful for analyzing management issues.
Article
A deep transformation of the electrical grid system at the global level is expected in the near future, with new economic relations between energy suppliers and consumers. This is the smart grid framework, which can integrate the behavior of all connected users, who become prosumers, to ensure a sustainable energy supply in an efficient, economical and safe manner. Smart grid projects have a significant impact on electricity consumers through shifts in consumer behavior, culture and lifestyle. The aim of this paper is to explore the degree of investigation of the main socio-economic features, in terms of private and social costs, which are relevant for smart grids development and public acceptance. To this aim, a literature review of 148 peer-reviewed scientific journal articles on smart grids has been conducted, developing an original taxonomy for the socio-economic features in terms of private (direct) costs directly associated with the monetary costs paid by consumers, and social (indirect) costs constituted by consumers’ perception, privacy, cyber security and regulation. The importance of the analysis of social costs arising from externalities is that they may hinder the deployment of specific technologies, even if these technologies appear to be useful based on private costs. The results reveal that the explored literature mainly deals with private costs, although an emerging literature starts to address the social costs that may hamper smart grids deployment. The paper reviews new opportunities and challenges for further research, bridging the gap between engineering and socio-economic research areas, including business, organizational applications, policy and security issues. The resulting enrichment of interdisciplinary know-how can be the basis for the sustainable development of current and future generations.
Article
Recent studies in various economic sectors in the U.S.A., Brazil, China, South Korea, and Australia provide evidence of the precursors of customer equity (value, brand, and relationship equity) and their influence on behavior intentions and customer lifetime value (CLV). The aim of this study is to measure customer equity through CLV, design a model for CLV drivers and establish its predictive capacity in two samples obtained at different points in time. The sample comprises customers who have contracts with telecommunications operators in Spain, and the study uses a holdout sample to cross-validate the final sample. A predictive model, developed with partial least squares, analyzes the sector, and assesses the comparability of the four main competing companies. The findings support the importance of customer perceived value in building relationship quality and in brand equity, and reveal intentional loyalty as a precursor of future economic results.
Article
Market orientation (MO) and marketing performance measurement (MPM) are two of the most widespread strategic marketing concepts among practitioners. However, some have questioned the benefits of extensive investments in MO and MPM. More importantly, little is known about which combinations of MO and MPM are optimal in ensuring high business performance. To address this research gap, the authors analyze a unique data set of 628 firms with a novel method of configurational analysis: fuzzy-set qualitative comparative analysis. In line with prior research, the authors find that MO is an important determinant of business performance. However, to reap its benefits, managers need to complement it with appropriate MPM, the level and focus of which vary across firms. For example, whereas large firms and market leaders generally benefit from comprehensive MPM, small firms may benefit from measuring marketing performance only selectively or by focusing on particular dimensions of marketing performance. The study also finds that many of the highest-performing firms do not follow any of the particular best practices identified.
Article
Traditional variable-centred analyses of marketing data are not well suited to the discovery of logical relationships between combinations of factors. This paper suggests that we may need to rethink what we mean by a 'case' and to view cases as configurations of characteristics rather than units of analysis. The processes of using combinatorial logic and fuzzy logic are explained. A new piece of software is introduced and applied to a dataset so that traditional analysis and fuzzy-set qualitative comparative analysis results can be compared.
Article
Consider groups of partying college students failing to helpfully assist someone in life-threatening distress from alcoholic poisoning. Anecdotal evidence (Davis and DeBarros, 2006) supports the social-norming theory subfield of unresponsive bystander research by Latane and Darley (1970) and others (Cialdini and Goldstein, 2004). This article is a call for structurally transforming the dynamics of the unfolding dramas in natural groups where alcoholic poisoning leading to death occurs. The present article includes the proposal for a quasi-experiment of natural groups (members of fraternities and sororities) in naturally occurring contexts (party situations) using placebo, a standardized training for intervention programs for servers (TIPS) designed for peer intervention, and two versions of advanced TIPS designed to structurally introduce a designated interventionist (DI). The DI and DI training designs are crafted to overcome the unresponsive bystander effect. The proposal includes thought experiments to explain both short- and long-term dependent measures of program impact in such quasi-experiments that include immediate measures of alcohol drinking and intervention knowledge, the medium-term creation and assignment of a group DI position, and the long-term interventionist behavior of groups appointing persons holding DI appointments versus groups not making such appointments.
Article
The use of Internet social network sites has become an international phenomenon. These websites enable computer-mediated communication between people with common interests such as school, family, and friendship. Popular sites include MySpace and Facebook. Their rapid widespread usage warrants a better understanding. This study contributes to our understanding by empirically investigating factors influencing user adoption of these sites. We introduce the Social Network Site Adoption model to examine the effect of perceptions of normative pressure, playfulness, critical mass, trust, usefulness, and ease of use on usage intention and actual usage of these sites. Structural equation modeling was used to examine the patterns of inter-correlations among the constructs and to empirically test the hypotheses. All the hypothesized determinants have a significant direct effect on intent to use, with perceived playfulness and perceived critical mass the strongest indicators. Intent to use and perceived playfulness have a significant direct effect on actual usage.
Article
Service innovation is a primary source of competitive advantage and a research priority. However, empirical evidence about the impact of innovativeness on new service adoption is inconclusive. A plausible explanation is that service innovation has thus far been studied using new product frameworks that do not fully capture the complexity of new service assessments by customers. We propose a different, holistic framework, which posits that new service adoption does not depend on individual service attributes, but on specific configurations of such attributes. We investigate this framework in a luxury hotel service context, using qualitative comparative analysis, a set-membership technique that is new to service research and suitable for configuration analyses. Results confirm that individual service attributes have complex trade-off effects and that only specific combinations of attributes act as sufficient conditions for new service adoption. Moreover, the composition of such combinations differs according to the different coproduction requirements. Our findings contribute to managerial practice by providing new insights for improving the service-development process and the launch strategy for new services. They also augment extant service knowledge by demonstrating why interdependencies among various innovation attributes are important to consider for gaining an accurate understanding of new service adoption.
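The set-membership logic behind the configurational claim above can be illustrated with a minimal crisp-set QCA sketch (the data and attribute names below are invented for illustration, not drawn from the study): a configuration of attributes counts as a sufficient condition for adoption when (nearly) all cases exhibiting that configuration also exhibit the outcome.

```python
# Hypothetical cases: binary service attributes and an adoption outcome.
# (Illustrative data only; attribute names are invented.)
cases = [
    # (novelty, coproduction, luxury) -> adopted
    ((1, 1, 0), 1),
    ((1, 1, 0), 1),
    ((1, 0, 1), 1),
    ((0, 1, 1), 0),
    ((1, 0, 1), 1),
    ((0, 0, 1), 0),
]

def consistency(config):
    """Share of cases matching `config` that also show the outcome."""
    matching = [out for conf, out in cases if conf == config]
    return sum(matching) / len(matching) if matching else None

# Enumerate observed configurations; flag those whose consistency
# clears a conventional sufficiency threshold (0.8 here).
sufficient = []
for config in sorted({conf for conf, _ in cases}):
    c = consistency(config)
    if c is not None and c >= 0.8:
        sufficient.append(config)

print(sufficient)  # configurations consistently linked to adoption
```

Note that sufficiency attaches to whole configurations, not to individual attributes: here no single attribute value on its own separates adopters from non-adopters.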
Article
This article provides a first systematic mapping of QCA applications, building upon a database of 313 peer-reviewed journal articles. We find that the number of QCA applications has dramatically increased during the past few years. The mapping also reveals that csQCA remains the most frequently used technique; that political science, sociology, and management are the core disciplines of application; and that macrolevel analyses, medium-N designs, and a monomethod use of QCA remain predominant. Particular attention is also paid to the ratio between the number of cases and the number of conditions, and to compliance with benchmarks in this respect.
Book
In this book, J. L. Mackie makes a careful study of several philosophical issues involved in his account of causation. Mackie follows Hume's distinction between causation as a concept and causation as it is ‘in the objects’ and attempts to provide an account of both aspects. Mackie examines the treatment of causation by philosophers such as Hume, Kant, Mill, Russell, Ducasse, Kneale, Hart and Honore, and von Wright. Mackie's own account involves an analysis of causal statements in terms of counterfactual conditionals though these are judged to be incapable of giving a complete account of causation. Mackie argues that regularity theory too can only offer an incomplete picture of the nature of causation. In the course of his analysis, Mackie critically examines the account of causation offered by Kant, as well as the contemporary Kantian approaches offered by philosophers such as Bennett and Strawson. Also addressed are issues such as the direction of causation, the relation of statistical laws and functional laws, the role of causal statements in legal contexts, and the understanding of causes both as ‘facts’ and ‘events’. Throughout the discussion of these topics, Mackie develops his own complex account of the nature of causation, finally bringing his analysis to bear in regard to the topic of teleology and the question of whether final causes can be justifiably reduced to efficient causes.
Article
Since the design of experiments was first introduced by Fisher 90 years ago, this scientific and statistical approach to system interrogation for acquiring knowledge has enjoyed success across industries and among products, processes, and services. In the last decade, military test organizations have been promoting the use of design of experiments (DOE) as the preferred method of constructing and analyzing test programs. Increasingly, design of experiments is being used to greater effect and its impact is reaching groups less experienced in the method. Stories of successful application continue to have a common thread: detailed, effective planning. But not all organizations have members experienced in DOE test planning. And although planning papers and how-to case studies have appeared in the literature, the volume of these contribution types is dwarfed by theory and methods papers. If an experiment is planned and the planning process is documented, how would one go about assessing that plan? If the desire is to gauge the probability of experiment success as defined by a robust and truthful understanding of the system revealed upon analysis of the data, can the plan be assessed prior to test execution? This article proposes guidelines and evidence that span all phases of the experiment cycle, which can inform assessment of experiment planning soundness. The experiment cycle of plan, design, execute, and analyze (consistent with literature and texts) is used to structure the discussion, geared toward an audience somewhat familiar with the DOE method. Checklists are provided for each experiment phase coupled with descriptions of what would constitute fingerprints of successful implementation.
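As a minimal illustration of the "design" step in the plan-design-execute-analyze cycle described above (the factors below are hypothetical, not tied to any particular test program), a two-level full-factorial design can be generated by enumerating every combination of factor levels:

```python
from itertools import product

# Hypothetical two-level factors for a screening experiment,
# in coded units (-1 = low level, +1 = high level).
factors = {
    "temperature": [-1, +1],
    "pressure":    [-1, +1],
    "operator":    [-1, +1],
}

# Full factorial: one run per combination of factor levels.
runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]

print(len(runs))  # 2**3 = 8 runs
for run in runs:
    print(run)
```

In practice a test planner would randomize the run order and often use a fractional design to economize on runs; the sketch only shows how the full design space is enumerated.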
Chapter
After reading this chapter, you should be able to:
Locate QCA as an approach and grasp its key epistemological foundations
Understand how and why QCA is "case oriented" and how one should use QCA to engage in a dialogue between cases and theories
Understand the specific conception of causality conveyed in QCA (multiple conjunctural causation) and its practical consequences
Reflect on the usefulness of QCA to reach a certain level of generalization beyond the observed cases
Grasp key common features of QCA techniques in terms of formalization, replication, transparency, and different types of uses
Become accustomed to some key technical terms and use the appropriate, QCA-specific terminology
To better understand QCA and its various techniques and applications, it is important to locate it both in its historical epistemological context and in its relationship vis-à-vis other methods of social scientific inquiry. In its more recent developments it dates back The ...
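The notion of multiple conjunctural causation named in the chapter objectives, that several distinct combinations of conditions can each be sufficient for the same outcome, can be sketched in Boolean terms (the conditions A through D are invented for illustration):

```python
# Multiple conjunctural causation: the same outcome O can follow from more
# than one conjunction of conditions, e.g. O = (A AND B) OR (C AND NOT D).
def outcome(A, B, C, D):
    return (A and B) or (C and not D)

# Two distinct causal paths both yield the outcome:
path1 = outcome(A=True, B=True, C=False, D=True)   # A*B present, C*~D absent
path2 = outcome(A=False, B=True, C=True, D=False)  # C*~D present, A*B absent
print(path1, path2)  # True True
```

This is why QCA results are reported as a disjunction of conjunctions: no single condition need be necessary or sufficient on its own.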
Article