Article

Realistic Realism about Unrealistic Models

Authors:
Uskali Mäki

Abstract

What follows is a selective and somewhat abstract summary of my thinking about economics, outlined from two perspectives: first historical and autobiographical, then systematic and comparative. The first angle helps understand motives and trajectories of ideas against their backgrounds in intellectual history. My story turns out to have both unique and generalizable aspects. The second approach outlines some of the key concepts and arguments as well as their interrelations in my philosophy of economics, with occasional comparisons to other views. More space will be devoted to this second perspective than to the first.


... Moreover, the antirealism on offer is local in both a disciplinary sense (it applies to decision theory but might not hold for other areas of economics) and a temporal sense (it concerns the current state of decision theory but might not hold for future developments in the discipline). In section 9, I relate my antirealist account of decision models to the controversy about mentalism and behaviourism in decision theory, and to the realist accounts of decision models advanced by Dietrich and List (2016) and Mäki (2009b, 2012). In particular, I reject behaviourism and defend the use of psychological constructs in decision theory. ...
... This has been a long paper, but it would be incomplete without relating, at least cursorily, the antirealist account of decision models on offer to the recent debate about the mentalist versus the behaviourist interpretation of preferences and other decision-theoretic constructs, and to two notable realist accounts of decision models, namely those advanced by Dietrich and List (2016) and Mäki (2009b, 2012). ...
Article
Full-text available
I examine some behavioural and heuristic models of individual decision-making and argue that the diverse psychological mechanisms these models posit are too demanding to be implemented, either consciously or unconsciously, by actual decision makers. Accordingly, and contrary to what their advocates typically claim, behavioural and heuristic models are best understood as ‘as-if’ models. I then sketch a version of scientific antirealism that justifies the practice of as-if modelling in decision theory but goes beyond traditional instrumentalism. Finally, I relate my account of decision models to the recent controversy about mentalism versus behaviourism, reject both positions, and offer an alternative view.
... Importantly, the actual theory in DT (Gibbs, 1975, 1985) can be different from the models applying it, as individual models often do not apply all (1) assumptions or (2) empirically measurable elements of the theory (Mäki 1992, 2009; Wimsatt, 2007). For example, regarding the latter (2), actual empirical tests of protection motivation theory (PMT) by its developer Rogers (1975, 1983) do not always apply all elements of PMT, such as costs and rewards (Siponen et al., 2023). ...
Conference Paper
Full-text available
In information systems (IS) and IS security (ISS) literature, models are commonly divided into variance and process models. In other scientific disciplines, models are instead commonly divided into stage-less versus stage models. This division is also useful in ISS for two reasons. First, despite common claims, most IS and ISS models, especially in behavioral research, may not be variance models. Second, not only users' ISS behavior but also their reasons for it may change over time. Stage models can be helpful in capturing this development and change in terms of idealized stages. However, while stage models exist in IS(S), their philosophical foundations would benefit from clarification. For instance, the requirements for stage theories cannot be unreservedly copied from other disciplines, such as health psychology, for use in ISS research. ISS scholars must instead approach the building of a stage model on a case-by-case basis. To aid in this, cyber security examples are used here to illustrate the concepts and usefulness of stage models. I also explain how stage models differ from process models, which also model change.
... (e.g., Hausman, 1998; Mäki, 2005, 2011; Goldthorpe, 2016); and (2) since there is no reference to unobservables, the link between the descriptions of a theory or model and the concept of truth is a matter of the different degrees of abstraction and idealization present in the design of explanatory models, not of establishing metaphysical commitments about social reality (Mäki, 2009; Sugden, 2013). ...
Article
Full-text available
Traditionally, scientific realism (the position according to which the success of the sciences depends on the truth of their contents) implies a distinction between observable and unobservable entities. However, in the case of the social sciences this distinction does not seem to hold. Does this mean that scientific realism is an impossible position for the social sciences? In this article I defend the idea that scientific realism is indeed possible within the philosophy of the social sciences, but in a form different from the traditional one. To do so, I follow the recent transformations of scientific realism in the philosophy of science in order to argue for a local analysis of the position. When we analyze the particular case of the social sciences, we see that the characterization of scientific realism can dispense with the observable/unobservable distinction and that the discussion should instead focus on the assumptions behind the generation of explanations, especially in the case of causal explanations. The main consequence of this turn is that, in the social sciences, the defense of scientific realism is located not at the level of ontological commitments but in the methodological commitments that guide the design of an inquiry.
... Models or theories that include all the variables would simply be unusable. Variants of this argument, that false models can still be useful and explanatory, have been defended in the past, for example by Uskali Mäki with his account of models as isolations (Mäki 2009; for an overview of other arguments, see Weisberg 2015). ...
Preprint
Full-text available
Climate change mitigation has become a paradigm case for both externalities in general and for the game-theoretic model of the Tragedy of the Commons (ToC) in particular. This situation is worrying as we have reasons to suspect that some models in the social sciences are apt to be performative such that they can become self-fulfilling prophecies. Framing climate change mitigation as a hardly solvable coordination problem may force us into a worse situation, by changing real-world behaviour to fit our model, rather than the other way around. But while this problem of the performativity of ToC has been noted in a recent paper in this journal by Matthew Kopec, we find his proposed strategies for dealing with their self-fulfilling nature lacking. Instead of relying on the idea that modelling assumptions are always strictly speaking false, we illustrate that the problem may be better framed as a problem of underdetermination between competing explanations. Our goal here is to provide a framework for choosing between this set of competing models that allows us to avoid a 'Russian Roulette'-like situation in which we gamble with existential risk.
... Idealization plays a key role in the methodology of other social sciences, especially of economics. For example, homo economicus is the result of a consistent abstraction-idealization process. One of the fundamental axioms of neoclassical economics, the law of diminishing marginal utility, followed from the Weber-Fechner law in psychophysics, which states that the growth in the subjectively perceived intensity of recurrent stimuli of the same physical intensity is always decreasing. ...
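For reference, a standard textbook statement of the Weber-Fechner law mentioned in this snippet (added here for clarity, not quoted from the cited text) is

$$S = k \ln\frac{I}{I_0}, \qquad \frac{dS}{dI} = \frac{k}{I},$$

where $S$ is the perceived intensity, $I$ the physical intensity of the stimulus, $I_0$ the detection threshold, and $k$ a constant. Since $dS/dI$ falls as $I$ grows, equal physical increments produce ever smaller perceived increments, which is the formal analogue of diminishing marginal utility invoked above.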
... Therefore, we decided to go with Blaug's own terminology and use the word "absolutist" for Blaug's standpoint. See The Methodology of Positive Economics: Reflections on the Milton Friedman Legacy (Mäki 2009) for the relevance of F53 today and Uskali Mäki's contribution to the tension between instrumentalism and realism. Especially the inclusion of the debate on truth in economics appears worthwhile to study in the light of our genealogy but would take us off track at this point. ...
... Models or theories that include all the variables would simply be unusable. Variants of this argument, that false models can still be useful and explanatory, have been defended in the past, for example by Uskali Mäki with his account of models as isolations (Mäki 2009; for an overview of other arguments, see Weisberg 2015). ...
Article
Climate change mitigation has become a paradigm case both for externalities in general and for the game-theoretic model of the Tragedy of the Commons (ToC) in particular. This situation is worrying, as we have reasons to suspect that some models in the social sciences are apt to be performative to the extent that they can become self-fulfilling prophecies. Framing climate change mitigation as a hardly solvable coordination problem may force us into a worse situation, by changing real-world behaviour to fit our model, rather than the other way around. But while this problem of the performativity of the ToC has been noted in a recent paper in this journal by Matthew Kopec, his proposed strategies for dealing with their self-fulfilling nature fall short of providing an adequate solution. Instead of relying on the idea that modelling assumptions are always strictly speaking false, this paper shows that the problem may be better framed as a problem of underdetermination between competing explanations. Our goal here is to provide a framework for choosing between this set of competing models that allows us to avoid a ‘Russian Roulette’-like situation in which we gamble with existential risk.
... A useful review of several of these issues can be found in Mäki (2009). Lawson (1997) offers the most comprehensive discussion of the question of realism in economics. I have argued elsewhere that this is a misnomer, since Pasinetti includes Sraffa, who, although personally and intellectually close to John Maynard Keynes, can hardly be called a Keynesian (see Marcuzzo, 2014). ...
Chapter
Full-text available
Recent economic and financial crises have exposed mainstream economics to severe criticism, bringing present research and teaching styles into question. Building on a solid and vivid tradition of economic thought, this book challenges conventional thinking in the field of economics. The authors turn to the work of Luigi Pasinetti, who proposed a list of nine methodological and theoretical ideas that characterize the Classical Keynesian School. Drawing inspiration from both Keynes and Sraffa, this school has forged a long-standing and ambitious research programme often advocated as a competing paradigm to mainstream economics. Overall, the Classical Keynesian School provides a comprehensive analytical framework into which most non-mainstream schools of thought can be integrated. In this collection, a group of leading scholars critically assess the nine main ideas that, in Pasinetti's view, characterize the Classical-Keynesian approach, evaluating their relevance for both the history of economics and for present economic research.
... Models or theories that include all the variables would simply be unusable. Variants of this argument, that false models can still be useful and explanatory, have been defended in the past, for example by Uskali Mäki with his account of models as isolations (Mäki 2009; for an overview of other arguments, see Weisberg 2015). ...
Preprint
Full-text available
Climate change (CC) has become a paradigm case for externalities in general and for the Tragedy of the Commons (ToC) model by Hardin in particular. This is worrying as we have reasons to suspect that models like the ToC are performative, such that they might become self-fulfilling prophecies. In this paper, we aim to improve on a strategy proposed by Matthew Kopec for coping with the self-fulfilling nature of the ToC. First, we show that Kopec’s strategy of emphasising that the ToC relies on strictly speaking false assumptions is unlikely to succeed. To construct a more promising strategy, we argue that the underdetermination argument implies that the employment of a specific model is an active choice guided by pragmatic criteria. Furthermore, picturing underdetermination in the case of CC as a form of Russian Roulette provides a rationale for choosing between these underdetermined models.
... Among these, Daniel Hausman and Alex Rosenberg set out to test and stretch the heritage of logical empiricism; Bert Hamminga imported and applied the structuralist framework of Sneed and Stegmueller; I started moving toward a self-made scientific realism tailored for the peculiarities of economics; there were a few others such as Maurice Lagueux and Margaret Schabas (regarding the approaches of Hausman, Rosenberg and myself, see e.g. Hausman, 2009, 2017; Mäki, 1996, 2009c; Mireles-Flores, 2008; Rosenberg, 2009). ...
Article
Full-text available
The paper sketches a story about how and why the field of economic methodology / philosophy of economics emerged (as a further step in specialization), and how it has evolved intellectually and institutionally. It considers the field as an institutionalized form of higher-order reflection on the discipline of economics and suggests that such reflection, also in its pre-field form, has recurring triggering conditions (e.g. alleged failures in economics, fundamental controversy, launch of new research style) and functions (e.g. criticism, defense, programmatic statement). It lists topics of inquiry that derive from the concerns economists and others have about the discipline (as a modelling discipline and a policy science). It offers consolation to those who worry about effectively addressing academic economists as the primary audience, suggesting there are other valuable audiences (such as philosophers of science and policy makers). It gives examples of how critical conversation will ensure progress in the field.
... As we know, according to M. Weber the ideal type is a research "utopia" that has no formal analogue in any particular slice of social reality [2]. The ideal type as a theoretical construct is formed by analyzing empirical reality from a specific research perspective [3]; the selected cognitive focus on phenomena connects isolated single events, characters and actions into a "space of mental connections devoid of internal contradictions" [4]. ...
... "Capturing this truth does not require any de-idealization by relaxing those assumptions." In Mäki, U. (2009a), "Realistic realism about unrealistic models", in A Handbook of the Philosophy of Economics, ed. Harold Kincaid and Don Ross. ...
Article
Full-text available
In this paper we attempt to analyze the assumptions that "make possible" Uskali Mäki's possible realism. Only by adhering to certain assumptions can we maintain, as this author intends, that truth can be predicated of economic models. These assumptions concern reality, models, and the notion of truth. We introduce possible realism within the framework of scientific realism and point out the importance of ontological realism. According to the latter, Mäki's proposal upholds not only the existence of an external world but also the view that this world is formed by a structure composed of mechanisms. We present the MISS account of models, on which models are defined in terms of isolations and idealizations, representations, and surrogate systems. We examine in depth the notions of truth and resemblance and the relation between them. We explain why Mäki holds that models are truth bearers.
... For the moment, we will leave aside the issue of whether a digital twin of a huge and complex system like the energy economy is feasible or not. To describe these, we rely on Mäki's taxonomy of model realism [58], and distinguish between those models that can only serve as substitutes for reality, and those which qualify as surrogate representations of reality. Mäki characterises substitute models as those that are disconnected from the real world but which nevertheless may be used as test beds for developing novel techniques. ...
Article
Full-text available
Energy economy models are central to decision making on energy and climate issues in the 21st century, such as informing the design of deep decarbonisation strategies under the Paris Agreement. Designing policies that are aimed at achieving such radical transitions in the energy system will require ever more in-depth modelling of end-use demand, efficiency and fuel switching, as well as an increasing need for regional, sectoral, and agent disaggregation to capture technological, jurisdictional and policy detail. Building and using these models entails complex trade-offs between the level of detail, the size of the system boundary, and the available computing resources. The availability of data to characterise key energy system sectors and interactions is also a key driver of model structure and parameterisation, and there are many blind spots and design compromises that are caused by data scarcity. We may soon, however, live in a world of data abundance, potentially enabling previously impossible levels of resolution and coverage in energy economy models. But while big data concepts and platforms have already begun to be used in a number of selected energy research applications, their potential to improve or even completely revolutionise energy economy modelling has been almost completely overlooked in the existing literature. In this paper, we explore the challenges and possibilities of this emerging frontier. We identify critical gaps and opportunities for the field, as well as developing foundational concepts for guiding the future application of big data to energy economy modelling, with reference to the existing literature on decision making under uncertainty, scenario analysis and the philosophy of science.
Article
This is a review article of a book celebrating Pasinetti’s 90th birthday in 2020, Pasinetti and the Classical Keynesians: Nine Methodological Issues, edited by Enrico Bellino and Sebastiano Nerozzi. His demise in 2023 is, of course, regrettable, but it is no reason to refrain from giving a view of a book that was published in his lifetime. The view expressed in this Review Article considers Pasinetti a vintage member of the Cambridge Keynesian Group, which encompasses, in addition to those near Keynes, those second-generation scholars who also made a contribution, sometimes decisive, to various aspects of Keynesian Economics.
Article
This paper seeks to convince historians that investigating how tractability has shaped individual and collective modeling choices in economics is a valuable endeavor. To do so, I first survey the economic methodology literature on tractability, one that grew out of methodologists’ attempts to explain why their authors make unrealistic assumptions. I then compare these accounts with the few instances where 20th century economists discussed tractability explicitly. This short survey suggests that there is a need for historians to document the collective dynamics at work when tractability motives are invoked. I suggest that disentangling theoretical, empirical and computational tractability might be fruitful, but also difficult. I ask how and why choices made for tractability purposes meant to be idiosyncratic and temporary often become collectively entrenched, sometimes creating “tractability traps.” Finally, I consider the existence of “tractability standards” that differ across time and fields.
Article
Full-text available
Specialisation, by seeking theoretically deeper explanations or more accurate predictions, is common in the sciences. It typically involves splitting, where one model is further divided into several or even hundreds of narrow-scope models. The Information Systems (IS) literature does not discuss such splitting. On the contrary, many seminal IS studies report that a narrow scope is less strong, less interesting, or less useful than a wider scope. In this commentary, we want to raise the awareness of the IS community that in modern scientific progress, specialisation, an activity that generally narrows the scope and decreases the generalisability of a hypothesis, is important. The philosophy of science discusses such positive developments as splitting and trading off a wide scope in favour of accuracy. Narrowing the scope may increase value, especially in sciences where practical applicability is valued. If the IS community generally prefers a wider scope, then we run the risk of not having the information necessary to understand IS phenomena in detail. IS research must understand splitting, how it results in narrowing the scope, and why it is performed for exploratory or predictive reasons in variance, process, and stage models.
Chapter
Although the virtues are implicit in Catholic Social Teaching, they are too often overlooked. In this pioneering study, Andrew M. Yuengert draws on the neo-Aristotelian virtues tradition to bring the virtue of practical wisdom into an explicit and wide-ranging engagement with the Church's social doctrine. Practical wisdom and the virtues clarify the meaning of Christian personalism, highlight the irreplaceable role of the laity in social reform, and bring attention to the important task of lay formation in virtue. This form of wisdom also offers new insights into the Church's dialogue with economics and the social sciences, and reframes practical political disagreements between popes, bishops, and the laity in a way that challenges both laypersons and episcopal leadership. Yuengert's study respects the Church's social tradition, while showing how it might develop to be more practical. By proposing active engagement with practical wisdom, he demonstrates how Catholic Social Teaching can more effectively inform and inspire practical social reform.
Article
Full-text available
The aim of the paper is to show that the personalist economy – by referring to the idea of a person, containing and expressing all the dimensions of being human in relation to the world of people and things – provides the ability to integrate issues of ethics and economics. To achieve this goal, first, the concept of personalist economics will be presented. Then, ideas common to economics and the ethics of space, which can serve as a basis for integrating their issues, will be identified and characterized. Finally, the author tries to present the consequences that may result from such an integration, both for the economy as a science and as a practice of everyday life.
Article
Full-text available
Much has been written about Adam Smith’s economics, but one unanswered question is whether his method of inquiry is a modelling approach. It is therefore interesting to investigate to what extent Smith can be described as an economic modeller. Such an investigation is presented in this paper. By studying elements of modelling methods developed by those who influenced Smith, as well as his own ways of doing economics, together with his general insights on how science, including economics, should be practiced, we show that his method of inquiry is largely based on modelling empirical phenomena.
Article
In this innovative book, Hasok Chang constructs a philosophy of science for 'realistic people' interested in understanding and promoting the actual practices of inquiry in science and other knowledge-focused areas of life. Inspired by pragmatist philosophy, he reconceives the very notions of reality and truth on the basis of his concept of the 'operational coherence' of epistemic activities, and offers new pragmatist conceptions of truth and reality as operational ideals achievable in actual scientific practice. Rejecting the version of scientific realism that is concerned with claiming that our theories correspond to an ultimate reality, he proposes instead an 'activist realism': a commitment to do all that we can actually do to improve our knowledge of realities. His book will appeal to scholars and students in philosophy, science and the history of science, and all who are concerned about the place of science and empirical truth in society.
Chapter
This chapter claims that how economists actually treat the consumption function, both theoretically and empirically, seems to be at odds with traditional structural realism. It argues that while economics aims to find out the way the world actually is, economic theory, including theoretical staples such as the consumption Euler equation, does not always state literal truths. Yet, such methods persist. The chapter’s investigation will reveal that the Euler equation is deployed as part of a realist methodology to identify scale-dominant behaviors. These analyses permit the economist to conduct counterfactual reasoning and sometimes (via policy) to intervene on the world itself; the implication is that when this is successful, the economist has uncovered real structure.
Chapter
Full-text available
This chapter sets out to answer two questions: first, what is the nature of the economy as a general reality? and, second, what are its constituents? The chapter specifically delves into the nature of the economy and only enumerates the components that will be analyzed in subsequent chapters. It takes into account the explorations into the nature of the economy undertaken by Aristotle, Cartwright, Lawson, and Mäki, finding some general coincidences among them. The conclusion is that the economic realm consists of accidental entities or properties that cannot be separated from underlying variable structures, capacities, and institutions, or from the agents that have diverse motivations influencing economic actions. This contingent realm calls for local analyses.
Chapter
The metaphysical analysis of the economy and economic entities calls for a definition of the metaphysical “categories” of being that will be used in this analysis. This chapter will first introduce the Aristotelian categories, but it will also explore the metaphysical categories posited by Nancy Cartwright, Tony Lawson, and Uskali Mäki (in alphabetical order). The aim of this chapter is not to present all the ontological insights of these proposals, but to glean the categories that they employ to consider their potential use in this book. These authors take a critical look at contemporary standard economics. It is not the purpose of this chapter to focus on their criticisms, but only to pick up the concepts that might help to build an ontological analysis.
Book
Full-text available
Mapping Mainstream Economics: Genealogical Foundations of Alternativity seeks to establish a definition of mainstream economics, and by extension the alternatives to it, by adopting a genealogical approach: tracing the methodological development of the economic mainstream through its ancestry, which allows for a definition of the mainstream that is separate from politically charged categories or gridlocked academic arguments between received schools of thought. The book follows the evolution of the economic mainstream through four major transformations of the discipline: from political to analytical economics, debates around a logical empiricist economics, the consolidation of neoclassical economics, and the recent expansion of the mainstream. For each of these steps, the key point of departure is explored, illustrated through the work of leading authors at the time. Thus, the book draws on recent research from the history of economic thought and debates the crucial role of historic concepts of economics for alternativity in the field. To put the approach into practice, it examines the relation between today’s mainstream economics and two of its alternatives: ecological economics and degrowth. Finally, the book reflects on recent exciting developments in the discourse on alternativity and sheds light on some distant relatives of today’s mainstream. This book marks a significant contribution to the literature on the debates around the state and nature of mainstream, alternative, and heterodox economics.
Article
Full-text available
As economics became a model-based science, ontological nature, cognitive status, and practical uses of economic models came under the spotlight of philosophers of economics and economic methodologists. However, what was strikingly missing was the interest in the cultural dimension of economic modeling. Some calls for thematizing "cultural framework" (Mäki), "enculturation" (Goldschmidt, Remmele), or "culture patterns" (Benton) of economic models have appeared in recent years, and this paper aims at addressing such calls. To this end, we start with the artifactual approach to economic models (Morgan, Knuuttila, Halsmayer), which cuts across the idealization-construction debate, and complement this approach with the cultural-semiotic component, drawing from the symbolic anthropology of Clifford Geertz. We thus come up with an interpretation of economic models as cultural artifacts, which enables us to address the insufficiently explored question of style in economic modeling using Nelson Goodman's semiotic account of style.
Article
Robert Lucas' ([1972b] 1981a) article on the neutrality of money represented the first effective challenge to Samuelson’s neoclassical synthesis methodological separation between static microeconomic optimisation and macroeconomic dynamics. Lucas rejected disequilibrium price dynamics, as expressed by the Walrasian tâtonnement and auctioneer mechanisms. Lucas’ new treatment of equilibrium as an expectational concept, determined by the rational behaviour of information processing agents, was not restricted to market clearing competitive economies. Lucas’ effort to compare alternative rational expectations models of price stickiness (including his 1972 original formulation) led him to stress the notion of descriptive realism of the models’ main assumptions, which played an important role in his original discussion of model robustness.
Article
In the late 1970s Paul Samuelson drafted the outline of a paper, never published, with a critical assessment of the theoretical innovations of postwar development economics. He found it a “vital” but essentially “not tractable” subject, with a “voluminous” and “repetitive” literature. This article discusses how that assessment fits in Samuelson’s published writings on economic development, throughout several editions of his textbook Economics, and in papers he wrote before and after that assessment. Increasing returns posed a main analytical hurdle, together with the elusive attempt to provide “laws of motion” of economic development. Samuelson’s notion of “tractability” may be traced back to Peter Medawar’s well-known definition of science as the “art of the soluble.”
Chapter
The goal of this paper is to reconstruct Nagel’s approach to idealization in the sciences and present his views as a viable option. In a nutshell, the theory that emerges can be described as follows: There are various types of idealization, which can be found in theoretical and experimental laws, and which, according to Nagel, play various important epistemic roles. In particular, they help organize complex knowledge and allow for approximations to truth. A cognate of idealization, which Nagel does not refer to under this label but describes in terms of assimilation to the familiar and in terms of analogy, is key to scientific models, whose value is primarily heuristic in nature. The fact that theories involve idealization supports an instrumentalist perspective on science, a perspective which also determines a particular scope of the account of idealization. By these lights, particular types of questions simply do not arise, for instance: questions pertaining to the alleged peculiarity of scientific representation, the truth of law statements, etc.
Article
The history of modern economics abounds with pleas for more pluralism as well as pleas for more unification. These seem to be contradictory goals, suggesting that pluralism and unification are mutually exclusive, or at least that they involve trade-offs with more of one necessarily being traded off against less of the other. This paper will use the example of Paul Samuelson's Foundations of Economic Analysis (1947) to argue that the relationship between pluralism and unification is often more complex than this simple dichotomy suggests. In particular, Samuelson's Foundations is invariably presented as a key text in the unification of modern economics during the middle of the twentieth century; and in many ways that is entirely correct. But Samuelson's unification was not at the theoretical (causal and explanatory) level, but rather at the purely mathematical derivational level. Although this fact is recognized in the literature on Samuelson, what seems to be less recognized is that for Samuelson, much of the motivation for this unification was pluralist in spirit: not to narrow scientific economics into one single theory, but rather to allow for more than one theory to co-exist under a single unified derivational technique. This hidden pluralism will be discussed in detail. The paper concludes with a discussion of the implications for more recent developments in economics.
Article
Full-text available
The aim of this article is to question the epistemic presuppositions of applying behavioural science in public policymaking. Philosophers of science who have examined the recent applications of the behavioural sciences to policy have contributed to discussions on causation, evidence, and randomised controlled trials. These have focused on epistemological and methodological questions about the reliability of scientific evidence and the conditions under which we can predict that a policy informed by behavioural research will achieve the policymakers’ goals. This paper argues that the philosophical work of Helen Longino can also help us to have a better and fuller understanding of the knowledge which the behavioural sciences provide. The paper advances an analysis of the knowledge claims that are made in the context of policy applications of behavioural science and compares them with the behavioural research on which they are based. This allows us to show that behavioural policy and the debates accompanying it are based on an oversimplified understanding of what knowledge behavioural science actually provides. Recognising this problem is important as arguments that justify reliance on the behavioural sciences in policy typically presume this simplification.
Chapter
We first map the various concepts used in the book into critical realism (CR) domains. We then discuss the implications of behavioral TCE. Particularly, we discuss the mutual relevance of behavioral TCE and international business research. On the one hand, behavioral TCE is relevant to international business because the OLI paradigm incorporates a TCE element and can be similarly rendered behavioral and subsumed under behavioral TCE after being ontologized in CR, and because the process of rendering OLI behavioral resolves many issues with traditional OLI. On the other hand, international business has the potential to become the base camp for behavioral TCE due to various types of distance that international business settings create. A fundamental deficiency in traditional TCE is its unawareness and thus near-total neglect of the implications of distances, despite the fact that distances affect cognitive bounds. This chapter subsequently suggests some important future directions. We highlight that cultural distance is a symmetrical construct and recommend that its productive use lies in how it is perceived in emic-etic interactions. We conclude by suggesting that a behavioral theory is a self-conscious one, which treats the firm as an explanans and thus an ‘instrument for adaptation’ in value creation and competitiveness enhancement.
Article
Full-text available
The recent literature on economic models has rejected the traditional requirement that their epistemic value necessarily depends on their offering actual explanations of phenomena. Contributors to that literature have argued that many models do not aim at providing how-actually explanations, but instead how-possibly explanations (HPEs). However, how to assess the epistemic value of HPEs remains an open question. We present a programmatic approach to answering it. We first introduce a conceptual framework that distinguishes how-actually explanations from how-possibly explanations and that further differentiates between epistemic and objective how-possibly explanations. Secondly, we show how that framework can be used for methodological appraisal as well as for understanding methodological controversies.
Article
Full-text available
This paper adds to the philosophical literature on mechanistic explanation by elaborating two related explanatory functions of idealisation in mechanistic models. The first function involves explaining the presence of structural/organizational features of mechanisms by reference to their role as difference-makers for performance requirements. The second involves tracking counterfactual dependency relations between features of mechanisms and features of mechanistic explanandum phenomena. To make these functions salient, we relate our discussion to an exemplar from systems biological research on the mechanism for countering heat shock—the heat shock response (HSR) system—in Escherichia coli (E. coli) bacteria. This research also reinforces a more general lesson: ontic constraint accounts in the literature on mechanistic explanation provide insufficiently informative normative appraisals of mechanistic models. We close by outlining an alternative view on the explanatory norms governing mechanistic representation.
Article
We undertake a comprehensive descriptive and comparative ontology of capital in the history of economic thought post-1870. Beginning with the pioneering contributions of Menger, Böhm-Bawerk, Clark and Knight, we reassess the familiar dualistic ontology of capital that contrasts 'materialist' and 'fundist' approaches. Advancing beyond this dualism, we find that the ontology of capital is an evolving mosaic presenting many nuances and overlapping with other ontologies concerning notions of time and atomism. There is no substitute for examining the diverse theories, causal explanations and conceptual systems in which capital is embedded. In episodic capital controversies, economists have employed distinctive metaphors of capital revealing hidden presuppositions that imply specific functional and dispositional properties of capital. Ontological comparison can uncover implicit ideas about capital, as evidenced in the metaphors used by Böhm-Bawerk, Hayek and Robinson. The benefits of a descriptive and comparative approach are further illustrated in our critical appraisal of the modern monetary ontology of capital associated with Piketty, business finance and growth accounting. Differentiated by their specific ontologies, each explanation of capital in market economies should be regarded as at best a very partial account, though our assessment shows that some explanations are relatively more fragmentary and impoverished than others.
Article
Full-text available
Theoretical laws need to be conjoined with auxiliary assumptions in order to be empirically testable, whether in natural or social science. A particularly heated debate has been developing over the nature and role of these assumptions in economic theories. The so-called “F(riedman)-Twist” (“the more significant the theory, the more unrealistic the assumptions”, Friedman 1953) as well as some later criticisms by authors like Musgrave, Lawson, Mäki and Cartwright will be examined. I will explore the apparent conflict between the Popperian desideratum to pursue the independent testability of auxiliary assumptions and the idealizational theoretical means needed to isolate causal variables.
Article
The hypothetico-deductive (H-D) method is reported to be common in information systems (IS). In IS, the H-D method is often presented as a Popperian, Hempelian, or natural science method. However, there are many fundamental differences between what Popper or Hempel actually say and what the alleged H-D method per Hempel or per Popper means in IS. To avoid possible misunderstanding and conceptual confusion about the basic philosophical concepts, we explain some of these differences, which are not mentioned in IS literature describing the H-D model. Due to these distinctive differences, the alleged H-D method per Hempel or per Popper in IS cannot be regarded as the H-D model per Hempel or per Popper. Further, the H-D model is sometimes confused with another model in IS, the deductive-nomological (D-N) model of explanation. Confusing the H-D and D-N methods can also produce stagnation in the fundamental methodological thinking in IS. As one example, the H-D model (per Hempel or per Popper) does not require hypotheses to be based on existing theories or literature. As a result, misunderstanding the H-D model in IS may seriously limit new hypothesis or theory development, as the H-D model in the philosophy of science allows guessing and imagination as the source for hypotheses and theories. We argue that although IS research (1) generally does not follow the H-D method (per Hempel or per Popper), and (2) should not follow the H-D method, (3) we can still learn from the H-D method and criticisms of it. To learn from the H-D method, we outline method of hypothesis (MoH) approaches for further discussion. These MoH approaches are not hypothetico-deductive, but hypothetico-inductive-qualitative or hypothetico-inductive-statistical. The former MoH endeavors to be suitable for qualitative research, while the latter is aimed at statistical research in IS.
Chapter
This chapter draws a sharp distinction between Friedman’s and Lucas’s methodologies. Friedman’s famous positivist study is the focus, used to consider his stance regarding both the role models play in causal understanding and the relationship between causal understanding and theoretical assumptions. The framework for analyzing Friedman’s theorizing strategy is Frank H. Knight’s and his predecessor Max Weber’s social scientific methodologies. By this comparison Friedman’s methodological strategy is regarded as a special mix of descriptive inaccuracy and causal inadequacy. The abandonment of causal understanding follows from Friedman’s mistake of taking the poor descriptive performance of models to imply the dismissal of real entity properties. Lucas in his microfoundations project restored the connection between theories and socio-economic reality at the level of assumptions, which he took to be necessary for establishing causal understanding. Whilst Lucas also minimized the descriptive capabilities of his models, by preserving the relevant real entity properties he made it possible for his theory to contribute to the understanding of the causal structure underlying large-scale fluctuations.
Chapter
This summarizing chapter places our main conclusions into the context of hermeneutics and highlights some shortcomings of the most influential Lucas interpretations so as to stir up controversy. It is argued, first, that any interpretation in the history of economic thought and the methodology of economics ought to rest upon a textual basis as broad as possible, including the available unpublished works; and second, that a prerequisite for any sensible interpretive effort is an adequate analytical engine conceived as a subset of our pre-given knowledge. Here semirealism has served as a framework appropriate for scrutinizing the relationship between macro-level phenomena and economic entities and has facilitated a realist interpretation of Lucas’s microfoundations project. On this basis some neglected aspects of Lucas’s theorizing practice have come to the fore. Interpreting micro-founded economics as a realist endeavour has some implications as to the future of economics. For Lucas building economics upon the choice-theoretic framework has held the promise of bringing the history of economics to an end.
Chapter
In this chapter, I introduce and defend the hypothesis that all (formalized) models in sociology are toy models. I claim that all models in sociology necessarily include many Aristotelian and Galilean idealizations, and that they are all extremely simple in that they represent only a small number of explanatory factors. All models that have been developed in sociology hitherto are toy models. All models that will be developed in the future will be toy models – even if they are run on high performance computers (which has yet to be the case). I will call this hypothesis the weak only-toy-models hypothesis and distinguish it from the strong only-toy-models hypothesis. The latter states that all (formalized) models in the social sciences are toy models. This includes other disciplines such as political science and economics, e.g. neoclassical economics models. To defend the strong only-toy-models hypothesis would be to go beyond the scope of this article. Drawing on modern sociological theory and the philosophy of the social sciences, I defend the weak only-toy-models hypothesis. In the discussion, I propose the concept of an extensive model as an antonym to the concept of the toy model.
Article
Full-text available
The present paper tries to show that, in the discussion on whether or not modelling is the better way to capture truth about the social world, methodology is not what is mainly at stake. We will maintain that the main question in this discussion is not principally methodological but ontological. As a representative of the "to model" position we will refer to Uskali Mäki's Possible Realism, and as one of the "not to model" position we will use Tony Lawson's Critical Realism. What will be maintained is that the main differences between these positions, as regards the methodology for accessing the social world, lie in their different ontologies of the social realm. We will also try to find out if there is any possibility of an "in-between" position, or at least the chance of a dialogue between these different epistemological trends.
Article
Full-text available
This article applies an ontology-based approach to economic experiments, emphasizing their differences with respect to physical science experiments. To contextualize our discussion, a conciliatory Weberian view of the similarities and differences between natural and social sciences is provided. After that, some ontological features of the social sciences' domain are highlighted, together with their problematic effect on experimental economics. Specifically, we focus on human beings' representational capacities and intentionality, their cultural and conventionally mediated forms of social interaction, and the holistic openness, instability and uncertainty of the social world. Finally, we emphasize the severe under-determination of theory by evidence affecting social science, as well as the related problems of empirical ambiguity, confirmatory biases and propensity to pseudoscientific practices in experimental economics.
Chapter
In this chapter, we address the question of what health economic models represent. Are they realistic? And, does model realism matter? Or, is model usefulness in terms of informing pricing, reimbursement, and prescribing decisions all policymakers care about? The usefulness of models is circumscribed given that: (1) market failure is inherent in healthcare and (2) models oversimplify the preference structure underlying choices. We suggest, however, that models which employ the ceteris paribus clause can be useful in order to isolate factors that play a role in healthcare decision-making and ultimately characterize agents’ multiattribute utility functions through discrete choice experiments. As a result, policymakers gain important knowledge about decision criteria in the healthcare system.
Article
Full-text available
The most common argument against the use of rational choice models outside economics is that they make unrealistic assumptions about individual behavior. We argue that whether the falsity of assumptions matters in a given model depends on which factors are explanatorily relevant. Since the explanatory factors may vary from application to application, effective criticism of economic model building should be based on model-specific arguments showing how the result really depends on the false assumptions. However, some modeling results in imperialistic applications are relatively robust with respect to unrealistic assumptions.
Article
Full-text available
All economic models involve abstractions and idealisations. Economic theory itself does not tell us which idealizations are truly fatal or harmful to the result and which are not. This is why much of what is seen as a theoretical contribution in economics is constituted by deriving familiar results from different modelling assumptions. If a modelling result is robust with respect to particular modelling assumptions, the empirical falsity of these particular assumptions does not provide grounds for criticizing the result. In this paper we demonstrate how derivational robustness analysis does carry epistemic weight, and we answer criticisms concerning its non-empirical nature and the problematic form of the required independence of the ways of derivation. The epistemic rationale and importance of robustness analysis also challenge some common conceptions of the role of theory in economics.
Article
Full-text available
Musgrave (1981) proposed a typology of assumptions, developed further by Mäki (2000), to defend, against those economists who consider only predictive success to be relevant for this purpose, the idea that the truth of assumptions is often important when evaluating economic theories. In this paper I propose a new framework for this typology that sheds further light on the issue. The framework consists of a distinction between first-order assumptions that state the absence or lack of effect of some factor F, and second-order assumptions that explicate the purposes for which or the reasons why particular first-order assumptions are imposed. Given this distinction, Musgrave's main contention can be reformulated as the claim that, even though the falsity of first-order assumptions is often unproblematic, it is important that the second-order assumptions be true. I go on to introduce the notion of a tractability assumption, which is a second-order assumption according to which a first-order assumption is imposed in order to make a particular problem tractable. It is argued that a realist will want to relax a first-order assumption imposed for reasons of tractability as such assumptions are not even approximately true. These amendments to the Musgrave-Mäki typology are suggested in order to improve our understanding of what moves scientists when they choose particular first-order assumptions, many of which are false, and in order to argue that the practice of doing so can be supported from a realist perspective of science.
Article
Full-text available
Economic models often include unrealistic assumptions. This does not mean, however, that economists lack a concern for the truth of their assumptions. Unrealistic assumptions are frequently imposed because the effects are taken to be negligible or because the problem at hand is intractable without them. Using the Musgrave-Mäki typology as the point of departure, these claims are defended with respect to theories proposed by Solow, Hall and Roeger concerning productivity growth and the mark-up. Since they are unobservable, their values need to be inferred from the values of observable variables. Assumptions such as perfect competition and constant returns to scale are used for making this inference or measurement problem tractable. Other assumptions are justified in terms of negligibility. These findings support the fecundity of the (amended) Musgrave-Mäki typology of assumptions - including the notion of a tractability assumption proposed here. Finding ways of relaxing tractability assumptions turns out to be an important source of progress in economics.
Article
Full-text available
A comparison of models and experiments supports the argument that although both function as mediators and can be understood to work in an experimental mode, experiments offer greater epistemic power than models as a means to investigate the economic world. This outcome rests on the distinction that whereas experiments are versions of the real world captured within an artificial laboratory environment, models are artificial worlds built to represent the real world. This difference in ontology has epistemic consequences: experiments have greater potential to make strong inferences back to the world, but also have the power to isolate new phenomena. This latter power is manifest in the possibility that whereas working with models may lead to 'surprise', experimental results may be unexplainable within existing theory and so 'confound' the experimenter.
Chapter
This article defines the concept of realism and explores its implications for ontology, defending the idea of ontological realism and its relevance to economics, while rejecting the idea of some special ‘realist ontology’ that informs us about the ways of the real world. The main focus is on what it is for the world (its constituents, structure, and ways of functioning) to exist. Economics-relevant scientific realism suggests that much of the social world is characterized by non-causal science-independence. Implications of this are outlined for causation, social construction, economics-dependence of the economy, modelling, and truth in economics.
Book
Nancy Cartwright argues for a novel conception of the role of fundamental scientific laws in modern natural science. If we attend closely to the manner in which theoretical laws figure in the practice of science, we see that despite their great explanatory power these laws do not describe reality. Instead, fundamental laws describe highly idealized objects in models. Thus, the correct account of explanation in science is not the traditional covering law view, but the ‘simulacrum’ account. On this view, explanation is a matter of constructing a model that may employ, but need not be consistent with, a theoretical framework, in which phenomenological laws that are true of the empirical case in question can be derived. Anti‐realism about theoretical laws does not, however, commit one to anti‐realism about theoretical entities. Belief in theoretical entities can be grounded in well‐tested localized causal claims about concrete physical processes, sometimes now called ‘entity realism’. Such causal claims provide the basis for partial realism and they are ineliminable from the practice of explanation and intervention in nature.
Article
The paper seeks to offer [1] an explication of a concept of economics imperialism, focusing on its epistemic aspects; and [2] criteria for its normative assessment. In regard to [1], the defining notion is that of explanatory unification across disciplinary boundaries. As to [2], three kinds of constraints are proposed. An ontological constraint requires an increased degree of ontological unification in contrast to mere derivational unification. An axiological constraint derives from variation in the perceived relative significance of the facts explained. An epistemological constraint requires strong fallibilism acknowledging a particularly severe epistemic uncertainty and proscribing against over-confident arrogance.
Article
Explanatory unification—the urge to “explain much by little”—serves as an ideal of theorizing not only in natural sciences but also in the social sciences, most notably in economics. The ideal is occasionally challenged by appealing to the complexity and diversity of social systems and processes in space and time. This article proposes to accommodate such doubts by making a distinction between two kinds of unification and suggesting that while such doubts may be justified in regard to mere derivational unification (which serves as a formal constraint on theories), it is less justified in the case of ontological unification (which is a result of factual discovery of the actual degree of underlying unity in the world).
Article
A newly emerged field within economics, known as geographical economics, claims to have provided a unified approach to the study of spatial agglomerations at different spatial scales by showing how these can be traced back to the same basic economic mechanisms. We analyse this contemporary episode of explanatory unification in relation to major philosophical accounts of unification. In particular, we examine the role of argument patterns in unifying derivations, the role of ontological convictions and mathematical structures in shaping unification, the distinction between derivational and ontological unification, the issue of how explanation and unification relate, and finally the idea that unification comes in degrees.
Article
One prominent aspect of recent developments in science studies has been the increasing employment of economic concepts and models in the depiction of science, including the notion of a free market for scientific ideas. This gives rise to the issue of the adequacy of the conceptual resources of economics for this purpose. This paper suggests an adequacy test by putting a version of free market economics to self-referential scrutiny. The outcome is that either free market economics is self-defeating, or else there must be two different concepts of free market, one for the ordinary economy, the other for science. Either conclusion imposes limits on the applicability of the ordinary economic concept of the market to the study of science.
Article
In the recent writing on the methodology of economics there has been a shortage of systematic analyses of realism and explanation and an absence of analysis of their inter-relationships. This article attempts to provide a detailed account of the structure of economic explanation built upon realist and essentialist premises. Invisible-hand explanations characteristic of Austrian economics and the question of using the quantity theory for explaining inflation are used as illustrations. From another perspective, the analysis amounts to providing a reconstructive interpretation of the deep structure of the Austrian approach to explaining economic phenomena. It is suggested that realist-essentialist explanations be analysed in terms of redescription, reduction, ontological identification, and unification: Austrian (as well as some other) explanations can be analysed as reductive theoretical redescriptions of economic phenomena using ontological identification statements and pursuing ontological unification of apparently diverse phenomena. In connection with these and other issues, questions related to the deductive-nomological model, prediction, the implications of subjectivism, and the role of common sense are also discussed.
Article
The cultural and epistemic status of science is under attack. Social and cultural studies of science are widely perceived to offer evidence and arguments in support of an anti-science campaign. They portray science as a mundane social endeavour, akin to religion and politics, with no privileged access to truthful information about the (socially unconstructed) real world. Science is under threat and needs defence. Old philosophical legitimations have lost their bite. Alarm bells ring, new troops have to be mobilised. Call economics, the good old friend of the status quo, depicting it as a generally beneficial social order while accommodating a rather mundane picture of human behaviour. In contrast to constructivist and relativist sociology of scientific knowledge, economic accounts of science seek to provide a rigorous defence of the cultural and epistemic legitimacy of science by accommodating plausible elements in the sociological accounts and by embedding them in invisible-hand arguments about the functioning of some market-like structure within science. Viewed through economic spectacles, science re-emerges from the ashes as stronger and more beautiful than ever. A spectator raises an innocent question: is economics itself strong and beautiful enough to offer such alleviating services? In order to examine the emerging issue of disciplinary credibility, we need to look at economics itself more closely, and we need to address traditional issues in the philosophy of science as well as less traditional issues of reflexivity. We will see that the above caricature concerning the role of economics in the science wars calls for heavy qualifications if not wholesale rejection (no comment here on the caricatured role of the SSK).
Article
This article shows how the MISS account of models—as isolations and surrogate systems—accommodates and elaborates Sugden’s account of models as credible worlds and Hausman’s account of models as explorations. Theoretical models typically isolate by means of idealization, and they are representatives of some target system, which prompts issues of resemblance between the two to arise. Models as representations are constrained both ontologically (by their targets) and pragmatically (by the purposes and audiences of the modeller), and these relations are coordinated by a model commentary. Surrogate models are often about single mechanisms. They are distinguishable from substitute models, which are examined without any concern about their connections with the target. Models as credible worlds are surrogate models that are believed to provide access to their targets on account of their credibility (of which a few senses are distinguished).
Article
In order to examine the fit between realism and science, one needs to address two issues: the unit of science question (realism about which parts of science?) and the contents of realism question (which realism about science?). Answering these questions is a matter of conceptual and empirical inquiry by way of local case studies. Instead of the more ordinary abstract and global scientific realism, what we get is a doubly local scientific realism based on a bottom-up strategy. Representative formulations of the former kind are in terms of the truth and reality of the posits of current science, in terms of warranted belief, and in terms of mind-independent unobservable entities. Using illustrations mainly from the social sciences, doubly local scientific realism denies the global applicability of such formulations and seeks to make adjustments in their elements in response to information about local units of science: it is sufficient for a realist to give the existence of an entity (and the truth of a theory) a chance, while in some areas we may be in a position to make justified claims about actual existence (and truth). Logical inquiry-independent existence is sufficient for the social and human sciences, while mind-independence will be fine for many other domains. It should not be insisted that the theoretical posits of realist science be strict unobservables in all areas: most theoretical posits of the social sciences are idealized commonsensibles, such as elements in folk psychology. Unsurprisingly, this sort of local strategy will create space for realism that is able to accommodate larger areas of science without sacrificing traditional realist intuitions.
Article
This is an assessment of two recent philosophical accounts of the nature of economics, those given in Alexander Rosenberg's Economics - Mathematical Politics or the Science of Diminishing Returns? (1992) and in Daniel Hausman's The Inexact and Separate Science of Economics (1992). The focus is on how they portray the predictive capabilities of economics and the links between economic theory and empirical evidence. Some major suggestions of the two books are found wanting in interesting ways. Examples are Rosenberg's explanation of the predictive weakness of economics in terms of its folk psychological roots and his depiction of economics as a branch of political philosophy and applied mathematics; and Hausman's claim that the 'economists' deductive method' is appropriate while 'economics as a separate science' is not.
Article
The contrastive approach to explanation is employed to shed light on the issue of the unrealisticness of models and their assumptions in economics. If we take explanations to be answers to contrastive questions of the form 'why this fact rather than that foil?', then unrealistic elements such as omissions and idealizations are (at least partly) dependent on the selected contrast. These contrast-dependent assumptions are shown to serve the function of fixing the shared causal background between the fact and the foil. It is argued that looking at the explanations offered by economic models in contrastive terms helps us to be precise about their explanatory potential and, hence, to assess the adequacy of their unrealistic assumptions. I apply the insights of the contrastive approach to the 'new economic geography' models, and to a selection of criticisms directed at them. This case illustrates how a contrastive analysis can help resolve disputes concerning the unrealisticness of particular models.
Article
A model is a representation of something beyond itself in the sense of being used as a representative of that something, and in prompting questions of resemblance between the model and that something. Models are substitute systems that are directly examined in order to indirectly acquire information about their target systems. An experiment is an arrangement seeking to isolate a fragment of the world by controlling for causally relevant things outside that fragment. It is suggested that many theoretical models are ('thought') experiments, and that many ordinary experiments are ('material') models. The major difference between the two is that the controls effecting the required isolation are based on material manipulations in one case, and on assumptions in the other.
Article
It is argued that rather than a well defined F-Twist, Milton Friedman's 'Methodology of positive economics' offers an F-Mix: a pool of ambiguous and inconsistent ingredients that can be used for putting together a number of different methodological positions. This concerns issues such as the very concept of being unrealistic, the goal of predictive tests, the as-if formulation of theories, explanatory unification, social construction, and more. Both friends and foes of Friedman's essay have ignored its open-ended unclarities. Their removal may help create common ground for more focused debate in economics.
Article
It is argued that J. H. von Thünen was a realist who deliberately used unrealistic assumptions to pursue a true account of a major aspect of the determination of agricultural land-use patterns. The assumptions of the simplest model of concentric rings are examined to show that this highly unrealistic model deserves a realist interpretation: the idealising assumptions serve the purpose of neutralising the impact of a number of factors on the land-use pattern and thereby help focus on the causal contribution of one major factor, namely distance from the market. It is von Thünen's conviction that this factor and its causal contribution are real rather than fictional, and that his basic model truthfully captured them. The potential truth of von Thünen's simplest model does not require that its assumptions are true nor that the representation of the resulting land-use pattern is true. This reading of von Thünen's theory is contrasted with the traditional fictionalist 'as-if' interpretation of Der isolierte Staat espoused by Hans Vaihinger, Peter Hall, and others. It is pointed out that economists and geographers both hold notions of realism that fail to accommodate von Thünen's realism. Although economists tend to conflate realism and realisticness and therefore fail to understand that realism is compatible with the use of unrealistic assumptions, human geographers have adopted highly specific ideas of realism that fail to do justice to von Thünen's theory. Because much of theorising in both disciplines follows the Thünian strategy, there is an important lesson to be learnt both for supporters and for opponents of that strategy.
Article
Two related goals are pursued. First, the development of, and debates around, Oliver Williamson's version of transaction cost economics are organised in terms of an emerging metatheoretical framework. This framework proposes looking at economic theorising and its changes in terms of rival theoretical isolations, which are often responses to challenging explanatory questions. As a side product, Williamson's strategy of theorising is portrayed. Second, using transaction cost economics as an illustration and as a source of inspiration, the paper amends and refines the earlier framework of theoretical isolation by incorporating notions of explaining and explained items; notions of progress (in questions as well as increased causal penetration and increased degree of unification); and the notion of the dynamics of dispute.
Article
Economic theory is often criticized for the lack of ‘realism’ of its assumptions. Milton Friedman rebutted such criticism with the famous dictum ‘the more significant the theory, the more unrealistic the assumptions’. Friedman's position, often called the ‘F-twist’, stems from his failure to distinguish three different types of assumption. Negligibility assumptions state that some factor has a negligible effect upon the phenomenon under investigation. Domain assumptions specify the domain of applicability of the theory. Heuristic assumptions are a means of simplifying the logical development of the theory. It is argued that Friedman's dictum is false of all three types of assumption. Finally, it is conjectured that what began as a negligibility assumption may be changed under the impact of criticism first into a domain assumption, then into a mere heuristic assumption; and that these important changes will go unnoticed if the different types of assumption are not clearly distinguished from one another.