Book

# Theories of Scientific Method: An Introduction

Authors: Robert Nola, Howard Sankey

## Abstract

What is it to be scientific? Is there such a thing as scientific method? And if so, how might such methods be justified? Robert Nola and Howard Sankey seek to provide answers to these fundamental questions in their exploration of the major recent theories of scientific method. Although for many scientists their understanding of method is something they simply pick up in the course of being trained, Nola and Sankey argue that it is possible to be explicit about what this tacit understanding of method is, rather than leave it as some unfathomable mystery. They robustly defend the idea that there is such a thing as scientific method and show how this might be legitimated. The book begins with the question of what methodology might mean and explores the notions of values, rules and principles, before investigating how methodologists have sought to show that our scientific methods are rational. Part 2 of the book sets out some principles of inductive method and examines its alternatives, including abduction, IBE, and hypothetico-deductivism. Part 3 introduces probabilistic modes of reasoning, particularly Bayesianism in its various guises, and shows how it is able to give an account of many of the values and rules of method. Part 4 considers the ideas of philosophers who have proposed distinctive theories of method, such as Popper, Lakatos, Kuhn and Feyerabend, and Part 5 continues this theme by considering philosophers who have proposed naturalised theories of method, such as Quine, Laudan and Rescher. The book offers readers a comprehensive introduction to the idea of scientific method and a wide-ranging discussion of how historians of science, philosophers of science and scientists have grappled with the question over the last fifty years.
... [2][3][4]. The multi-criterial solution to the demarcation problem is of limited theoretical and practical utility. It merely shifts the question: from identifying a single property common to all sciences to identifying many properties common to some. ...
... Nonetheless, the equivalence between scientific knowledge and information compression has been dismissed as a principle of secondary importance by later philosophers (e.g. Karl Popper (1902-1994) [32]), and today clearly does not occupy the foundational role that it arguably deserves [33]. Philosophical resistance to equating science with information compression might partially be explained by two common misconceptions. ...
... The remarkable aspect of this result is that the K function was not constructed with the explicit purpose of accommodating Occam's razor, but was derived from the mutual information function following a postulated equivalence of knowledge with pattern-detection. The finding that Occam's razor is intrinsic to K is striking support for the notion that knowledge is information compression and that simplicity and elegance are not arbitrary aesthetic values that people (including scientists) choose to impose on knowledge, as scholars have argued [33]. To the extent that it underlies the encoding of patterns, simplicity is knowledge. ...
Article
Full-text available
This essay unifies key epistemological concepts in a consistent mathematical framework built on two postulates: 1-information is finite; 2-knowledge is information compression. Knowledge is expressed by a function $$K(Y;X)$$ and two fundamental operations, $$\oplus, \otimes$$. This $$K$$ function possesses fundamental properties that are intuitively ascribed to knowledge: it embodies Occam's razor, has one optimal level of accuracy, and declines with distance in time. Empirical knowledge differs from logico-deductive knowledge solely in having measurement error and therefore a "chaos horizon". The $$K$$ function characterizes knowledge as a cumulation and manipulation of patterns. It allows one to quantify the amount of knowledge gained by experience and to derive conditions that favour the increase of knowledge complexity. Scientific knowledge operates exactly as ordinary knowledge, but its patterns are conditioned on a "methodology" component. Analysis of scientific progress suggests that classic Popperian falsificationism only occurs under special conditions that are rarely realised in practice, and that reproducibility failures are virtually inevitable. Scientific "softness" is simply an encoding of weaker patterns, which are simultaneously cause and consequence of higher complexity of subject matter and methodology. Bias consists in information that is concealed in ante-hoc or post-hoc methodological choices. Disciplines typically classified as pseudosciences are sciences expressing extreme bias and therefore yield $$K(Y;X) \leq 0$$. All knowledge-producing activities can be ranked in terms of a parameter $$\Xi \in (-\infty,\infty)$$, measured in bits, which subsumes all quantities defined in the essay.
... Significantly Nola and Sankey (2007) explain how this "conception of realism is extended to the unobservable items postulated in science, including items for which we do not (yet) have a word" (p. 339). ...
... As a consequence positivism suffers from the false premise that what is truthful is knowable. But, as we point out above with reference to the 'electron' example in Nola and Sankey (2007), the truthseeking character of scientific research means that we may not yet know what is true. Therefore, positivism is hamstrung by being limited to saying that what is known is that which is known empirically, and that that knowledge is true knowledge. ...
... A frequently used paradigm for scientific research is the Hypothetico-Deductive paradigm (Figure 1(a)), which has been practiced by researchers for years [1,2,3]. In this paradigm, researchers first make observations, which is usually a data-collection process, and then raise a question. ...
... Table 1, the dates are written in the old style used by Kepler. To obtain Gregorian-style dates, we just need to add 10 days to Kepler's dates [6]. The "Mars' Angular Position" from the Sun is Mars' longitude in heliocentric ecliptic coordinates as computed by Kepler. ...
Preprint
Full-text available
The research paradigm of the Observation--Hypothesis--Prediction--Experimentation loop has been practiced by researchers for years in pursuit of scientific discovery. However, with the data explosion at both mega-scale and milli-scale in scientific research, it has sometimes become very difficult to manually analyze the data and propose new hypotheses to drive the cycle of scientific discovery. In this paper, we introduce an Explainable-AI-assisted paradigm for science discovery. The key is to use Explainable AI (XAI) to help derive data or model interpretations and science discoveries. We show how computational and data-intensive methodology -- together with experimental and theoretical methodology -- can be seamlessly integrated for scientific research. To demonstrate the AI-assisted science discovery process, and to pay our respects to some of the greatest minds in human history, we show how Kepler's laws of planetary motion and Newton's law of universal gravitation can be rediscovered by (explainable) AI based on Tycho Brahe's astronomical observation data; their work led the scientific revolution in the 16th-17th centuries. This work also highlights the importance of Explainable AI (as compared to black-box AI) in science discovery, to help humans prevent or better prepare for the possible technological singularity which may happen in the future.
... Another controversial topic is the place of theory in qualitative research. Indeed, questions such as what makes an activity "scientific" and the place of hypothesis and theory in scientificity; the relationship between hypothesis and observation; whether a "law" should be derived from data obtained through "observation or testing" arrangements only when the results never fail; whether small failures in regularities should be interpreted as refutation of the law or as the law holding with a certain probability; or whether frequently occurring or exceptional cases should be the subject of scientific research, are among the thorny issues of the philosophy of science and methodology (Keat and Urry, 2001; Lakatos, 1989; Nola and Sankey, 2007; Rosenberg, 2014). There is no room to go into the details here. ...
... There is neither a single positivism nor a single antipositivism/interpretivism. It suffices to say that there is a strong literature on how much these positions diverge over fundamental concepts such as "explanation", "empiricism", "observation", "reality" and "scientificity" (Keat and Urry, 2001; Coşkun, 2017; Freund, 1991; Craib, 1984; Nola and Sankey, 2007). In connection with these debates, the study seeks answers to the following questions: a) Is there an established "qualitative" terminology in Turkish qualitative studies? ...
Conference Paper
Full-text available
... 10 In cases like the oracle, in which the system generates reliably good predictions, most of us would wonder why the oracle works so well. Indeed, it is often the case, in the history of science that natural regularities and correlations become the target of explanation (Nola and Sankey 2014). Thus, for the scientifically-minded, the mysterious predictive power of the oracle would become the explanandum. ...
... Scientists are held to a higher epistemic standard than ordinary epistemic agents (Nola and Sankey 2014). This is one of the reasons why they can serve the role of providing expert testimony. ...
Article
Full-text available
What does it mean to trust the results of a computer simulation? This paper argues that trust in simulations should be grounded in empirical evidence, good engineering practice, and established theoretical principles. Without these constraints, computer simulation risks becoming little more than speculation. We argue against two prominent positions in the epistemology of computer simulation and defend a conservative view that emphasizes the difference between the norms governing scientific investigation and those governing ordinary epistemic practices.
... Lack of space prevents a discussion of the existence of a "single, fixed and established" scientific method. (On this, see: Nola and Irzik (2003); Nola and Sankey (2007); Suchting (1995)). Bricmont (2017), p. 228 ...
Preprint
Full-text available
... A research approach is a plan and procedure that includes the detailed methods of data collection, analysis, and interpretation applied to achieve the research objectives (Nola & Sankey, 2010). This study uses the deductive approach, proceeding from the generalization of known facts to a rational conclusion regarding the impact of 3PL services on user satisfaction in an apparel manufacturing company in Sri Lanka. ...
Conference Paper
Full-text available
Abstract: Organizations have to compete with many other organizations in the same industry. As a result, they must identify the requirements of the internal and external environments, as well as their own, in order to produce the best results. Strategic planning is an essential process for all organizations. Most people in Sri Lanka believe that Sri Lankan public universities have failed to achieve their targets. The strategic planning aspect of Sri Lankan public universities is poor, and they have failed to achieve their strategic objectives. This is highlighted as an important requirement of the strategic plan for public sector universities that can make radical changes. But most public universities have failed to come up with the strategic plan most suitable for them, which was investigated in this study. The researcher collected information from multiple sources, in-depth desk research, and in-depth interview research to identify the barriers to strategic planning in public sector universities. This study used a purposive sampling method to collect data and conducted seven key in-depth interviews with university system strategic directors and registrars. The researchers identified six new themes as barriers in the strategic planning process: lack of awareness of the strategic planning process, digital adoption, solidarity, transformation in management skills, impact of the external environment, and public perception via social media. Keywords: Barriers, Strategic Planning, Universities, Strategy, Multi-sources
... On the basis of what can logical principles be adjudicated, if not by assuming a base logic on which they may be compared? Similarly, with respect to method, it would appear that there must be some stable principles guiding the choice of theory on the basis of observation-which the above quote seems to concede (Nola and Sankey 2014). ...
Article
Full-text available
This essay offers a conception of logic by which logic may be considered to be exceptional among the sciences on the backdrop of a naturalistic outlook. The conception of logic focused on emphasises the traditional role of logic as a methodology for the sciences, which distinguishes it from other sciences that are not methodological. On the proposed conception, the methodological aims of logic drive its definitions and principles, rather than the description of scientific phenomena. The notion of a methodological discipline is explained as a relation between disciplines or practices. Logic serves as a methodological discipline with respect to any theoretical practice, and this generality, as well as logic’s reflexive nature, distinguish it from other methodological disciplines. Finally, the evolution of model theory is taken as a case study, with a focus on its methodological role. Following recent work by John Baldwin and Juliette Kennedy, we look at model theory from its inception in the mid-twentieth century as a foundational endeavour until developments at the end of the century, where the classification of theories has taken centre-stage.
... Inductive research involves moving from specific observations towards broad generalizations that influence the formation of concepts and theories (Locke, 2007), in contrast to deductive reasoning, which moves from generalizations to specific insights when forming a theory (Locke, 2007; Nola and Sankey, 2007). The inductive research approach involves three stages: making an observation, identifying a pattern in those observations, and developing a theory. ...
Thesis
Full-text available
The systems that we are designing are becoming increasingly complex and interconnected, and without proper awareness and a holistic understanding of how our designs integrate into the context of the broader designed ecosystem, our attempts to solve these problems may run the risk of creating additional unintended consequences. There is a need to develop new approaches to aid in tackling complexity while addressing the risk of creating unintended consequences. This study aims to investigate how product and service designers, in agency settings, can integrate aspects of systems thinking into their design process, and to understand its effect on addressing unintended consequences. In this context, unintended consequences are defined as "outcomes that are not the ones foreseen and intended by a purposeful action" (Merton, 1936). To test the hypothesis that "Systems Thinking", when paired with the human-centered "Design Thinking" approach, will enable a human-centered approach to understanding interconnected systems and assist in addressing unintended consequences, the approach of Research through Design is used. Design Thinkers, Systems Thinkers, and Design Leaders spanning six different countries were interviewed, and their responses were thematically grouped and synthesized. The results identified gaps in the current Design Thinking process and areas where Systems Thinking could add value. Furthermore, a process model and an interactive game were suggested as proposed interventions, and the paper illustrates how the prototypes were developed, tested, and iterated upon. Next steps in the development of these proposed interventions are expressed. Lastly, recommendations were made for further academic exploration in this field.
... The remarkable aspect of this result is that the K function was not constructed with the explicit purpose of accommodating Ockham's razor, but was derived from the mutual information function following a postulated equivalence of knowledge with pattern-detection (see Supplementary information section 5.1). The finding that Ockham's razor is intrinsic to K is compelling support for the notion that knowledge is information compression and that simplicity and elegance are not arbitrary aesthetic values that people (including scientists) choose to impose on knowledge, contrary to what some scholars have argued [37]. To the extent that it underlies the encoding of patterns, simplicity is knowledge. ...
Preprint
Full-text available
This essay proposes mathematical answers to meta-scientific questions including "how much knowledge is produced by research?", "how rapidly is a field making progress?", "what is the expected reproducibility of a result?", "what do we mean by soft science?", "what demarcates a pseudoscience?", and many others. From two simple postulates - 1) information is finite; 2) knowledge is information compression - we derive a function $$K(y;x\tau)=\frac{T(y)-T(y|x\tau)}{T(y)+T(x)+T(\tau)}$$, in which the total information $$T()$$ contained in an explanandum $$y$$ is lossless or lossy compressed via an explanans composed of an information input $$x$$ and a "theory" component $$\tau$$. The latter is a factor that conditions the relationship between $$y$$ and $$x$$, with an information "cost" equivalent to the description length of the relationship itself. This function is proposed as a simple and universal tool to understand and analyse knowledge dynamics, scientific or otherwise. Soft sciences are shown to be simply fields that yield relatively low K values. Bias turns out to be information that is concealed in methodological choices, thereby reducing K. Disciplines typically classified as pseudosciences are suggested to be sciences that suffer from extreme bias: their informational input is greater than their output, yielding $$K(y;x\tau) < 0$$. The essay derives numerous general results, some of which may be counter-intuitive. For example, it suggests that reproducibility failures are inevitable, and that the value of publishing negative results may vary across fields and within a field over time. Therefore, there may be conditions in which the costs of reproducible research practices such as publishing negative results and sharing data may outweigh the benefits.
The theory makes several testable predictions concerning science and cognition in general, and it may have numerous applications that future research could develop, test and implement to foster progress on all frontiers of knowledge.
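The K function in the abstract above can be sketched numerically by using a general-purpose compressor as a crude stand-in for the description-length function T(). The following is a minimal illustration, not the paper's own estimator: the choice of zlib-compressed length as a proxy for T, and the function names and example strings, are all assumptions made here for demonstration.

```python
import zlib

def T(s: str) -> int:
    """Crude proxy for total information content T(): zlib-compressed length, in bits."""
    return 8 * len(zlib.compress(s.encode()))

def T_cond(y: str, xt: str) -> int:
    """Proxy for the conditional term T(y|x tau): extra bits needed to
    encode y once the explanans (x concatenated with tau) is known."""
    return max(T(xt + y) - T(xt), 0)

def K(y: str, x: str, tau: str) -> float:
    """K(y; x tau) = (T(y) - T(y|x tau)) / (T(y) + T(x) + T(tau))."""
    return (T(y) - T_cond(y, x + tau)) / (T(y) + T(x) + T(tau))

# A highly regular explanandum, largely compressed away by a short
# explanans, scores above zero under this proxy; K is bounded above by 1
# because the numerator can never exceed T(y).
print(K("ABAB" * 50, "AB", "repeat the motif"))
```

Under this construction a repetitive explanandum paired with a short explanans yields a positive K, while an explanans longer (in compressed terms) than the compression it buys drives K toward or below zero, mirroring the essay's characterization of bias and pseudoscience.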
... After that, the step in which students discuss with different groups also adds to their experience in fostering self-confidence in learning mathematics. This is in line with the view of [22] on conveying the knowledge that exists within a person: students' ability to express their knowledge is one expression of self-confidence in learning. ...
... Using philosophy to examine this psychic aggregate makes it possible to see the phenomenon whole and entire; that is, not stopping at certain measurable dimensions, but touching aspects that are hard to measure yet real, focusing on the process of thinking to find wisdom through the discovery of truth, examining the object holistically, seeking the deepest nature or essence of an object, and exercising caution in establishing truth. Nola calls this a method for "discovering" science (Nola & Sankey, 2007), and places such conceptual study or "theorizing activity" as fulfilling the internal or intrinsic value of science. This paper examines the human experience and lived appreciation of spirituality, conceptualized in psychology as the self, with the perspective of Islamic religiosity as the characterization of that self, which makes it apt to call it the religious self. ...
Article
Full-text available
... [5,105]). Nonetheless, the equivalence between scientific knowledge and information compression has been presented as a principle of secondary importance by later philosophers (including for example Popper [41]), and today does not appear to occupy the foundational role that it arguably deserves [106]. ...
Article
Full-text available
This article proposes quantitative answers to meta-scientific questions including 'how much knowledge is attained by a research field?', 'how rapidly is a field making progress?', 'what is the expected reproducibility of a result?', 'how much knowledge is lost from scientific bias and misconduct?', 'what do we mean by soft science?', and 'what demarcates a pseudoscience?'. Knowledge is suggested to be a system-specific property measured by K, a quantity determined by how much of the information contained in an explanandum is compressed by an explanans, which is composed of an information 'input' and a 'theory/methodology' conditioning factor. This approach is justified on three grounds: (i) K is derived from postulating that information is finite and knowledge is information compression; (ii) K is compatible and convertible to ordinary measures of effect size and algorithmic complexity; (iii) K is physically interpretable as a measure of entropic efficiency. Moreover, the K function has useful properties that support its potential as a measure of knowledge. Examples given to illustrate the possible uses of K include: the knowledge value of proving Fermat's last theorem; the accuracy of measurements of the mass of the electron; the half life of predictions of solar eclipses; the usefulness of evolutionary models of reproductive skew; the significance of gender differences in personality; the sources of irreproducibility in psychology; the impact of scientific misconduct and questionable research practices; the knowledge value of astrology. Furthermore, measures derived from K may complement ordinary meta-analysis and may give rise to a universal classification of sciences and pseudosciences. Simple and memorable mathematical formulae that summarize the theory's key results may find practical uses in meta-research, philosophy and research policy.
... After that, the students' step of discussing with different groups also adds to their experience in fostering self-confidence in learning mathematics. This accords with the view of Nola & Sankey (2010) on conveying the knowledge that exists within a person: students' ability to express the knowledge they possess is one application of self-confidence in learning. ...
... Many factors have contributed to this, including the adoption of expert testimony in courts (Lawson, 1900;Imwinkelried, 1993;Freckelton, 1995;Lord Woolf, 1996), and the introduction of Government Chief and Departmental Chief Scientific Advisers (Parker, 2016). Scientists assume the role of impartial observer, assumed to be relatively free from bias and guided by the scientific method's impartiality (Walton, 1997;Nola & Sankey, 2014). Disciplinary experts are often called on to provide advice across a range of economic challenges, environmental problems and biological risks (Carrier, 2004). ...
Article
Full-text available
Sustainable development is widely recognized as an existential challenge. To address it, humanity needs to change its ways. However, people seem slow to act, not always understanding and often denying environmental imperatives, creating substantial social and psychological barriers. Social inertia and denial have been allegedly amplified by a public discourse increasingly distrustful of science. But is this discourse a rejection of science or an erosion of trust in how science is applied? The paper examines the main differences between environmental science and technology, reviews how the wider science-technology convergence has affected them and evaluates potential implications for sustainability challenges. We question whether the “convergence” between environmental science and technology, could be behind the growing public dissatisfaction and distrust of environmental science and policies. Although environmental science plays a role in enabling understanding and communicating complexity, technology requires political, social and economic skills, beyond conventional disciplinary expertise. To avoid putting academic freedom at risk, environmental technologists, a new breed of professionals, should have a clear understanding of scientific capacity and uncertainty and be able to engage with stakeholders, policy makers and the public to design integrated, interdisciplinary and holistic solutions, and also better define the many environmental problems we face.
... The paradigm underlying this research is constructivism (Nola & Sankey, 2007), namely that the reality of DR rests on constructions built by individual adherents of a religion. The frame of reference is thus the individual self, with the assumption that the individual internalizes and/or personally experiences connectedness with God, so that the image of God within the individual (perceived Godnessity) can only be seen through the individual's subjective perspective, not through objective measurement. ...
Article
Full-text available
The study aims to explore the concept of the Religious Self (Diri Religius/DR). A sequential exploratory mixed-methods design was applied, in which the qualitative study preceded the quantitative. Fifteen participants took part in phenomenologically based semi-structured interviews that were subjected to thematic analysis. A total of 739 subjects filled out questionnaires in the quantitative study, located in Bandung, Denpasar, Yogyakarta, Cimahi, Surabaya, Garut, and Depok. The first study resulted in four dimensions of DR. The second study shows that the three-dimensional model of DR (the individual's view of God's involvement in existence, a positive perspective on God's involvement in life events, and positive behaviors accompanied by positive emotions) fits significantly with the empirical data. DR proved to be a concept that can distinguish God-oriented (theistic) individuals from those who are not (non-theistic). Key words: religious self, religiosity, spirituality
... There are never sufficient grounds to claim that a hypothesis has been proven conclusively. But passing a certain number of tests successfully induces researcher communities to regard a given hypothesis as sufficiently proven to consider it an element of scientific knowledge (Nola and Sankey, 2007). ...
Article
Purpose: The article discusses selected methodological issues of natural and social sciences with particular consideration of behavioural economics to highlight the significance of experimental research. Design approach: The order of the issues covered is as follows: (a) science as a product of a research community, (b) basic cognitive activities in science, (c) a short description of social sciences, (d) a discussion on the methods applied in behavioural economics. Findings: The article offers a description of research procedure, its objectives and the methods applied therein; it has been stressed that testing theories and hypotheses involves exposing them to falsification; it has been emphasised that research conducted within the framework of social sciences is more difficult than in the case of natural sciences because of the large number of independent variables and the possible interaction between the researcher and research participants. Practical implications: The content presented in the article highlights the value of scientific findings as opposed to common-sense knowledge adopted with the disregard of the principles of proper methodology. Value: The authors believe that the emergence of behavioural economics was an attempt to overcome certain deficiencies in the methodology of classical economics by means of experimental research.
... Breaking this dialog into its fundamental elements, it can be boiled down to a few steps: consume, consider and extend [8]. Each of the primitive elements of the scientific method [9] -observe, hypothesize, gather data, test, refine and conclude -can also be mapped to one of these fundamental steps. If we expand the process beyond one individual, towards communities establishing scientific theories, trust in parallel work cannot be established without proper validation and testing; therefore, in the scientific scholarly workflow we need to insert a fourth step -"reproduce" before "extend." ...
Article
EDITOR'S SUMMARY Scholarly research is at the forefront of innovation, especially with a breadth of new technologies that can enhance the research process. However, in a race for scholars to produce more and more new findings, documentation practices and reproduction of results may be neglected. Lack of validation through reproduction can lead to a general distrust of scholarly research and experiments, but a more generous approach to information sharing could be the answer to this issue. Scholars have connected socially for centuries to share their ideas, and this practice has led to some truly innovative ideas that have shaped our world today. With willing participants sharing their ideas and their research methods, new findings can be reproduced and validated, creating a stronger and more trustworthy community of scholars.
... Determining that protocols such as cladogram comparisons and character mapping are problematic requires that we first acknowledge the intent of reasoning in biological systematics. The overarching goal of scientific inquiry is to acquire causal understanding of the phenomena we observe/describe, which also affords opportunities for predictions into the future (Hempel, 1965;Hanson, 1958;Rescher, 1970;Popper, 1983Popper, , 1992Salmon, 1984a;Van Fraassen, 1990;Strahler, 1992;Mahner and Bunge, 1997;Hausman, 1998;Thagard, 2004;Nola and Sankey, 2007;de Regt et al., 2009;Hoyningen-Huene, 2013). As a field of science, we should expect the objective of systematics to be consistent with that of other fields. ...
... The remarkable aspect of this result is that the K function was not constructed with the explicit purpose of accommodating Ockham's razor, but was derived from the mutual information function following a postulated equivalence of knowledge with pattern-detection (see Supplementary information section 5.1). The finding that Ockham's razor is intrinsic to K is compelling support for the notion that knowledge is information compression and that simplicity and elegance are not arbitrary aesthetic values that people (including scientists) choose to impose on knowledge, contrary to what some scholars have argued [37]. To the extent that it underlies the encoding of patterns, simplicity is knowledge. ...
Article
Full-text available
This article proposes quantitative answers to meta-scientific questions including "how much knowledge is attained by a research field?", "how rapidly is a field making progress?", "what is the expected reproducibility of a result?", "how much knowledge is lost from scientific bias and misconduct?", "what do we mean by soft science?", and "what demarcates a pseudoscience?". Knowledge is suggested to be a system-specific property measured by K, a quantity determined by how much of the information contained in an explanandum is compressed by an explanans, which is composed of an information "input" and a "theory/methodology" conditioning factor. This approach is justified on three grounds: 1) K is derived from postulating that information is finite and knowledge is information compression; 2) K is compatible and convertible to ordinary measures of effect size and algorithmic complexity; 3) K is physically interpretable as a measure of entropic efficiency. Moreover, the K function has useful properties that support its potential as a measure of knowledge. Examples given to illustrate the possible uses of K include: the knowledge value of proving Fermat's last theorem; the accuracy of measurements of the mass of the electron; the half life of predictions of solar eclipses; the usefulness of evolutionary models of reproductive skew; the significance of gender differences in personality; the sources of irreproducibility in psychology; the impact of scientific misconduct and questionable research practices; the knowledge value of astrology. Furthermore, measures derived from K may complement ordinary meta-analysis and may give rise to a universal classification of sciences and pseudosciences. Simple and memorable mathematical formulae that summarize the theory's key results may find practical uses in meta-research, philosophy and research policy.
Chapter
Full-text available
Realism is an ontology, or theory of the nature of the world. As Nochlin (Realism, Penguin, 1971) illustrated, when Plato set down the philosophy of idealism, his argument contained "a realism of Ideas—it puts Ideas outside the subject as particular self-existent entities rather than as objects within the subject" (p. 41). Realist analysis applies this ontological view of the world in order to make sense of it. Realist research is particularly concerned with identifying causality. Within the social sciences, adaptations to realism have been introduced to account for the complexity of lived experience. Two main branches of realism are Critical Realism and Social Realism. Each of these examines phenomena by using a realist ontology that also pays close attention to the socially, culturally, and politically constructed nature of knowledge in order to infer the most likely causal elements of the subject under investigation.
Article
In this paper we show why the postulation scheme based on forces and couples is detrimental when designing novel metamaterials. Instead, the most effective postulation scheme for mechanics is that based on the Principle of Virtual Work formulated by, for example, d'Alembert, Lagrange, Piola, and Paul Germain. In fact, generalized continuum mechanics can be formulated in a coherent way only by basing its foundations on the Principle of Virtual Work, while the problem of synthesis of novel metamaterials can be efficiently confronted only when introducing generalized continua as their model at the macroscopic level. Once work functionals are postulated, the set of involved generalized forces is identified using the representation theorem for distributions by Laurent Schwartz, forces in particular being zeroth-order distributions. Actually, only when limiting the class of considered internal work functionals to first-order distributions is the Principle of Virtual Work equivalent to the balances of forces and couples. The predominant role of the Principle of Virtual Work has been denied by those inductivist scholars who claim that balance laws can be induced by experimentation. The Principle of Virtual Work postulation more truly fits a falsificationist approach to epistemology.
Article
This paper provides a theoretical clarification of an important question raised by Olivér Kovács in Acta Oeconomica 69 (4) and points out further problems and possibilities. It clarifies what role considerations of complexity theory have played in the economic sciences so far and why. Focussing on the complementary phenomenon of emergence, the contribution shows where the limits of this approach lie within the discipline and to what extent serious problems of demarcation arise with regard to other disciplines of the social sciences. Accordingly, this paper aims to demonstrate the conditions under which economics can use concepts of emergence in a fruitful way.
Article
This paper focuses on how contributions are argued in research proposals in the humanities. Due to standardizing tendencies in research funding towards formats characteristic of science, technology, engineering and mathematics (STEM) subjects, there has been concern that the humanities are marginalized. In this study, ‘contribution statements’ were identified in proposals funded by the Bank of Sweden Tercentenary Foundation across the humanistic disciplines. These statements were systematically analyzed in terms of type and structure of contributions advanced. The results suggest that the humanities differ from the sciences in terms of specificity of focus, a high level of ‘acceptable serendipity’ in proposed outcomes, but that these disciplines structurally tend to adhere to the same types of research contribution arguments as STEM. A better understanding of the way in which humanities scholars frame contributions offers insight into how these fields change and how they relate to developments in the science policy and funding landscape.
Article
In this article, I critically examine a number of widely held beliefs about the nature of replication and its place in science, with particular reference to psychology. In doing so, I present a number of underappreciated understandings of the nature of science more generally. I contend that some contributors to the replication debates overstate the importance of replication in science and mischaracterize the relationship between direct and conceptual replication. I also claim that there has been a failure to appreciate sufficiently the variety of legitimate replication practices that scientists engage in. In this regard, I highlight the tendency to pay insufficient attention to methodological triangulation as an important strategy for justifying empirical claims. I argue, further, that the replication debates tend to overstate the closeness of the relationship between replication and theory construction. Some features of this relationship are spelt out with reference to the hypothetico-deductive and the abductive accounts of scientific method. Additionally, an evaluation of the status of replication in different characterizations of scientific progress is undertaken. I maintain that viewing replication as just one element of the wide array of scientific endeavors leads to the conclusion that it is not as prominent in science as is often claimed.
Article
Full-text available
Known as shovel head worms, members of Magelonidae comprise a group of polychaetes readily recognised by the uniquely shaped, dorso-ventrally flattened prostomium and paired ventro-laterally inserted papillated palps. The present study is the first published account of inferences of phylogenetic hypotheses within Magelonidae. Members of 72 species of Magelona and two species of Octomagelona were included, with outgroups including members of one species of Chaetopteridae and four of Spionidae. The phylogenetic inferences were performed to causally account for 176 characters distributed among 79 subjects, and produced 2,417,600 cladograms, each with 404 steps. A formal definition of Magelonidae is provided, represented by a composite phylogenetic hypothesis explaining seven synapomorphies: shovel-shaped prostomium, prostomial ridges, absence of nuchal organs, ventral insertion of palps and their papillation, presence of a burrowing organ, and unique body regionation. Octomagelona is synonymised with Magelona due to the latter being paraphyletic relative to the former. The consequence is that Magelonidae is monotypic, such that Magelona cannot be formally defined as associated with any phylogenetic hypotheses. As such, the latter name is an empirically empty placeholder, but because of the binomial name requirement mandated by the International Code of Zoological Nomenclature, the definition is identical to that of Magelonidae. Several key features for future descriptions are suggested: prostomial dimensions, presence/absence of prostomial horns, morphology of anterior lamellae, presence/absence of specialised chaetae, and lateral abdominal pouches. Additionally, great care must be taken to fully describe and illustrate all thoracic chaetigers in descriptions.
Article
Full-text available
Forensic psychiatrists are often sought by the court of law to provide professional opinion on specific legal matters that have a major impact on the evaluee and possibly society at large. The quality of that opinion and its recommendations relies on the quality of the analysis of the assessment results conducted by the psychiatrist. However, the definition and scope of a forensic psychiatric analysis are not clear. While existing literature on forensic psychiatric analysis generally includes organizing information, identifying relevant details, and formulating a set of forensic psychiatric opinions as components, there is no explicit and unified definition of these terms and processes. This lack of clarity and guidelines may hinder forensic psychiatry from achieving its goal of providing objective information to the court or other relevant parties. Forensic psychiatric analysis exhibits numerous parallels to clinical reasoning in other fields of medicine. Therefore, this review aims to elaborate forensic psychiatric analysis through the lens of clinical reasoning, which has been developed by incorporating advances in the cognitive sciences. We describe forensic psychiatric analysis through three prominent clinical reasoning theories: the hypothetico-deductive model, illness script theory, and dual process theory. We expand those theories to elucidate how forensic psychiatrists use clinical reasoning not only to diagnose mental disorders, but also to determine mental capacities as requested by law. Cognitive biases are also described as a potential threat to the accuracy of the assessment and analysis. Additionally, situated cognition theory helps elucidate how contextual factors influence the risk of errors. Understanding the processes involved in forensic psychiatric analysis and their pitfalls can assist forensic psychiatrists in becoming aware of and mitigating their biases.
Debiasing strategies that have been implemented in other fields of medicine to mitigate errors in clinical reasoning can be adapted for forensic psychiatry. This may also shape the training program of general psychiatrists and forensic psychiatrists alike.
Article
Full-text available
Conventional accounts of epistemic opacity, particularly those that stem from the definitive work of Paul Humphreys, typically point to limitations on the part of epistemic agents to account for the distinct ways in which systems, such as computational methods and devices, are opaque. They point, for example, to the lack of technical skill on the part of an agent (Burrell, 2016; Kaminski, 2017), the failure to meet standards of best practice (Saam, 2017; Hubig, 2017), or even the nature of an agent (Humphreys, 2009) as reasons why epistemically relevant elements of a process may be inaccessible. In this paper I argue that there are certain instances of epistemic opacity—particularly in computational methods such as computer simulations and machine learning processes—that (1) do not arise from, (2) are not responsive to, and (3) are therefore not explained by the epistemic limitations of an agent. I call these agent-neutral and agent-independent instances of epistemic opacity respectively. As a result, I also argue that conventional accounts of epistemic opacity offer a limited understanding of the full spectrum of kinds and sources of epistemic opacity, particularly of the kind found in computational methods. In particular, as I will show below, the limitations of these accounts are reflected in the way they fail to provide satisfactory explanations when faced with certain instances of opacity.
Chapter
This chapter describes my first encounters with constructivism in the science education community, the topic being elaborated by Ernst von Glasersfeld in a plenary lecture at a NARST meeting. I identified it as updated Bishop Berkeley’s radical empiricism. Apart from philosophical criticism, the deficiency of constructivism as a teaching method is documented. My two-year period (1992–1993) as foundation professor of science education at University of Auckland is elaborated, including a significant national debate with New Zealand’s powerful constructivist lobby. The origins and contents of the 1994 Routledge Science Teaching: The Contribution of History and Philosophy of Science are given; and its functioning as somewhat of a ‘roadmap’ for the discipline, being translated into five languages and reissued 20 years later in a revised and updated edition. My pendulum studies (a book and papers), and the large International Pendulum Project and subsequent anthology, are outlined. These demonstrate that HPS can enliven and transform even the most routine topics in science programmes.
Article
Three competing 'methods' have been endorsed for inferring phylogenetic hypotheses: parsimony, likelihood, and Bayesianism. The latter two have been claimed superior because they take into account rates of sequence substitution. Can rates of substitution be justified on their own accord in inferences of explanatory hypotheses? Answering this question requires addressing four issues: (1) the aim of scientific inquiry, (2) the nature of why-questions, (3) explanatory hypotheses as answers to why-questions, and (4) acknowledging that neither parsimony, likelihood, nor Bayesianism is an inferential action leading to explanatory hypotheses. The aim of scientific inquiry is to acquire causal understanding of effects. Observation statements of organismal characters lead to implicit or explicit why-questions. Those questions, conveyed in data matrices, assume the truth of observation statements, which is contrary to subsequently invoking substitution rates within inferences to phylogenetic hypotheses. Inferences of explanatory hypotheses are abductive in form, such that some version of an evolutionary theory(ies) is/are included or implied. If rates of sequence evolution are to be considered, it must be done prior to, rather than within, abduction, which requires renaming those putatively-shared nucleotides subject to substitution rates. There are, however, no epistemic grounds for renaming characters to accommodate rates, calling into question the legitimacy of causally accounting for sequence data.
Technical Report
Full-text available
There is an increasing requirement to develop the knowledge base for operations analysis, in order to increase the output of science-based support for military decision-making and thus contribute to a more effective use of the resources of the Swedish Armed Forces. Military operations analysis is defined as the application of scientific methods to assist executive decision-makers. The focus on decision-making means that the choice of methods is always subordinate to the outcome of the decision. This project recommends that the new research program for military operations analysis be directed towards developing methods and gathering lessons that are used by operations analysts to create better conditions for science-based support to decision-makers. It further recommends that the program be divided into the development of methods (including the evaluation of methods), monitoring (including monitoring of operations analysis research and related fields) and practical lessons from operations analysis (including the gathering of lessons from "front-line" operations analysts). The aim of this guidance is to create a long-term knowledge base for the operations analysis field as a complement to current practical operations analysis. That enables the creation of operations analysis of high quality in both the short and the long term.
Article
Full-text available
Introduction For more than half a century, the scientific method has been the dominant stream of quantitative inquiry in different parts of academic and business life. This kind of inquiry is used to gather data, perform professional activities, and support decision-making by educators, business managers, and policymakers. Qualitative methods, by contrast, are considered an alternative mode of inquiry for performing professional work, one that can provide a more detailed understanding of complex issues. Although each of these inquiry paradigms provides different kinds of opportunities for researchers, each also has limitations when applied as a professional inquiry. This paper aims to compare the scientific method and dialectic inquiry as professional inquiries in marketing research.
Conference Paper
Full-text available
The tourism industry in Turkey has shown continuous growth, apart from periods of recession caused by political factors. The industry's development over the last quarter-century also foreshadows its development in the coming years. In the literature, efficiency analysis of the tourism industry has generally taken the form of efficiency studies of hotels. In this study, however, other elements of tourism were included in the analysis alongside accommodation businesses, and Turkey's tourism history was examined. From this starting point, the purpose of this study is to assess Turkey's tourism efficiency between 2002 and 2018 and to examine the effects of the inputs/outputs used in the model. To improve the discriminatory power of the Multi-Criteria Data Envelopment Analysis (MCDEA) model, the goal-programming-based Bi-Objective Multiple Criteria Data Envelopment Analysis (BiO-MCDEA) model developed by Ghasemi, Ignatius and Emrouznejad (2014) was used in the efficiency analysis. According to the efficiency values obtained, 2008, 2013 and 2018 emerged as efficient years for the tourism industry.
Article
The role of the scientific method of cognition and of scientific knowledge in the educational sphere during the period of Modern philosophy is described. The roles of the empirical and theoretical methods in experiments aimed at gaining new knowledge and testing hypotheses are determined, and the role of scientific societies and academies as centres for the development of scientific knowledge is considered. It is shown that the expansion of knowledge through the improvement of the scientific method led to the need for changes in the principles of education. Demonstrations of physical experiments for educational purposes in universities and academies are described, using the example of the lectures of Jacques Rohault and Pierre Polinière. The contributions of Francis Bacon, René Descartes and Isaac Newton to the study of the effectiveness and objectivity of the scientific method in research are described. Scientific research of the period was undertaken with the aim of practical benefit for human life, not as an end in itself, and knowledge and education were to be formed on the basis of the inductive method. The main ideas of the work "Reflections on the Method" are considered as a turning-point publication in the formation of Modern philosophy, and its 21 rules for applying the scientific method in research are described. It is determined that in the period of Modern philosophy science had a direct impact on education, as well-known representatives of science popularized their research and involved students and young scientists through educational institutions. At this time education became an instrument of human socialization, so well-known educators of the period emphasized the need for early child development (John Locke and Jean-Jacques Rousseau) and the universality of education (Jan Amos Comenius).
It is determined that the application of the scientific method in the research activities of the period of Modern philosophy was inextricably linked with pedagogical practice. This led to the active improvement and structuring of scientific methods and drew more young minds into scientific research and investigation, laying the ground for the subsequent period of the Classical Philosophy of Science.
Chapter
A science base for enterprise interoperability was first proposed in 2006 as a mechanism to formalize knowledge being generated by researchers and applied by industry to facilitate collaboration between enterprises through mutual interoperability of their enterprise systems. Subsequently, the community of researchers and exploiters of Enterprise Interoperability research addressed this issue as a group, culminating in a project funded by the European Commission FP7 programme. In this chapter, the authors explore the structure for an Enterprise Interoperability Science Base defined in this project, based on analysis of its purposes, the knowledge already available from pragmatic research, and the lessons learned, both on interoperability and the theoretical structure of a science base. The resulting science base is now evolving from the body of knowledge used for its initial population to embrace new research results and issues. This chapter focuses on the structure devised for an Enterprise Interoperability Science Base capable of delivering benefit to a comprehensive range of stakeholders with research and industry interests.
Article
Full-text available
Contemporary forensic psychology is characterized by a relative lack of attention to theory building and conceptual analysis. In my view, this neglect of theory amounts to theoretical illiteracy and represents a significant obstacle to the explanation of crime and its management. In this paper I explore the problem of theoretical illiteracy for forensic psychological research and practice. First, I discuss why theory is important in science and the dangers of ignoring it. Second, I review the role of theory in addressing the myriad practical problems facing human beings. Third, I outline three strategies to increase researchers' and practitioners' appreciation of theory construction and development: adopting a more comprehensive model of scientific method, epistemic iteration, and promoting model pluralism. Fourth, I examine two examples of core concepts from correctional psychology, dynamic risk factors and classification, and demonstrate how the above strategies can be used to address problems with these constructs.
Book
Full-text available
Acknowledgements First of all, we would like to express our sincere thanks to DW Akademie for this initiative. The importance of this study lies in its main research questions: what factors help or hinder aspiring Bangladeshi journalists in preparing for and joining the profession, and what factors enable or restrict them in acquiring the necessary skill sets (particularly in educational institutions and media outlets) to succeed in journalism. The insights gathered here will aid both the industry and academia in helping the profession of journalism progress to a new level.
Article
Full-text available
This article is concerned with the nature of scientific method and its importance for psychology. It begins by considering the nature of scientific methodology as an essential source for a proper understanding of scientific method. This is followed by an outline, and short assessment, of three major theories of scientific method: hypothetico-deductive method, inductive method, and inference to the best explanation. Thereafter, a broad abductive theory of scientific method, which has particular relevance for psychology, is presented. Finally, the value of scientific method for a genuine education in science is considered.
Article
Full-text available
Philosophical analyses of scientific methodology have long understood intuition to be incompatible with the rule-based reasoning that is often considered necessary for a rational scientific method. This paper seeks to challenge this contention by highlighting the indispensable role that intuition plays in the application of methodologies for scientific discovery. In particular, it seeks to outline a positive role for intuition and personal judgment in scientific discovery by exploring a comparison between the use of heuristic reasoning in scientific practice and in engineering design. While these discussions share many features, it will also be shown that the successful use of heuristics in engineering design is often considered to depend on a crucial factor that is markedly absent from accounts of the use of heuristics in scientific discovery: experienced judgment. In the final sections of this paper, I will compare attitudes to the role of computer analysis in scientific and engineering practices, with the aim of showing how the limitations of scientific discovery machines reveal the need for including intuition in philosophical accounts of heuristic reasoning in scientific discovery.
Article
Full-text available
Hypotheses and theories are essential constituents of the scientific method. Many vaccinologists are unaware that the problems they try to solve are mostly inverse problems that consist in imagining what could bring about a desired outcome. An inverse problem starts with the result and tries to guess what are the multiple causes that could have produced it. Compared to the usual direct scientific problems that start with the causes and derive or calculate the results using deductive reasoning and known mechanisms, solving an inverse problem uses a less reliable inductive approach and requires the development of a theoretical model that may have different solutions or none at all. Unsuccessful attempts to solve inverse problems in HIV vaccinology by reductionist methods, systems biology and structure-based reverse vaccinology are described. The popular strategy known as rational vaccine design is unable to solve the multiple inverse problems faced by HIV vaccine developers. The term “rational” is derived from “rational drug design” which uses the 3D structure of a biological target for designing molecules that will selectively bind to it and inhibit its biological activity. In vaccine design, however, the word “rational” simply means that the investigator is concentrating on parts of the system for which molecular information is available. The economist and Nobel laureate Herbert Simon introduced the concept of “bounded rationality” to explain why the complexity of the world economic system makes it impossible, for instance, to predict an event like the financial crash of 2007–2008. Humans always operate under unavoidable constraints such as insufficient information, a limited capacity to process huge amounts of data and a limited amount of time available to reach a decision. Such limitations always prevent us from achieving the complete understanding and optimization of a complex system that would be needed to achieve a truly rational design process. 
This is why the complexity of the human immune system prevents us from rationally designing an HIV vaccine by solving inverse problems.
Chapter
Most physicists and cosmologists who believe that the universe is, or can be, eternal justify this belief by the fact that one may, conceivably, construct an eternal cosmological model, that is, a cosmological model that includes or entails an eternal universe. But is this correct? Can an eternal cosmological model, by itself, justify belief in the possibility of an eternal universe? In this chapter I argue that the answer to this question is negative. First, I argue that, if one does not engage with the philosophical arguments for a beginning of the universe, and if one does not include in the pool of explanatory options the hypothesis of an absolute beginning when evaluating which hypothesis best explains the discoveries of modern cosmology, then one's belief in the possibility of an eternal universe cannot be justified solely by the fact that there exist several eternal cosmological models. I then argue, second, that even if an eternal cosmological model can justify this belief, no such model is currently successful, and the hypothesis that the universe had a beginning is, at present, the best explanation of the discoveries of cosmology.
Article
The demarcation problem is one of the classic questions in the philosophy of science. It has not received much attention among philosophers of science in recent decades compared with that given to it at the beginning of the twentieth century, yet it remains important. We believe there are several ways of seeking a solution to the challenge. We present a proposal that has its origins in the philosophy of science of Thomas Kuhn (1973): theory choice is based on the cognitive and pragmatic virtues of theories.
Chapter
Statistical inference is the inference of properties of the distribution of variables of a population from a sample selected from that population (Fig. 13.1: statistical inference is the inference of properties of the probability distribution of variables). To do statistical inference, your conceptual research framework should define the relevant statistical structures, namely a population and one or more random variables (Chap. 8, Conceptual Frameworks). The probability distributions of the variables over the population are usually unknown. This chapter is required for Chap. 20 on statistical difference-making experiments, but not for the other chapters that follow.
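The inference the chapter describes, from a sample to properties of an unknown population distribution, can be sketched minimally. The population, sample size, and normal-approximation interval below are our illustrative assumptions, not the chapter's example:

```python
import random
import statistics

random.seed(0)
# Hypothetical population whose distribution we pretend not to know.
population = [random.gauss(100, 15) for _ in range(100_000)]

# Infer the population mean from a random sample of 400 units.
sample = random.sample(population, 400)
mean = statistics.mean(sample)
sem = statistics.stdev(sample) / len(sample) ** 0.5  # standard error
ci = (mean - 1.96 * sem, mean + 1.96 * sem)          # ~95% normal-approx CI

assert ci[0] < mean < ci[1]
```

The interval quantifies the uncertainty of the inference; with this sample size the estimate should land close to the true mean of 100.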
Chapter
This chapter characterises quantum chemistry, the main reducing agent in chemistry, as a Lakatosian research programme. The attraction of the concept of a research programme is that it allows us to think of quantum chemistry not as a single theory, but as a succession (or network) of co-operating theories. This in turn allows for a more balanced discussion on how quantum chemistry can be characterised as a reducing theory. It also allows us to consider a number of questions on the internal structure of quantum chemistry, such as the question whether quantum chemistry is a progressive or degenerating research programme.
Article
The paper reviews three modes of rational inference: deductive, inductive and probabilistic. Many examples of each can be found in scientific endeavour, professional practice and public discourse. However, while the strengths and weaknesses of deductive and inductive inference are well established, the implications of the emerging probabilistic orientation are still being worked through. The paper discusses some of the recent findings in psychology and philosophy, and speculates about the implications for scientific and professional practice in general and OR in particular. It is suggested that the probabilistic orientation and Bayesian approach can provide an epistemological lens through which to view the claims of different approaches to inference. Some suggestions for further research are made.
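The probabilistic mode of inference the paper discusses centres on Bayesian updating, which can be shown with a standard worked example (the numbers are our own illustration, not the paper's):

```python
# Bayes' rule: P(H|E) = P(E|H) * P(H) / P(E)
prior = 0.01        # P(H): base rate of the hypothesis
sensitivity = 0.95  # P(E|H): probability of the evidence if H is true
false_pos = 0.05    # P(E|not H): probability of the evidence if H is false

# Total probability of observing the evidence.
p_e = sensitivity * prior + false_pos * (1 - prior)
posterior = sensitivity * prior / p_e

# Strong evidence updates a low prior only to ~0.16, not near-certainty:
# a result deduction and simple induction both fail to capture.
assert 0.15 < posterior < 0.17
```

This base-rate effect is one reason the probabilistic orientation yields different practical conclusions from deductive or inductive inference alone.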
Book
Full-text available
The aim of this book is to change the nature of science. At present science is shaped by the orthodox view that scientific theories are accepted and rejected impartially with respect to evidence, no permanent assumption being made about the world independently of the evidence. This view is untenable. We need a new orthodoxy, which sees science as making a hierarchy of increasingly attenuated metaphysical assumptions concerning the comprehensibility and knowability of the universe. This is the new conception of science argued for in this book. This new conception has a number of implications for the nature of science. One is that it is part of current scientific knowledge that the universe is comprehensible, even physically comprehensible (something not recognized at present). Another is that metaphysics and philosophy, instead of being excluded from science, are actually central to scientific knowledge. Another is that science possesses a rational, if fallible and non-mechanical method of discovery (not at present adequately understood). And yet another implication is that the whole picture of scientific method and rationality needs to be changed.
Article
Naive deductive accounts of confirmation have the undesirable consequence that if E confirms H, then E also confirms the conjunction H & X, for any X - even if X is utterly irrelevant to H (and E). Bayesian accounts of confirmation also have this property (in the case of deductive evidence). Several Bayesians have attempted to soften the impact of this fact by arguing that - according to Bayesian accounts of confirmation - E will confirm the conjunction H & X less strongly than E confirms H (again, in the case of deductive evidence). I argue that existing Bayesian "resolutions" of this problem are inadequate in several important respects. In the end, I suggest a new-and-improved Bayesian account (and understanding) of the problem of irrelevant conjunction.
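The Bayesian fact the abstract starts from can be checked with a few lines of arithmetic. In this sketch all the probabilities are invented for illustration (they do not come from the paper): H entails E, X is an irrelevant conjunct independent of H, and confirmation is measured by the difference between posterior and prior.

```python
# Illustrative numbers only -- not from the paper.
p_h = 0.3        # prior of hypothesis H
p_x = 0.5        # prior of irrelevant conjunct X (independent of H)
p_e = 0.6        # prior probability of evidence E, where H entails E

p_hx = p_h * p_x                 # prior of the conjunction H & X

# Since H entails E, P(E|H) = P(E|H&X) = 1, so Bayes' theorem gives:
post_h  = p_h  * 1 / p_e         # P(H|E)   = 0.5
post_hx = p_hx * 1 / p_e         # P(H&X|E) = 0.25

boost_h  = post_h  - p_h         # difference-measure confirmation of H
boost_hx = post_hx - p_hx        # same measure for H & X

assert post_hx > p_hx            # E still confirms the conjunction...
assert boost_hx < boost_h        # ...but less strongly than it confirms H
```

The second assertion is the softening move the abstract attributes to "several Bayesians": on the difference measure the conjunction is confirmed, but by less, simply because its prior is smaller.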
Article
What has science actually achieved? A theory of achievement should (1) define what has been achieved, (2) describe the means or methods used in science, and (3) explain how such methods lead to such achievements. Predictive accuracy is one truth-related achievement of science, and there is an explanation of why common scientific practices (of trading off simplicity and fit) tend to increase predictive accuracy. Akaike's explanation for the success of AIC is limited to interpolative predictive accuracy. But therein lies the strength of the general framework, for it also provides a clear formulation of many open problems of research.
Article
Criteria of scientific value are of different kinds. This paper concerns ultimate criteria, i.e. the axiology of science. Most ultimate criteria are multi‐dimensional. This gives rise to an aggregation problem, which cannot be adequately solved with reference to attitudes and behaviour within the scientific community. Therefore, in many cases, there is no fact of the matter as to whether one theory is better than another. This, in turn, creates problems for methodology.
Article
Philosophers of science have paid little attention, positive or negative, to Lyotard’s book The postmodern condition, even though it has been popular in other fields. We set out some of the reasons for this neglect. Lyotard thought that sciences could be justified by non-scientific narratives (a position he later abandoned). We show why this is unacceptable, and why many of Lyotard’s characterisations of science are either implausible or are narrowly positivist. One of Lyotard’s themes is that the nature of knowledge has changed and thereby so has society itself. However, much of what Lyotard says muddles epistemological matters about the definition of ‘knowledge’ with sociological claims about how information circulates in modern society. We distinguish two kinds of legitimation of science: epistemic and socio-political. In proclaiming ‘incredulity towards metanarratives’, Lyotard has nothing to say about how epistemic and methodological principles are to be justified (legitimated). He also gives a bad argument as to why there can be no epistemic legitimation, which is based on an act/content confusion, and a confusion between making an agreement and the content of what is agreed to. As for socio-political legitimation, Lyotard’s discussion remains at the abstract level of science as a whole rather than at the level of the particular applications of sciences. Moreover, his positive points can be accepted without taking on board any of his postmodernist account of science. Finally we argue that Lyotard’s account of paralogy, which is meant to provide a ‘postmodern’ style of justification, is a failure.
Article
The detailed analysis of a particular quasi-historical numerical example is used to illustrate the way in which a Bayesian personalist approach to scientific inference resolves the Duhemian problem of which of a conjunction of hypotheses to reject when they jointly yield a prediction which is refuted. Numbers intended to be approximately historically accurate for my example show, in agreement with the views of Lakatos, that a refutation need have astonishingly little effect on a scientist's confidence in the ‘hard core’ of a successful research programme even when a comparable confirmation would greatly enhance that confidence (an initial confidence of 0.9 fell by a fraction of a percent in the refutation case and rose to only a fraction of a percent short of unity in the comparable confirmation case). Timeo Danaos et dona ferentis (‘I fear the Greeks, even bearing gifts’).
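The shape of the calculation the abstract reports can be sketched as a Dorling-style Bayesian analysis. All the numbers below are invented for illustration (they are not the paper's quasi-historical figures): a hard core H and an auxiliary A with independent priors jointly entail a prediction that the evidence E refutes, and the likelihoods are chosen so that the anomaly is readily explained by a false auxiliary but not by a false hard core.

```python
# Illustrative Dorling-style sketch; numbers invented, NOT the paper's figures.
p_h, p_a = 0.9, 0.6          # priors: hard core H, auxiliary A (independent)

# Likelihoods of the refuting evidence E in each cell of the partition.
# H & A jointly entail the failed prediction, so P(E | H, A) = 0.
lik = {
    (True, True):  0.0,
    (True, False): 0.5,      # a false auxiliary readily produces the anomaly
    (False, True): 0.05,     # with A true, E is unlikely even if H is false
    (False, False): 0.5,
}

def prior(h, a):
    return (p_h if h else 1 - p_h) * (p_a if a else 1 - p_a)

p_e = sum(lik[h, a] * prior(h, a) for h in (True, False) for a in (True, False))
post_h = sum(lik[True, a] * prior(True, a) for a in (True, False)) / p_e
post_a = sum(lik[h, True] * prior(h, True) for h in (True, False)) / p_e

print(round(post_h, 3), round(post_a, 3))   # → 0.887 0.015
```

With these made-up likelihoods the refutation barely dents the hard core (0.9 → about 0.89) while confidence in the auxiliary collapses (0.6 → about 0.015): the Duhemian blame lands almost entirely on the auxiliary hypothesis, which is the qualitative pattern the abstract describes.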
Article
In his book The Retreat to Commitment Professor Bartley raised an important problem: can rationalism (meaning by this something that contrasts, not with empiricism, but with irrationalism) be held in a rational way, that is, in a way that complies with its own requirements? Or is there bound to be something irrational in the rationalist's position? Briefly, Bartley's answer was that an element of irrationalism is involved in extant versions of rationalism; however, Bartley proposed a new version of rationalism that can, he claimed, be held in a way that is rational according to its own account of rationality. Bartley called this ‘Comprehensively Critical Rationalism’. (Being a bit of a mouthful, this is often abbreviated to ‘CCR’, a practice I will follow.)
Article
Many philosophers agree that Hume was not simply objecting to inductive inferences on the grounds of their logical invalidity and that his description of our inductive behaviour was inadequate, but none the less regard his argument against induction as irrefutable. I argue that this constellation of opinions contains a serious tension. In the light of the tension, I re-examine Hume's actual sceptical argument and show that the argument as it stands is valid but unsound. I argue that it can only be converted into a sound one if our inductive behaviour can be characterized as a process of rule-governed ampliation. Drawing on some Bayesian ideas, I argue that our inductive behaviour probably cannot be characterized in that way, so our immunity from Hume is secure. Finally, I compare my response to Hume's argument with some other well known responses.
Article
According to the main tradition in epistemology, knowledge is a variety of justified true belief, justification is an undefinable normative concept, and epistemic principles (principles about what justifies what) are necessary truths. According to the leading contemporary rival of the tradition, justification may be defined or explained in terms of reliability, thus permitting one to say that knowledge is reliable true belief and that epistemic principles are contingent. My aim here is to show that either of these approaches will yield a solution to the problem of induction. In particular, either of them makes it possible to ascertain the reliability of induction through induction itself. Such a procedure is usually dismissed as circular, but I shall argue that it cannot be so dismissed if either approach is correct. As one would expect, the solution based on the traditional approach differs from the one based on the reliabilist alternative, but they have important features in common. The common elements are presented in sections I, II, and III, the elements specific to the reliabilist approach in sections IV, V, and VI, and those specific to the traditional approach in section VII.
Article
A Bayesian account of the virtue of unification is given. On this account, the ability of a theory to unify disparate phenomena consists in the ability of the theory to render such phenomena informationally relevant to each other. It is shown that such ability contributes to the evidential support of the theory, and hence that preference for theories that unify the phenomena need not, on a Bayesian account, be built into the prior probabilities of theories.
Article
Intuitionistic meta-methodologies, which abound in recent philosophy of science, take the criterion of success for theories of scientific rationality to be whether those theories adequately explicate our intuitive judgments of rationality in exemplary cases. Garber's (1985) critique of Laudan's (1977) intuitionistic meta-methodology, correct as far as it goes, does not go far enough. Indeed, Garber himself advocates a form of intuitionistic meta-methodology; he merely denies any special role for historical (as opposed to contemporary or imaginary) test cases. What all such positions lack is a base from which to inform, criticize, or restructure our core methodological intuitions. To acquiesce in this is to deny that exemplary cases can serve the sort of warranting role required for intuitionism. This point is reinforced by a series of reasons for denying the warranting role of pre-analytic judgments of rationality. These reasons point the way toward an improved approach to meta-methodology.
Article
The central problem with Bayesian philosophy of science is that it cannot take account of the relevance of simplicity and unification to confirmation, induction, and scientific inference. The standard Bayesian folklore about factoring simplicity into the priors, and convergence theorems as a way of grounding their objectivity are some of the myths that Earman's book does not address adequately.
Article
This paper examines the standard Bayesian solution to the Quine–Duhem problem, the problem of distributing blame between a theory and its auxiliary hypotheses in the aftermath of a failed prediction. The standard solution, I argue, begs the question against those who claim that the problem has no solution. I then provide an alternative Bayesian solution that is not question-begging and that turns out to have some interesting and desirable properties not possessed by the standard solution. This solution opens the way to a satisfying treatment of a problem concerning ad hoc auxiliary hypotheses.
Article
Is a theory that makes successful predictions of new facts better than one that does not? Does a fact provide better evidence for a theory if it was not known before being deduced from the theory? These questions can be answered by analyzing historical cases. Einstein's successful prediction of gravitational light bending from his general theory of relativity has been presented as an important example of how "real" science works (in contrast to alleged pseudosciences like psychoanalysis). But, while this success gained favorable publicity for the theory, most scientists did not give it any more weight than the deduction of the advance of Mercury's perihelion (a phenomenon known for several decades). The fact that scientists often use the word "prediction" to describe the deduction of such previously known facts suggests that novelty may be of little importance in evaluating theories. It may even detract from the evidential value of a fact, until it is clear that competing theories cannot account for the new fact.