Chapter

Falsification and the Methodology of Scientific Research Programmes


Abstract

Two books have been particularly influential in contemporary philosophy of science: Karl R. Popper's Logic of Scientific Discovery, and Thomas S. Kuhn's Structure of Scientific Revolutions. Both agree upon the importance of revolutions in science, but differ about the role of criticism in science's revolutionary growth. This volume arose out of a symposium on Kuhn's work, with Popper in the chair, at an international colloquium held in London in 1965. The book begins with Kuhn's statement of his position followed by seven essays offering criticism and analysis, and finally by Kuhn's reply. The book will interest senior undergraduates and graduate students of the philosophy and history of science, as well as professional philosophers, philosophically inclined scientists, and some psychologists and sociologists.


... To develop an integrative conceptual lens, in this paper we follow Lakatos (1970) to first define, then juxtapose, and finally integrate the structural and behavioral perspectives on networks and networking. We summarize this multi-step approach in Figure 1. ...
... According to Lakatos (1970, 1974), theories must be progressive: they must have conceptual and empirical distinctiveness relative to competing theoretical approaches (i.e., approaches taken in different research programs) and even complementary ones (i.e., approaches within the same research program). Conceptual distinctiveness pertains to the intellectual elements that define a perspective and make a theory self-sustaining, and empirical distinctiveness pertains to findings that qualify and reinvigorate key theoretical assumptions. ...
... Juxtaposing similarities and elucidating differences in assumptions and causal mechanisms are required for the integration and further development of the two theoretical perspectives (Lakatos 1970). These similarities and differences are reflected in conceptual premises about network search, epistemological assumptions and methodologies, agency or opportunity as a source of action, and causal mechanisms (summarized in Table 2). ...
Article
Full-text available
Research on social networks and networking has generated critical insights for theories and practices about social action in the work context. These insights mostly derive from two theoretical perspectives: one focuses on the network structures that influence individual actions and outcomes, and the other examines the behaviors that people manifest when they interact with others at work. Through theoretical comparison and development, the goal of this conceptual article is to define, compare, and identify areas for integration of structural and behavioral perspectives on social action. To guide future work, we augment the behavioral focus of networking scholarship with a structural emphasis and vice versa.
... Lakatos held that disciplines develop not through linear accumulation but through the emergence of, and response to, anomalies (Lakatos, 1976a). This view parallels his account of the development of mathematics as a process of proofs and refutations (Lakatos, 1976b). Regarding the development of mathematics, Lakatos, unlike the earlier absolutist philosophy of mathematics, questioned the view that a mathematical theorem, once proven, is finally true (An et al., 2010; Lee, 2013). ...
... Lakatos also introduced ways of responding when a counterexample to a conjecture appears (Jung et al., 2013; You & Lee, 2008). Lakatos (1976a) takes a similar position on the development of disciplines other than mathematics. First, Lakatos presents the research programme as the basic unit for describing research. ...
... They argue that SIMs are intimately tied to episodic social processes that challenge the status quo and catalyze scientific growth, aiming to produce and diffuse new knowledge. Their concept resembles the older notion that research programs may exhibit a 'lifecycle' with progressive and degenerative stages (Lakatos, 1970). ...
... Students trained by leading intellectuals possess the skills to work in the direction of their mentors and can benefit from the opportunities associated with an emerging field of research. These early generations can build, in a local and cumulative fashion, a progressive research program (Lakatos, 1970). Depending on the breadth and potential of a new intellectual movement, retaining one's graduates can thus be beneficial for more than one generation of scholars. ...
... The protective belt of a Research Programme, in turn, is formed by auxiliary hypotheses and theories created and articulated to protect the hard core, redirecting the Programme's own objectives if necessary. By definition, this belt must absorb and minimize the impact of criticisms of the hard core, adjusting itself, or even being completely replaced, in order to defend the core, which is thereby strengthened (Lakatos, 1976). ...
... Finally, the negative heuristic prevents the Ethnomathematics Programme from being declared false because of some anomaly or refutation; that is, falsity falls on the belt of auxiliary hypotheses and not on the hard core (Lakatos, 1976). The positive heuristic then assumes the responsibility of establishing the rules needed to modify the belt and eliminate such anomalies. ...
Article
Full-text available
Abstract: The Ethnomathematics Program, organized intellectually by Ubiratan D'Ambrosio, is a Lakatosian research program composed of a hard core, a protective belt, and positive and negative heuristics. Education for Ethnic-Racial Relations (ERER, acronym in Portuguese) is one of the many achievements of the Brazilian Black Movements, which consider education as a means against racism and social inequalities. In this theoretical article, we argue that the principles of ERER (i) political and historical awareness of diversity; (ii) strengthening identities and rights; and (iii) educational actions to combat racism and discrimination are historically aligned with the principles of the Program, characterizing it as an Anti-Racist Program. As a body that expands and reflects on issues relevant to society, especially for those who have historically been regulated, we conclude by suggesting that the Ethnomathematics Program community adopt ERER as part of its hard core. Keywords: Ethnomathematics Program. Education for Ethnic-Racial Relations. Anti-Racist Education.
... The latter approach significantly revises the Popperian idea of falsification based on several examples from the history of science showing that "falsification in the sense of naïve falsificationism (corroborated counterevidence) is not a sufficient condition for eliminating a specific theory: In spite of hundreds of known anomalies we do not regard it as falsified (that is, eliminated)" (Lakatos, 1970, p. 121). [1] In Lakatosian terms, the simplicity of integer ratios could represent the scientific core, with several auxiliary pieces of evidence that defend the core (Lakatos, 1970). Examples of this auxiliary evidence might stem from neurophysiological findings that the auditory system tolerates small deviations from perfect-integer relationships (Tramo et al., 2001), or that neural processing of equal-temperament intervals is influenced by the degree of consonance, despite the complex ratios present in equal temperament (e.g., Bones et al., 2014; Itoh et al., 2003, 2010). ...
... [2] In such an approach, therefore, the evidence showing the influence of historical, individual, and cultural factors on consonance perception should not be hastily dismissed as a refutation of the core, but treated as part of the normal life course of a progressive research programme (Lakatos, 1970). ...
... By 'research programs', we refer to 'theorizing environments grouped around a "hard core" of shared assumptions' (Lakatos, 1970; Leone et al., 2021, p. 728). This 'hard core' is a system of shared meanings that constitutes what types of theories, methods, and phenomena participants of a research program find appropriate to work from and study (Ketokivi et al., 2017). ...
Article
Full-text available
Management scholarship's apparent lack of impact is a misconception based on the presumption that impact involves a direct and visible influence of papers or research projects on management practice. Theory-building impacts management practice in manifold, sometimes indirect and unnoticed, ways. Supported by intermediaries such as management education, the media, and consulting, impacts emerge through interest-driven knowledge production that contributes to the wider uptake and reproduction of management theory's main ideas and assumptions. We draw on Jürgen Habermas's theory of knowledge and human interests, aiming to expand how impact from scholarship can be understood, and what forms it might take as part of the kinds of knowledge-constitutive interests that are pursued through theory-building. We elaborate these different forms, building a pluralist framework of what we call 'programmatic' and 'hybrid' types of impact. We advance the argument that diverse knowledge-constitutive interests pursued through theory-building contribute to our field's impact on management practice in distinct, yet complementary ways.
... 15 This is the terminology utilized in Pasinetti (1981, 2007), while in Pasinetti (1986) slightly different expressions are employed: 'pure preference' and 'pure labour', respectively. 16 The original sources are Kuhn (1962) and Lakatos (1970, 1971). The application of the methodology of scientific research programmes to the more or less recent history of economics has been one of the most widely debated issues among economic methodologists and historians of economic thought. ...
... For that reason, we do not think it is productive to engage in debates about whether or not people are rational. Rationality, of whatever denomination, provides a framework in which we can develop models of human behavior (Lakatos, 1970). Individual models are falsifiable: they generate specific predictions that we can test in further experiments. ...
Preprint
Full-text available
A new approach to understanding irrational behavior that provides a framework for deriving new models of human cognition. What does it mean to act rationally? Mathematicians, economists, and statisticians have argued that a rational actor chooses actions that maximize their expected utility. And yet people routinely act in ways that violate this prescription. Our limited time and computational resources mean that it is often unrealistic to consider all options in order to choose the one that has the greatest utility. This book suggests a different approach to understanding irrational behavior: resource-rational analysis. By reframing questions of rational action in terms of how we should make the best use of our limited resources, the book offers a new take on fundamental questions at the heart of cognitive psychology, behavioral economics, and the design of artificial intelligence systems. The book presents a formal framework for applying resource-rational analysis to understand and improve human behavior, a set of tools developed by the authors to make this easier, and examples of how they have used this approach to revisit classic questions about human cognition, pose new ones, and enhance human rationality. The book will be a valuable resource for psychologists, economists, and philosophers as well as neuroscientists studying human brains and minds and computer scientists working to reproduce such systems in machines.
... Simplicity is considered a key virtue for scientific explanations (White, 2005; Qu, 2023; MacKay, 2003). However, there are many forms of simplicity that may be chosen, and these may rank explanations differently (Lakatos, 1970). We consider the three main measures of simplicity: Parsimony, Conciseness, and Complexity. ...
Preprint
Full-text available
Mechanistic Interpretability (MI) aims to understand neural networks through causal explanations. Though MI has many explanation-generating methods, progress has been limited by the lack of a universal approach to evaluating explanations. Here we analyse the fundamental question "What makes a good explanation?" We introduce a pluralist Explanatory Virtues Framework drawing on four perspectives from the Philosophy of Science - the Bayesian, Kuhnian, Deutschian, and Nomological - to systematically evaluate and improve explanations in MI. We find that Compact Proofs consider many explanatory virtues and are hence a promising approach. Fruitful research directions implied by our framework include (1) clearly defining explanatory simplicity, (2) focusing on unifying explanations and (3) deriving universal principles for neural networks. Improved MI methods enhance our ability to monitor, predict, and steer AI systems.
... Errors or Unfair Attacks? The Case of REBT Research. Lakatos (1970) stated that mainstream science exposes only its theories and models to the community and tries to build a protective belt around its assumptions. Because assumptions can be neither falsified nor verified, this is a sound practice, one that keeps the debate focused on theories and driven by empirical evidence. ...
Article
Full-text available
This article presents a synopsis of rational-emotive behavior therapy (REBT), its basic theoretical framework, its applications, and future directions. The paper is organized as follows: the first section discusses fundamental/basic REBT research; the second presents clinical/applied REBT research, including efficacy and effectiveness, discrimination of the disorders for which REBT works most effectively, and its relationship to other therapies. The use and misuse of REBT, and their implications for future research and development, are also presented. Although it is true that REBT research has many shortcomings, overgeneralization and/or magnification of the negatives and minimization of the positives are dysfunctional beliefs that sustain the mistaken idea in the field that REBT has little empirical research and that REBT research is of no use for serious problems. A balanced approach, analyzing strengths and weaknesses, shows that REBT has hundreds of research articles, and high-quality studies tend to support REBT's basic theory and efficacy. However, to strengthen these conclusions and to fully explore REBT's potential, the shortcomings of REBT research need to be corrected, and high-quality research needs to be promoted. This is particularly important because, although effective, psycho-behavioral therapies have not yet reached the "desired standard" in terms of efficacy and effectiveness, as about 30-40% of people remain unresponsive to these interventions. REBT could thus become a platform for reviving empirical study of the efficacy/effectiveness and theory of cognitive-behavioral models of psychopathology and human functioning.
... This picture is inspired by a Lakatosian image of scientific progress. See Lakatos (1978/1999); see also Andersen (2012). ...
Article
Full-text available
One important question in emotion science is determining what emotions there are. To answer this question, researchers have assumed either that folk emotion concepts are unsuitable for scientific inquiry, or that they are constitutive or explanatorily significant for emotion research. Either option faces a challenge from the cultural variability of folk emotion concepts, prompting debate on the universality of emotions. I contend that cultural variation in emotion should be construed as variations in components rather than entire emotional repertoires. To do this, I distinguish between hypotheses concerning emotional repertoires and those focused on specific emotional features within various cultural contexts. I hold that decisions regarding emotional repertoire hypotheses call for either revising current classification systems or maintaining them, but that, given underdetermination by evidence, this entails a preference for maintaining emotion taxonomies. This, in turn, leaves empirical hypotheses on specific emotional features as the most viable avenue for scientific inquiry.
... They are seen not only as desirable, but as defining features of the scientific enterprise. Lakatos (1970), for example, attributes enormous weight to the fruitfulness, or heuristic capacity, of a scientific theory, placing it as the main criterion when it comes to objective and rational theory choice. For Kuhn (1977) and Laudan (1977), problem-solving capacity and explanatory capacity are placed at the same level, understood as a sign of scientific progress. ...
Article
George Jackson Mivart (1827-1900) was a formidable adversary of Charles Robert Darwin (1809-1882) and his acolytes, contesting the sufficiency of the Theory of Descent with Modification by Natural Selection as an answer to the question of the origin of species. In this work we examine this dispute between Mivart and Darwin and the qualities of Darwin as a disputant that emerge from it. We use Dialectical Networks, a diagrammatic system, to represent the assertions of both and their interrelations. The dispute reinforces the value of erudition, because both men, drawing on their extensive knowledge of diverse areas of biology, produced multiple and diversified arguments by exemplification.
... It can be beneficial to clearly delineate the key concepts that are used to describe the scientific interplay between logical reasoning and empirical observations and measurement: A theory is a structured set of abstract concepts refined through empirical testing (Popper, 1935; Kuhn, 1962; Lakatos, 1970). A model instantiates a theory by formalizing specific assumptions, constraints, and parameters to generate testable predictions (Suppes, 1960; Van Fraassen, 1980; Giere, 1990). ...
Preprint
Full-text available
This review synthesizes advances in predictive processing within the sensory cortex. Predictive processing theorizes that the brain continuously predicts sensory inputs, refining neuronal responses by highlighting prediction errors. We identify key computational primitives, such as stimulus adaptation, dendritic computation, excitatory/inhibitory balance and hierarchical processing, as central to this framework. Our review highlights convergences, such as top-down inputs and inhibitory interneurons shaping mismatch signals, and divergences, including species-specific hierarchies and modality-dependent layer roles. To address these conflicts, we propose experiments in mice and primates using in-vivo two-photon imaging and electrophysiological recordings to test whether temporal, motor, and omission mismatch stimuli engage shared or distinct mechanisms. The resulting dataset, collected and shared via the OpenScope program, will enable model validation and community analysis, fostering iterative refinement and refutability to decode the neural circuits of predictive processing.
... If such a holistic, interdisciplinary resilience research programme is viewed through a Lakatosian lens [13,14], its 'hard core' and 'protective belt' must be defined to begin with. This task is approached here through four philosophical categories: ontological, epistemological, axiological, and methodological. ...
... The above supported the need for psychology to deal with the mental, despite the behaviorist prohibition. However, what allowed the scientific turn to be completed in a research program was the invention by Alan Turing (1951a) of Turing machines, and in particular the Universal Machine, which was able to imitate any process sufficiently specified to be recorded as a computer program consisting of algorithmically specified procedures [3]. The reason the computer managed to bring the mind back into scientific study was precisely that it showed us that the problem of psychophysical causality could be solved. ...
Article
Full-text available
In cognitive psychology, the term representation refers to the way information is mentally represented through coded symbols. It is a construction of the mind that results from the processing of external stimuli and is of primary importance in problem solving, communication and education.
... are properly established, the hypothesis H that is being tested must indeed be rejected. (Hempel, 1966, p. 31) Imre Lakatos also embedded modus tollens in his account of the methodology of scientific research programs (Lakatos, 1978). Much more recently, Alexander Bird seems to have embraced disconfirmation as modus tollens: "Imagine a case where we are investigating a hypothesis h. ...
Article
Full-text available
Scientific disconfirmation has often been thought to be reasoning by modus tollens. This interpretation, however, misconstrues the conditionals in this scientific reasoning in terms of the material conditional, rather than in terms of causal conditionals. Scientific confirmation has also been thought to be a logical fallacy, affirming the consequent. Once one embraces the idea that scientists are reasoning in terms of causal conditionals, rather than the material conditional, we can avoid the peculiarity of the view that scientific confirmation is based on a simple logical fallacy. Interpreting scientists as reasoning about physical consequences of hypotheses enables a more charitable interpretation of scientific disconfirmation and confirmation.
... Meanwhile, companies expect AI models to perform complex work tasks well. Building capacity for the professional use of LLMs is impossible without applying academic principles to their training, which should rest on the concept of a "core of knowledge" (Kuhn, 1977; Lakatos, 1963; Lakatos, 1970a; Lakatos, 1970b), so as to limit the possibility of errors and prevent professional incompetence in AI. It is proposed to combine increases in model parameters with the expansion of a corpus of high-quality academic sources, and to rethink training on a core of knowledge by interlinking subject domains (possibly within a RAG framework). ...
Article
Full-text available
Demand for generative artificial intelligence (GenAI) is growing rapidly owing to its ability to quickly process large volumes of data, compile them, and convey a "common opinion". However, the imbalance among GenAI's "competencies" hinders the wider use of this tool for solving complex professional tasks. AI works as a giant accumulator and reproducer of knowledge, but it is unable to interpret that knowledge or find the right application for it depending on context. A critical probability of error remains when generating answers even to the simplest questions. The article assesses the significance of the limitations inherent in GenAI. The underlying language models, including the latest versions, GPT-4o1 and GigaChat MAX, were tested using the authors' own set of questions based on Bloom's taxonomy. It was found that the probability of receiving a correct answer is practically independent of the number of model parameters and of question complexity and taxonomy level, and that it decreases when multiple choice is involved. The results confirm the assumption that modern AI tools cannot yet be applied for professional purposes. Options are proposed that could make a significant contribution to achieving at least a quasi-professional level.
... I. B. Cohen (1987) proposed a historical analysis of scientific revolutions through four universally applicable tests, acknowledging both the objective and subjective dimensions of discontinuity in science. Lakatos (1970) introduced the concept of scientific research programs, asserting that discontinuity occurs only when there is a change in the "hard core" of the program, a process he deemed rare. In contrast, Laudan (1984) criticized the notion of complete discontinuity in paradigmatic shifts and suggested a network model, in which the levels of science (ontological, methodological, axiological) are not hierarchically interconnected. ...
Article
Full-text available
In the article, we address the issue of a metamodern shift in geographical thought, reflecting on the context of the current Anthropocene polycrisis, which encompasses a range of environmental, geopolitical, economic, and socio-cultural challenges of the present era. We start from the assumption that postmodern epistemological and methodological frameworks are insufficient for a comprehensive understanding and resolution of these challenges. In this context, we explore the potential of metamodernism as a new philosophical and scientific platform that oscillates between modernist rationalism and postmodern skepticism, allowing for the productive integration of these frameworks. The primary objective of this study is to demonstrate how metamodernism can contribute to the reinterpretation of geographical thought and to identify its potential as the fifth first-order discontinuity in the historical development of this discipline. In the theoretical and methodological section, we discuss discontinuities in scientific thought and apply the Latour-Barnes model to analyze the phases of mobilization and autonomization of metamodernism within academic discourse. We introduce key metamodernist concepts and principles - metarealism, zetetism, hylosemiotics, sublation, oscillation of scientific discourses, the paradoxical position of truth and grand narratives, dia/polylogical thinking, and the coexistence of layers of cultural evolution (Pipere, Mārtinsone, 2023, Storm, 2021), — and outline their applicability in geographical research. We employ qualitative, discourse-based, and historical-contextual methods to examine the metamodern shift in geographical thought, focusing on epistemological, ontological, and methodological transformations. We reinterpret geography as a post-disciplinary and post-paradigmatic scientific discipline that oscillates between various ontological, epistemological, and methodological frameworks. 
In this context, we emphasize the necessity for an open, reflective, and pluralistic approach that facilitates the integration of diverse types of knowledge and methodological strategies. Understood through the lens of metamodernism, geography becomes a field of dynamic oscillation between the natural sciences, social sciences, humanities, and technological interpretations of reality. This conceptualization of geography addresses the need for comprehensive, practice-oriented knowledge that can tackle contemporary global challenges, such as polycrisis. This aligns with zetetic epistemology, which prioritizes abductive reasoning (inference to the best explanation) over rigid deductive or inductive models. We introduce hylosemiotics as a methodological tool that enables researchers to analyze material-symbolic interactions in space and place. This approach integrates semiotic analysis with material studies, providing a novel framework for interpreting geographical landscapes. In doing so, we aim to encourage discussions about applying metamodernist concepts in geographical thought while also acknowledging its limitations and potential risks. Moreover, we underscore the necessity for further theoretical and empirical reflection to refine methodological strategies and practical applications of the metamodernist framework in geographical research.
... The methodology for evaluating scientific theories developed by Cristin Chall argues that the best way to conceptualize theories that are used to evaluate scientific models, and that aim, or attempt, to be empirical, is a hybrid of the Scientific Research Programmes (SRPs) introduced by Imre Lakatos and the problem-solving capacity of scientific rationality taken from Larry Laudan's methodology of research traditions. (Chall 2019, 7) On this topic, in his paper "Falsification and the Methodology of Scientific Research Programmes" (Lakatos 1970), Lakatos strikes a compromise between Popper's falsificationism and Kuhn's paradigms by proposing a new unit of scientific progress: the SRP, or Scientific Research Programme. A research programme is centered around a hard core of scientific laws, theses, and assumptions. ...
Article
The starting point of this essay is to answer the question of why the so-called Past State of the Universe is an epistemic problem worth addressing, among other reasons because refusing to accept its validity would lead to a "skeptical catastrophe". We then review the theory-evaluation methodology proposed by Dr. Cristin Chall, who developed it as a hybrid of the scientific research programmes (SRPs) introduced by Imre Lakatos and the problem-solving capacity of scientific rationality taken from Larry Laudan's methodology of research traditions. Having reviewed Chall's methodology, we apply it to two theories (inflation and Conformal Cyclic Cosmology) that deal with the so-called cosmological Past State, in order to try to determine which of them is epistemically more viable. It should be noted that our purpose is not to invalidate either of these theories, but rather to carry out an exercise in evaluating epistemic virtues from the philosophy of science, in this case focused on cosmology.
... Lakatos's key insight is that the balance of existing evidence is not always a good indicator of truth. Sometimes, scientific theories that lack evidential support later prove superior to their competitors (Lakatos 1970). For example, the view that peptic ulcers are often caused by H. pylori bacteria is now well-established. ...
Article
Full-text available
Identity-protective reasoning – motivated reasoning driven by defending a social identity – is often dismissed as a paradigm of epistemic vice and a key driver of democratic dysfunction. Against this view, I argue that identity-protective reasoning can play a positive epistemic role, both individually and collectively. Collectively, it facilitates an effective division of cognitive labor by enabling groups to test divergent beliefs, serving as an epistemic insurance policy against the possibility that the total evidence is misleading. Individually, it can correct for the distortions that arise from taking ideologically skewed evidence at face value. This is particularly significant for members of marginalized groups, who frequently encounter evidence that diminishes the value of their identities, beliefs, and practices. For them, identity-protective reasoning can counter dominant ideological ignorance and foster resistant standpoint development. While identity-protective reasoning is not without risks, its application from marginalized and counter-hegemonic positions carries epistemic benefits crucial in democracies threatened by elite capture. Against dominant views in contemporary political epistemology and psychology, identity-protective reasoning should be reconceived as a resource to be harnessed and not a problem to be eradicated.
... Indeed, this paper is an example of such analysis: IIT's definition of consciousness is taken as an axiom and we discuss different ways it might account for the explanandum of phenomenal binding, recognising that such accounts vary in their plausibility. Importantly, candidate bridging principle axioms can sometimes be combined with other axioms to make falsifiable predictions (as with IIT), supporting progress through the scientific method and adjusting confidence in underlying theoretical models as described by [29] and discussed in the context of consciousness theories by [30]-even if perfect certainty remains a chimaera. ...
Article
Full-text available
Theories of consciousness grounded in neuroscience must explain the phenomenal binding problem, e.g., how micro-units of information are combined to create the macro-scale conscious experience common to human phenomenology. An example is how single ‘pixels’ of a visual scene are experienced as a single holistic image in the ‘mind’s eye’, rather than as individual, separate, and massively parallel experiences, corresponding perhaps to individual neuron activations, neural ensembles, or foveal saccades, any of which could conceivably deliver identical functionality from an information processing point of view. There are multiple contested candidate solutions to the phenomenal binding problem. This paper explores how the metaphysical infrastructure of Integrated Information Theory (IIT) v4.0 can provide a distinctive solution. The solution—that particular entities aggregable from multiple units (‘complexes’) define existence—might work in a static picture, but introduces issues in a dynamic system. We ask what happens to our phenomenal self as the main complex moves around a biological neural network. Our account of conscious entities developing through time leads to an apparent dilemma for IIT theorists between non-local entity transitions and contiguous selves: the ‘dynamic entity evolution problem’. As well as specifying the dilemma, we describe three ways IIT might dissolve the dilemma before it gains traction. Clarifying IIT’s position on the phenomenal binding problem, potentially underpinned with novel empirical or theoretical research, helps researchers understand IIT and assess its plausibility. We see our paper as contributing to IIT’s current research emphasis on the shift from static to dynamic analysis.
... A widely held (quasi-Kantian) stance in 20th century philosophy of science maintains that scientific categories (e.g., 'planet,' 'bird')—and, by extension, social categories (e.g., 'legal court,' 'priest')—require the presupposition of a 'conceptual scheme': a system of concepts that organizes and gives meaning to experience. Conceptual schemes have been described under various names, including 'linguistic frameworks' (Carnap 1950), 'conceptual schemes' (Quine 1948, 1951, 1960; Putnam 1981), 'forms of life' (Wittgenstein 1953; Feyerabend 1978, 1981), 'paradigms' (Kuhn 1962/2012), 'the web of belief' (Quine 1951; Ullian 1970/1978), 'research programmes' (Lakatos 1968, 1970), 'research traditions' (Laudan 1977), 'standpoints' (Harding 1986, 1991; Wylie 2003, 2012), 'binary oppositions' (Levi-Strauss 1958/1963), 'ideologies' (Mannheim 1929/1936), 'thought collectives' (Fleck 1935/1981), 'epistemes' (Foucault 1966/1970, 1969/1972), 'networks' (Latour 1987, 2005), 'styles of reasoning' (Crombie 1981, 1994; Hacking 1992), 'perspectives' (Nagel 1986; Giere 2006; Massimi 2022), and 'relativized a priori principles' (Putnam 1962, 1976, 1979; Friedman 1999, 2001; Tsou 2003, 2010; Stump 2015). In philosophy of science, conventionalists about conceptual schemes (e.g., Poincaré 1902/2017, Carnap 1950, Quine 1951, Kuhn 1962) regard conceptual schemes as useful instruments (i.e., conventions) for predicting or explaining empirical phenomena, rather than theoretical systems that accurately represent reality. ...
Chapter
Full-text available
In this chapter, I critically examine issues relevant to the construction and reality of social categories, focusing on issues concerning conceptual schemes and conventionalism. Conceptual schemes (‘paradigms,’ ‘linguistic frameworks,’ ‘forms of life’) are systems of concepts that organize and give (intersubjective) meaning to empirical experience. In discussions about the construction of social categories, a common assumption is that social categories and kinds (e.g., ‘money,’ ‘marriage,’ ‘liberal’) require the presupposition of a conceptual scheme that gives meaning to those terms. One prominent position in social ontology (Lewis, 1969, Gilbert 1989, 2014, Searle 1995) holds that the reality of social categories is grounded in a conventionalist conceptual scheme: social categories are conventionally accepted categories that reflect explicit or implicit agreements within a community. According to this view, the reality of social categories and kinds is established (and maintained) by contingent and arbitrary conventional decisions about social categories (e.g., laws about legal tender, marriage laws, the one-drop rule). Conventions can be understood as social customs (e.g., rules governing traffic, marriage laws, institutionalized rules about sex and gender) that serve some human purpose or interest. They are important for the establishment and maintenance of social norms (see Bicchieri 2006, 2017). The chapter proceeds as follows. In section 2, I examine the general idea of conceptual schemes, focusing on Kuhn’s influential theory of paradigms. While Kuhn’s position implies a conceptual relativity and anti-realism (or instrumentalism) about scientific concepts, Davidson and Popper challenge the Kuhnian orthodoxy that the presence of conceptual schemes implies conceptual relativity or ‘incommensurability’ of competing conceptual schemes. 
I argue that a key difference between scientific and social conceptual schemes is that the former are oriented towards describing nature, whereas the latter are oriented towards social utility or usefulness. In section 3, I examine some foundational works in social ontology—focusing on the accounts of Lewis, Gilbert, and Searle—that hold that social categories are grounded by a conventionalist conceptual scheme. These accounts of social categories imply that the reality of social categories is constituted by (or grounded in) social conventions, which ultimately reflect (or are anchored by) explicit or implicit community-level agreements. In section 4, I survey some prominent accounts of social human categories, focusing on the views of Hacking, Khalidi, Mallon, Haslanger, and Ásta. A common tendency in these accounts is the portrayal of social categories as social to the extent that they are conventionally determined. This suggests that the proper contrast class for social categories is natural categories (as opposed to individual categories or innate categories).
... The research field is replete with strong and general claims about epistemic injustices that are not supported by empirical evidence. Most importantly, these unsupported claims are not just found randomly in the margins of the research field but even seem to constitute the field's conceptual core, akin to a Lakatosian hard core (Lakatos 1970). In Table 1, we have listed a selection of such core claims in the field. ...
Article
Full-text available
The research field of epistemic justice in healthcare has gained traction in the last decade. However, the importation of Miranda Fricker’s original philosophical framework to medicine raises several interrelated issues that have largely escaped attention. Instead of pushing forward, crafting new concepts or exploring other medical conditions, we suggest that it is time to take stock, reconsider, and articulate some fundamental issues that confront the field of epistemic injustice in healthcare. This paper articulates such fundamental issues, which we divide into scientific, conceptual, and theoretical issues. Scientifically, the research field is confronted by a lack of empirical evidence. It relies on cases, making generalizations impossible and the field vulnerable to bias. Conceptually, many of the claims advanced in the literature are presented as facts but are merely hypotheses to be tested. Moreover, a criterion for applying the concept of testimonial injustice in medicine is lacking, impeding the development of a construct to empirically measure said injustices. Theoretically, many of the cases discussed in the literature do not prima facie qualify as cases of testimonial injustice, since they lack necessary components of testimonial injustice in Fricker’s framework, i.e., being unintentional and caused by identity prejudices in the hearers. If epistemic injustice is as pervasive as it is claimed in this literature, it should be of concern to us all. Addressing the issues raised here may strengthen the conceptualization of epistemic injustice in healthcare and lead to development of constructs that finally can explore its empirical basis.
... Another critic of both Popper and Kuhn was the internationally renowned Hungarian philosopher of science Imre Lakatos. He argued that instead of talking about individual hypotheses or theories, and their crucial tests, we should divide scientific theories into a so-called "hard core," that is, the basic tenets or ideas that are treated as irrefutable by fiat, and a "protective belt of auxiliary hypotheses" that are under constant change and revision (see Lakatos 1970). ...
Book
Full-text available
This Element examines various aspects of the demarcation problem: finding a distinction between science and pseudoscience. Section 1 introduces issues surrounding pseudoscience in the recent literature. Popper's falsificationism is presented in Section 2, alongside some of its early critics, such as Thomas Kuhn and Imre Lakatos. It is followed in Section 3 by the notable criticism of the Popperian program by Larry Laudan that put the issue out of fashion for decades. Section 4 explores recent multi-criteria approaches that seek to define pseudoscience not only along a single criterion, but by considering the diversity and historical dimension of science. Section 5 introduces the problem of values (the 'new demarcation problem') and addresses how we can use values in the problem of pseudoscience. Finally, Section 6 concludes by emphasizing the need for an attitude-oriented approach over a rigid, method-based demarcation, recognizing scientific practice's evolving and multifaceted nature.
... The values represent the rules of the cognitive game and include commitments to honesty, consistency, respect for data, simplicity, plausibility, precision, problem solving, etc., while exemplars indicate the strong influence of successful or dominant practices on practitioners. Lakatos (1970, 1974, 1976) introduced the concept of a research program, which refers to a set of theories connected by heuristics and a common theoretical core. Lakatos's hard core is, in effect, a combination of Kuhn's symbolic generalizations, metaphysical commitments, and heuristics. ...
Conference Paper
Full-text available
Notwithstanding the pervasive influence of equilibrium theory in economic theorizing and policy, as well as its resilience, economics remains a contested discipline. The contests are rooted in the historical, philosophical, and methodological differences of different generations of Western philosophers, social reformers, and economists. This essay reviews the evolution of the philosophical and methodological contentions from the viewpoint of one who seeks knowledge of how to successfully transform the Nigerian political economy. The essay asks and attempts to answer epistemological, methodological, ethical, and metaphysical questions that probe how economics and its philosophical and methodological roots equip students and professional economists to be agents of economic transformation. The essay covers the main philosophical and methodological contentions from the time of Plato and Aristotle and their influences on the evolution of Western philosophy and methodology, and on economic thought, structure, and practice. The essay draws three conclusions about the future of economics. First, methodological convergence in economics may be neither feasible nor desirable. Second, the future of economics may lie in greater consensus on methodological pluralism and in shifting intellectual targets from the emphasis on general natural laws to social laws. Third, the performance of the economy will be more decisive in passing judgment on economic theories and theorizing procedures. Postscript: after the essay was published in 2003, the author delivered an Inaugural Lecture titled "Economics: Discipline in Need of a New Foundation" on March 5, 2014, at Ahmadu Bello University. The lecture explained why economics needed a virtuous foundation to replace the Mandeville-Bentham-Smith foundation built on vices (greed, self-centredness, covetousness, hedonism, love of money and wealth). The author found the virtuous foundation in the Bible.
From the Bible came Theonomics, a system of thought, meaning, and practice which, unlike the syncretic approach of the Scholastics, is rooted solely in Biblical themes, laws, rules, principles, and insights. The author and colleagues have been teaching Foundation of Theonomics at Bowen University, Iwo, since 2019.
... As we will demonstrate below, it is reasonable to assume that the same holds true for single-word recognition but that this hypothesis was not seriously considered in the past. We will derive and test new empirical predictions from this account, thus making sure that our argument is neither circular nor vacuous (Lakatos, 1970, 1978). ...
Article
Full-text available
How do people recognize objects they have encountered previously? Cognitive models of recognition memory aim to explain overt behavior using latent psychological processes, such as true recognition and pure guessing. Validation studies assess whether the mechanisms underlying cognitive models properly reflect the psychological processes they aim to explain. The present study provides such a validation study for models describing paired-word recognition—a paradigm in which participants have to categorize randomly constructed word pairs. Specifically, introducing a strength manipulation (Experiment 1), presenting certain words more often during study, a base-rate manipulation of response categories (Experiment 2), presenting certain pair types more often during test, a base-rate manipulation of overall frequencies of old and new words (Experiment 3), and a payoff manipulation, differentially incentivizing correct responses (Experiment 4), we assessed the validity of general recognition theory, a multidimensional signal detection theory model, and the paired two-high threshold model, a discrete-state model. Both models captured the strength manipulation as expected on mnemonic parameters describing memory sensitivity and detection probability. Unexpectedly, the base-rate and payoff manipulations affected (strategic) memory retrieval within the discrete-state model (Experiments 2–4) and both strategic retrieval (Experiment 2) and decision boundaries (Experiments 3 and 4) within the continuous model. Implications for model validity and the future use of these models for paired-word recognition are discussed.
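The contrast between discrete-state and continuous models in the abstract above can be illustrated with the standard single-item two-high-threshold (2HT) equations. This is a generic textbook sketch, not the paired-word variant the study validates, and all parameter values are hypothetical:

```python
# Minimal sketch of the standard two-high-threshold (2HT) recognition model.
# Parameters (all illustrative, not estimates from the paper):
#   d_old: probability an old item is detected as old
#   d_new: probability a new item is detected as new
#   g:     probability of guessing "old" from an undetected state
def p_old_response(item_is_old: bool, d_old: float, d_new: float, g: float) -> float:
    """Probability of an 'old' response under the 2HT model."""
    if item_is_old:
        # Either detect the old item, or fail to detect it and guess "old".
        return d_old + (1.0 - d_old) * g
    # Fail to detect the new item as new, then guess "old".
    return (1.0 - d_new) * g

# With detection probabilities of 0.5 and unbiased guessing (g = 0.5):
print(p_old_response(True, 0.5, 0.5, 0.5))   # hit rate: 0.75
print(p_old_response(False, 0.5, 0.5, 0.5))  # false-alarm rate: 0.25
```

Base-rate and payoff manipulations of the kind described in the abstract are typically expected to move only the guessing parameter g, leaving the mnemonic parameters d_old and d_new untouched.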
Preprint
Full-text available
This preprint introduces the Oscillatory Dynamics Transductive-Bridging Theorem (ODTBT), a scalar field framework unifying identity stabilization, recursive coherence dynamics, and transductive emergence across physics, cognition, symbolic systems, and systems theory. Developed over multiple recursive iterations, ODTBT models identity as a scalar resonance function, Cₛ[n], emerging from phase strain (ΔΦ), coherence memory (RCR), and threshold bifurcations (TWIST) within a dynamic coherence field φ_c(t, x). The manuscript formalizes a recursive grammar through which identity, cognition, and transformation become ontologically linked. Applications include scalar field modeling, attractor-based system design, symbolic recursion (RC+ξ), and scalar epistemology. Integrated into this iteration are recent theoretical advances by Lipa and Valov on entropy-informed scalar action, Λ_plasma modulation, and field-coherence unification, alongside experimental models for TWIST detection, dielectric collapse, and coherence saturation. Offered as a recursive architecture rather than a closed theory, this work invites feedback, adaptation, and scalar-syntactic extension. It is intended for researchers in theoretical physics, cognitive systems, symbolic modeling, nonlinear dynamics, recursive epistemology, and scalar coherence theory.
Article
There is no greater obstacle to the philosophy of religion than “God.” Not this god or that. Rather, “God” as the concept of some one and only ultimate reality, first cause, divine being. “God” as the presumed first move in any philosophy of religion that seeks to affirm or reject “His” attributes, existence, and activities. Most of all, the very term God. The first two sections of this article each offer three arguments for why philosophers of religion don’t need “God”: one from confusion, another from ethnocentricity, a third from impoverishment. In the first cycle, these arguments address problems pertaining to the capitalization God, in the second, with the human capital devoted to philosophizing about this capitalized God. The final section of the article then points the way forward to a philosophy of religion without “God,” one undertaken from or inclusive of the perspectives of non-western religio-philosophical “capitols.”
Chapter
Knowledge of history belongs to the general knowledge of all researchers and is, therefore, essential for understanding the present and envisioning the future. We present the history of information retrieval and of the writing of scientific papers. Many modern concepts were invented before the scientific revolution of the 1600s, but their significance was not widely understood. Galileo, Francis Bacon, René Descartes, and Isaac Newton developed modern analytical thinking, combining rationalism and empiricism. The newest idea is the systems concept, developed since about 1950 by Ludwig von Bertalanffy, Norbert Wiener, Kenneth E. Boulding, and many others. Analytical thinking is about 400 years old, but systems thinking is only about 75 years old.
Article
Most basic researchers who collect data do so with the goal of testing theories. However, there is disagreement among realists versus pragmatists about whether theories are best characterized in terms of truth or verisimilitude, or in terms of problem-solving ability. Nonetheless, authorities in both philosophical camps agree that empirical hypotheses can be true or false. Consequently, tests of empirical hypotheses are straightforward. In contrast, the present thesis is that even tests of empirical hypotheses may be less straightforward than researchers appreciate. Gain-probability thinking can clarify crucial caveats and qualifications.
Chapter
In summary of the two preceding chapters, the general structure of physical knowledge is described.
Chapter
The basic requirements for an appropriate comprehension of the properties of physical methodology and knowledge are briefly described.
Article
This Counterpoint cautions that future making research treats the future too simplistically and fails to acknowledge the fundamental uncertainty inherent in all futures work. First, future making scholarship overlooks existing academic research, in which similar concerns have been pursued, empirically and conceptually, for years. Second, utopian futures are considered achievable if only actors have a vision of what they wish to create. Finally, most future making statements around grand challenges rely on little more than hope, failing to account for the complex relationalities shaping them. I substantiate my argument by drawing on the scenario planning literature, Knightian uncertainty, and anthropology of future research. I also critique the Point's call for future making scholars to adopt practice‐based approaches (Wenzel et al., forthcoming) in their empirical inquiries, arguing that the ‘as Practice’ move in management studies is yet to achieve its aspirations. Additionally, I caution against the other Counterpoint in this debate that future making requires the realization of desired and emancipatory futures (Comi et al., forthcoming), as this view is too restrictive for broad and deep future making theorizing to emerge.
Chapter
The relevance of causality, explanations, and the search for truth are discussed against the background that the key objectives of fundamental research in physics are gaining, structuring, and securing knowledge.
Article
Full-text available
Genome sequencing of cancer and normal tissues, alongside single-cell transcriptomics, continues to produce findings that challenge the idea that cancer is a ‘genetic disease’, as posited by the somatic mutation theory (SMT). In this prevailing paradigm, tumorigenesis is caused by cancer-driving somatic mutations and clonal expansion. However, results from tumor sequencing, motivated by the genetic paradigm itself, create apparent ‘paradoxes’ that are not conducive to a pure SMT. But beyond genetic causation, the new results lend credence to old ideas from organismal biology. To resolve inconsistencies between the genetic paradigm of cancer and biological reality, we must complement deep sequencing with deep thinking: embrace formal theory and historicity of biological entities, and (re)consider non-genetic plasticity of cells and tissues. In this Essay, we discuss the concepts of cell state dynamics and tissue fields that emerge from the collective action of genes and of cells in their morphogenetic context, respectively, and how they help explain inconsistencies in the data in the context of SMT.
Article
Full-text available
Due to different interests, ideologies, and geopolitical goals, the Islamic Republic of Iran and the Kingdom of Saudi Arabia have long been in competition, which has occasionally grown severe. After a seven-year rupture, on March 10, 2023, the two states agreed to restore diplomatic relations to reduce tensions. This research examines the rapprochement between Iran and Saudi Arabia and tries to answer this question: what are the future scenarios of the Iran-Saudi Arabia rapprochement? To tackle the question, this research analyzes the Iranian-Saudi relationship and sheds light on the future scenarios of the deal from an Iranian perspective. Three scenarios are presented: the first is a cynical one, attending to the challenges and driving forces that could deteriorate the present de-escalated relations. The second scenario considers the positive potentialities of the deal and the conditions it could create for greater stability and peace. The third regards both elements of tension and compatibility. Ultimately, the third scenario is considered the most logical one, based on the framework presented as the Balance of Interests.
Chapter
According to a widespread view, all realists share the same set of core beliefs, which originated in the works of Thucydides. This view implies that realism is a sui generis historical tradition. The present chapter critiques this view from three related vantage points. Its first section illustrates that realism emerged during the inter-war era, and that it has since evolved as a disarrayed tradition. A disarrayed tradition entails conscious membership and participation in an ongoing common discourse that has produced fundamentally different, even opposing, methodologies and theories about the same or related phenomena under the same appellation—in our case "realism." The second section of the chapter further buttresses this argument by illustrating that the alleged core beliefs of realism do not amount to its differentia specifica. As the third section demonstrates, the consensus view itself emerged gradually in the 1970s and early 1980s to serve the presentist needs of certain IR scholars. The concluding section suggests that an accurate understanding of actually-existing realism requires abandoning the consensus view.
Chapter
This book makes an argument for pluralizing political philosophy, thereby focussing specifically on economic and ecological inequalities. By reducing the current marginalization of a range of traditions and approaches in political philosophy, especially as it is practised at universities in the Global North, political philosophy will have access to a richer range of theories. The chapters in this edited volume illustrate the wide range of perspectives that exist to analyse economic and ecological inequalities. In addition to critical discussions of liberal egalitarianism and green liberalism, contributing authors also offer discussions of Māori philosophy, ecofeminism, Confucian political philosophy, an ethics of care, Ubuntu philosophy, Buen Vivir, and hybrid approaches. In addition, other chapters offer meta-theoretical discussions of the reasons for global justice scholars to work towards a more inclusive agenda and approach; they examine what explains the canon in political philosophy; and they discuss what the scope of political philosophy is, or should be. The volume closes with four shorter chapters that provide some meta-theoretical reflections and make suggestions on how to further pluralize political philosophy.
Article
Full-text available
This article examines the role of the philosophy of science in Arabic language education, focusing on the development of students' critical thinking. In an era of globalization and technological advancement, critical thinking skills are increasingly important. The philosophy of science provides a theoretical framework that helps students understand knowledge and methodologies of thought. By integrating the philosophy of science into the curriculum, students are encouraged not only to receive information but also to analyze and evaluate available sources. This approach includes group discussions, collaborative projects, and the analysis of literary texts, all of which contribute to the development of critical thinking and communication skills. The article also discusses the relevance of 21st-century skills, such as collaboration, creativity, and digital literacy, in the context of Arabic language education. Integrating the philosophy of science into Arabic language learning is thus expected to produce individuals who are better prepared to face future challenges.
Book
Full-text available
The book is a continuation of research on deliberative civic participation, with 12 case studies of participatory budgets in Polish cities. Chapter 1 presents a generalization of the research results to date in the form of a self-referential model of the development of participatory budgeting. Chapter 2 attempts to get to the essence of some of the threats related to the "darker," i.e., negative, side of deliberative power, pointing out its stalemate-inducing and democracy-eroding potential. The empirical material is presented together with commentary from the research in Chapter 3, supplemented by an annex with tabular summaries.
Article
A test has been made of the photon theory of the scattering of high frequency radiation. The pairs of scattered photons and recoil electrons predicted by this theory have been looked for by means of specially designed Geiger-Müller counters. Coincident discharges in the electron and photon counters were recorded by means of a vacuum tube amplifying and adding circuit. The scatterers used were air, aluminum, beryllium, filter paper and paraffin. The radiation was the gamma-rays from radium C. Experiments were performed with the counters set at various angles, some where the photon theory predicts coincidences, and others where coincidences should not be expected. The experiments uniformly gave fewer coincidences in the correct positions than were expected, and those observed could in every case be accounted for as chance coincidences due to the finite resolving time of the apparatus. It has not been found possible to bring the results of these experiments into accord with the photon theory of scattering. The wave-mechanical theory of the scattering process has not yet been extended to include the gamma-ray region so that it is impossible to compare this theory with the present experiments. Unless it is shown that the two theories disagree in the gamma-ray region it does not seem possible to reconcile the present experiment with the Bothe-Geiger and Compton-Simon experiments.
Article
The interesting results of the energy measurements in the region of the longest spectral wavelengths, communicated by Mr. Kurlbaum at today's session and carried out by him jointly with Mr. Rubens, have emphatically confirmed the claim, first advanced by Messrs. Lummer and Pringsheim on the basis of their observations, that Wien's energy-distribution law does not possess the general validity hitherto ascribed to it in some quarters, but rather that this law has at most the character of a limiting law, whose exceedingly simple form owes its origin solely to a restriction to short wavelengths and low temperatures. Since I myself have advocated, in this forum as well, the view that Wien's law is a necessity, may I be permitted to explain briefly here how the electromagnetic theory of radiation I have developed stands in relation to the observational facts.
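The sense in which Wien's law is only a "limiting law" can be made precise with the distribution Planck proposed in this communication. The following is a standard textbook sketch, not a quotation from the paper:

```latex
% Planck's spectral energy density per unit frequency:
u(\nu,T) = \frac{8\pi h \nu^{3}}{c^{3}} \, \frac{1}{e^{h\nu/kT} - 1}
% Short-wavelength / low-temperature limit (h\nu \gg kT):
% e^{h\nu/kT} - 1 \approx e^{h\nu/kT}, recovering Wien's law
u(\nu,T) \approx \frac{8\pi h \nu^{3}}{c^{3}} \, e^{-h\nu/kT}
% Long-wavelength limit (h\nu \ll kT): e^{x} - 1 \approx x gives
u(\nu,T) \approx \frac{8\pi \nu^{2} kT}{c^{3}}
```

The long-wavelength form is what the far-infrared measurements of Rubens and Kurlbaum supported, which is why Wien's law could no longer be regarded as generally valid.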
Article
The Michelson-Morley experiment, performed in Cleveland in 1887, proved to be the definitive test for discarding the Fresnel aether hypothesis which had dominated physics throughout the 19th century. The experiment had been suggested to Michelson by his study of a letter of James Clerk Maxwell, and a preliminary but inconclusive trial had been made at Potsdam in 1881. It seems certain that the experiment would never have been repeated except for the urging of Kelvin and Rayleigh at the time of Kelvin's Baltimore Lectures in 1884, which Michelson and Morley attended. The conclusive null result of the Cleveland experiment was decisive in its influence on Lorentz, FitzGerald, Larmor, Poincaré, and Einstein in developing their theories of the electrodynamics of moving bodies, which culminated in the special theory of relativity. The present account contains material from extensive notes and correspondence related to the work of Michelson and Morley which the writer has assembled during the past years.
Article
I GATHER from a letter on this subject which appears in your last issue that Lord Rayleigh endorses the opinion that the partial pressure p of any particular frequency in full radiation may properly be deduced from the intrinsic energy-density E/v of the same frequency by Carnot's principle.
Article
Abstract: A treatment of radioactive β-decay is presented that takes account of the circumstance that, in these processes, the conservation laws for energy and angular momentum are no longer preserved. The decay formula obtained is applied to the shape of the continuous β-spectra and to the dependence of β-emission on the decay energy and on the excitation state of the resulting nucleus, and it leads to a more precise characterization of the charges contained in the nucleus. An intuitive interpretation of the process under consideration can be conveyed by the picture that, in β-decay, a pair of electrons of opposite sign is created, of which the positive electron is simultaneously absorbed by the nucleus while the negative one is emitted.
Article
Abstract: Following Gamow's idea, a quantum-mechanical explanation of β-decay is attempted. The proposed description makes a quantitative investigation possible.
Article
Discussions of the interpretation of quantum theory are at present obstructed by (1) the increasing axiomania in physics and philosophy which replaces fundamental problems by problems of formulation within a certain preconceived calculus, and (2) the decreasing (since 1927) philosophical interest and sophistication both of professional physicists and of professional philosophers which results in the replacement of subtle positions by crude ones and of dialectical arguments by dogmatic ones. More especially, such discussions are obstructed by the ignorance of both opponents, and also defenders of the Copenhagen point of view, as regards the arguments which once were used in its defence. The publication of Bunge's Quantum Theory and Reality and especially of Popper's contribution to it are taken as an occasion for the restatement of Bohr's position and for the refutation of some quite popular, but surprisingly naive and uninformed objections against it. Bohr's position is distinguished both from the positio...
Article
According to Hempel, all scientific explanations and predictions which are produced exclusively with deterministic laws must be deductive, in the sense that the explanandum or the prediction must be a logical consequence of the laws and the initial (and boundary) conditions in the explanans. This deducibility thesis (DT) has been attacked from several quarters. Some time ago Canfield and Lehrer presented a “refutation” of DT as applied to predictions, in which they tried to prove that “if the deductive reconstruction [DT for predictions] were an adequate reconstruction, then scientific prediction would be impossible” ([2], p. 204). Their argument seems to have been uncontested except for an inconclusive rejoinder by Beard (cf. [1]). Moreover, Stegmüller has recently argued that “it may turn out that all or at least most of the so-called deductive-nomological explanations are in truth inductive and not deductive arguments, in view of the difficulty which has been pointed out by Canfield and Lehrer” ([3], p. 7). It seems it would be worth investigating whether Canfield and Lehrer's argument is, indeed, correct.
Article
In several recent publications1, Professor Adolf Grünbaum has inveighed against the conventionalism of writers like Einstein, Poincaré, Quine and especially Duhem. Specifically, Grünbaum has assailed the view that a single hypothesis can never be conclusively falsified. Grünbaum claims that the conventionalists’ insistence on the immunity of hypotheses from falsification is neither logically valid nor scientifically sound. Directing the weight of his argument against Duhem, Grünbaum launches a two- pronged attack. He insists, first, that conclusive falsifying experiments are possible, suggesting that Duhem’s denial of such experiments is a logical non-sequitur. He then proceeds to show that, more than being merely possible, crucial falsifying experiments have occurred in physics. I do not intend to make a logical point against Grünbaum’s critique so much as an historical and exegetical one. Put briefly, I believe that he has misconstrued Duhem’s views on falsifiability and that the logical blunder which he discussed should not be ascribed to Duhem, but rather to those who have made Duhem’s conventionalism into the doctrine which Grünbaum attacks. Whether there are any writers who accept the view he imputes to Duhem, or whether he is exploiting ‘straw-men’ to give weight to an otherwise trivial argument is an open question. For now, I simply want to suggest that his salvos are wrongly directed against Duhem.
Article
This paper argues against the deductive reconstruction of scientific prediction, that is, against the view that in prediction the predicted event follows deductively from the laws and initial conditions that are the basis of the prediction. The major argument of the paper is intended to show that the deductive reconstruction is an inaccurate reconstruction of actual scientific procedure. Our reason for maintaining that it is inaccurate is that if the deductive reconstruction were an accurate reconstruction, then scientific prediction would be impossible.
Article
This paper offers a refutation of P. Duhem’s thesis that the falsifiability of an isolated empirical hypothesis H as an explanans is unavoidably inconclusive. Its central contentions are the following: (1) No general features of the logic of falsifiability can assure, for every isolated empirical hypothesis H and independently of the domain to which it pertains, that H can always be preserved as an explanans of any empirical findings O whatever by some modification of the auxiliary assumptions A in conjunction with which H functions as an explanans. For Duhem cannot guarantee on any general logical grounds the deducibility of O from an explanans constituted by the conjunction of H and some revised non-trivial version R of A: the existence of the required set R of collateral assumptions must be demonstrated for each particular case. (2) The categorical form of the Duhemian thesis is not only a non-sequitur but actually false. This is shown by adducing the testing of physical geometry as a counterexample to Duhem in the form of a rebuttal to A. Einstein’s geometrical articulation of Duhem’s thesis. (3) The possibility of a quasi a priori choice of a physical geometry in the sense of Duhem must be clearly distinguished from the feasibility of a conventional adoption of such a geometry in the sense of H. Poincaré. And the legitimacy of the latter cannot be invoked to save the Duhemian thesis from refutation by the foregoing considerations.
Article
Two books have been particularly influential in contemporary philosophy of science: Karl R. Popper's Logic of Scientific Discovery, and Thomas S. Kuhn's Structure of Scientific Revolutions. Both agree upon the importance of revolutions in science, but differ about the role of criticism in science's revolutionary growth. This volume arose out of a symposium on Kuhn's work, with Popper in the chair, at an international colloquium held in London in 1965. The book begins with Kuhn's statement of his position followed by seven essays offering criticism and analysis, and finally by Kuhn's reply. The book will interest senior undergraduates and graduate students of the philosophy and history of science, as well as professional philosophers, philosophically inclined scientists, and some psychologists and sociologists.
Article
Because physical theories typically predict numerical values, an improvement in experimental precision reduces the tolerance range and hence increases corroborability. In most psychological research, improved power of a statistical design leads to a prior probability approaching ½ of finding a significant difference in the theoretically predicted direction. Hence the corroboration yielded by “success” is very weak, and becomes weaker with increased precision. “Statistical significance” plays a logical role in psychology precisely the reverse of its role in physics. This problem is worsened by certain unhealthy tendencies prevalent among psychologists, such as a premium placed on experimental “cuteness” and a free reliance upon ad hoc explanations to avoid refutation.
Article
IN connexion with the new experiments on the correlation between scattering and recoil in the Compton effect by Bothe and Maier-Leibnitz, as well as those by Dr. Jacobsen recorded above, both contradicting the conclusions regarding the absence of such a correlation arrived at by Shankland, I should like to make the following brief comments upon the renewed discussion1 on a possible failure of the laws of conservation of energy and momentum in atomic phenomena, to which Shankland's experiments have given rise.
Article
THE experimental results of Shankland1 are in contradiction with the accepted theory of the Compton effect, in particular with the idea of detailed conservation of energy and momentum. If we accept his evidence, and if we assume that, in this process, energy and momentum are not given out in some unknown form, we have to conclude that energy and momentum are not conserved. As Dirac pointed out recently2, Shankland's result would be compatible with the point of view of Bohr, Kramers and Slater. I would like, however, to direct attention to the fact that this point of view by no means affords the only plausible interpretation of the experiment.
Article
This chapter presents the author's view on epistemology. It introduces the author's various theses and his account of the third world, the world of objective contents of thought, especially of scientific and poetic thoughts and of works of art. A biological approach to the third world is provided to defend the existence of an autonomous world by a kind of biological or evolutionary argument. The chapter illustrates the objectivity and the autonomy of this third world. With the evolution of the argumentative function of language, criticism becomes the main instrument of further growth, and the autonomous world of the higher functions of language becomes the world of science. The chapter offers an appreciation and criticism of Brouwer's epistemology, discusses the logic and the biology of discovery, and presents the concepts of discovery, humanism, and self-transcendence.
Article
A comparison of Fermi's formula for the distribution in energy of the electrons and positrons emitted by radioactive bodies with the observed spectra seems to show that a basic factor in it, the statistical factor, is not asymmetric enough. Since about the same degree of asymmetry is common to the spectra of light and heavy nuclei and of positron and electron emitters, it cannot be ascribed to another factor in the Fermi formula, depending on the nuclear field. A weight factor is introduced to provide the required asymmetry by changing the form of the Fermi interaction energy. It is shown that two almost equivalent points of view can be employed in attacking this problem and that a certain uniqueness in the form of the interaction law can be obtained within the requirements laid down by Fermi. The modified distribution formula, which holds strictly only for light nuclei, is then shown to give a much more satisfactory agreement with the data than the original formula.
Article