# Foundations of Science

Online ISSN: 1572-8471
Recent publications
Article
In this paper, we defend two claims. First, we argue that a notion of contextuality that has been formalized in physics and psychology is applicable to linguistic contexts. Second, we propose that this formal apparatus is philosophically significant for the epistemology of language because it imposes homogeneous rational constraints on speakers. We propose a Contextuality Principle that explains and articulates these two claims. This principle states that speakers update contextual information by significantly reducing the space of probabilities and variables in a non-commutative way. Some contexts affect other contexts not merely in terms of the information they contain, but also on the basis of their sequential order. In particular, we argue that the Contextuality by Default (CBD) theory provides a formalism that helps explain the role of contextuality in rational linguistic exchanges.

Article
The local hidden-variable model of singlet-state correlations discussed in Czachor (Acta Phys Polon A 139:70, 2021a) is shown to be a particular case of an infinite hierarchy of local hidden-variable models based on an infinite hierarchy of calculi. The violation of Bell-type inequalities can be interpreted as a ‘confusion of languages’ problem: a result of mixing different but neighboring levels of the hierarchy. Mixing of non-neighboring levels results in violations beyond the Tsirelson bounds.

Article
The paper addresses the problem of imaginative resistance in science, that is, why and under what circumstances imagination sometimes resists certain scenarios. In the first part, the paper presents and discusses two accounts of the problem that are relevant to the main thesis of this study. The first position is that of Gendler (Journal of Philosophy 97:55–81, 2000; in: Nichols (ed) The Architecture of the Imagination: New Essays on Pretence, Possibility and Fiction, Oxford University Press, New York, 2006a; Gendler & Liao, in: Gibson, Carroll (eds) The Routledge Companion to Philosophy of Literature, Routledge, New York, 2016), according to which imaginative resistance mainly concerns evaluative scenarios presenting deviant moral attitudes. The second account examined is that of Kim et al. (in: Cova, Réhault (eds) Advances in Experimental Philosophy of Aesthetics, Bloomsbury, London, 2018), who insist on the link between imaginative resistance on the one hand and counterfactual and counterdescriptive scenarios on the other. In the light of both theories, this paper discusses the importance of addressing the problem of imaginative resistance in the scientific enterprise through some mechanisms of embodied simulation, based on the activity of mirror neurons and investigated within the framework of the Embodied Simulation Theory.

Article
Quantum mechanics (QM) predicts probabilities at the fundamental level which are, via the Born probability law, connected to the formal randomness of infinite sequences of QM outcomes. Recently it has been shown that QM is algorithmically 1-random in the sense of Martin–Löf. We extend this result and demonstrate that QM is algorithmically ω-random and generic, precisely as described by the ‘miniaturisation’ of the Solovay forcing to arithmetic. This is extended further to the result that QM becomes Zermelo–Fraenkel Solovay random on infinite-dimensional Hilbert spaces. Moreover, it is more likely that there exists a standard transitive ZFC model M in which QM is expressed in reality than that it is so expressed in the universe V of sets. Every generic quantum measurement then adds to M an infinite sequence, i.e. a random real r ∈ 2^ω, and the model undergoes random forcing extensions M[r]. The entire process of forcing becomes a structural ingredient of QM and parallels similar constructions applied to spacetime in the quantum limit, thereby showing the structural resemblance of both in this limit. We discuss several questions regarding measurability and possible practical applications of the extended Solovay randomness of QM. The method applied is formalization based on models of ZFC; this technique is, however, particularly well suited to recognising randomness questions in QM. When one works in a fixed model of ZFC, or in axiomatic ZFC itself, the issues considered here remain largely hidden.

Article
The aim of the paper is to develop the concept of perceptual relation and to apply it to digital environments. First, the meaning of perceptual relation is phenomenologically analyzed and defined as the interaction between the whole and its parts, as theorized by the founders of Gestalt psychology. This relation, however, is considered not as intrinsic but as extended, implying also the relation with the surrounding world (Umwelt). Subsequently, this concept of extended relation is applied to a chosen object (a ball) as it is perceived in four different kinds of digital dimensions (on-screen, virtual, augmented, and hybrid). Through a phenomenological analysis, I argue that, whereas the whole-part configuration remains the same, some modes of appearance of the object (multisensoriality, figure-ground interaction, affordances, and persistence) differ. To define this dynamic, I have coined the concept of transdimensional analogy.

Article
Has the theory of rationality as ‘openness to criticism’ solved the problem of ‘rational belief in reason’? This is the main question the present article addresses. I respond by arguing that the justified true belief account of knowledge has prevented Karl Popper’s critical and William Bartley’s pan-critical rationalism from solving the problem of rational belief in reason. To elaborate this response, the article presents its arguments in three stages. First, it argues that the idea of objective knowledge as justified true belief leads to the equation of objective knowledge with justification. Hence, if we base the theory of critical rationalism, as openness to criticism, upon such a conception of knowledge, our theory of rationality involves an infinite regress of proofs. Second, it argues that Popper describes critical rationalism as an ‘irrational attitude’ of openness to criticism because the rationalist cannot justify his belief in reason by argument or experience. Thus, since Popper assumes that ‘a belief in reason’ must be justified in order to be ‘rational’, he cannot arrive at a solution to the problem of rational belief in reason. Third, it argues that, like Popper’s critical rationalism, Bartley’s pan-critical rationalism originates in the justified true belief account of knowledge; not, however, because Bartley defines critical rationalism as irrational faith in reason, but because his theory does not tell us how an ‘open’ belief in reason is to be refuted logically. The reason for this failure is that Bartley uses Popper’s epistemology of science to define the conception of criticism. While Popper and Bartley are recognized as non-justificationist philosophers of science and rationality, this article tries to reveal that the idea of objective knowledge as justified true belief has prevented them from showing how ‘a rational belief in reason’ can be defended by argument. The article also briefly shows the consequences of this study for a notable change in the logical foundations of science.

Article
The starting point of this article is the observation that the emergence of the Anthropocene rehabilitates the need for philosophical reflection on the ontology of technology. In particular, if technological innovations are created on an ontic level of beings in the world, but these innovations at the same time create the Anthropocene World at an ontological level, this raises the question of how World creation is to be understood. We first identify four problems with the traditional concept of creation: its anthropocentric, ontic and outcome orientation, as well as its orientation toward material fabrication. We subsequently develop a progressive concept of World creation with four characteristics that move beyond the traditional conceptuality: (1) a materialistic concept of creation that accounts for (2) the ontogenetic process and (3) the ontic and ontological nature of creation, and (4) is conceptualized as semantic creation of the World in which we live and act.

Article
We present an epistemological schema of the natural sciences inspired by Peirce’s pragmaticist view, stressing the role of the phenomenological map, which connects reality and our ideas about it. The schema has a recognisable mathematical/logical structure which allows us to explore some of its consequences. We show that seemingly independent principles such as the requirement of reproducibility of experiments and the Principle of Sufficient Reason are both implied by the schema, as is Popper’s concept of falsifiability. We show that the schema has some power in demarcating science, first by comparing it with an alternative schema advanced during the first part of the twentieth century, which has its roots in Hertz and was developed by Einstein and Popper. Further, the identified differences allow us to focus on the construction of Special Relativity, showing that it uses an intuited concept of velocity that does not satisfy Peirce’s requirements of reality. While the main mathematical observation connected with this issue has been known for more than a century, it has not been investigated from an epistemological point of view. A probable reason is that the socially dominant epistemology in physics does not encourage such a line of work. We briefly discuss the relation of the abduction process presented in this work to discussions of “abduction” in the literature and its relation to “analogy”.

Article
The author of this paper discusses the theme of the “simulated body”, that is, the sense of “being there” in a body that is not one’s own, or that does not exist in the way one perceives it. He addresses this issue by comparing Immersive Virtual Reality technology, the phenomenological approach, and Gerald Edelman’s theory of Neural Darwinism. Virtual Reality has been used to throw light on phenomena that cannot be studied experimentally in real life, and the results of its simulations enrich the phenomenological discourse on the lived body. Virtual “Reality” seems to replicate, at least in part, the simulation mechanisms of our mind, thus favoring developments in the field of philosophy of mind.

Article
Trish Glazebrook and Dana Belu both think I spend too much time criticizing the Cartesianism that both empirical and transcendental philosophies of technology quite obviously oppose. They argue that I would have been better off if I had instead considered how these two philosophies “converge on the thesis of crisis” in technoscientific life (Glazebrook) and/or “made wider use of Feenberg’s work” (Belu). While I am sympathetic to both Glazebrook’s thesis and Feenberg’s work, I argue that their recommendations raise precisely the “pre-philosophical” issue I discuss in my paper. The issue, addressed directly by both Nietzsche and the young Heidegger, is how such recommendations can be carried forward as productive, life-driven articulations of our current needs in technoscientific times and not just become two more in a long string of chosen (and thus very Cartesian) “turns” in a history of philosophical “positions.”

Article
In this paper we compare the strategies applied by two successful biological components of the ecosystem, viruses and human beings, to interact with the environment. Viruses have had, and still exert, deep and vast effects on the ecosystem, especially at the genome level of most of its biotic components. We discuss the importance of the human being as a maker of contraptions, in particular of robots: machines capable of automatically carrying out complex series of actions. Besides the relevance of designing and assembling these contraptions, the goal for which they are assembled, and future scenarios of their possible impact on the ecosystem, are of basic importance. We cannot postpone the development and implementation of a highly inspired and stringent “ethical code” for human beings and humanoid robots, because it will be a crucial aspect of the wellbeing of mankind and of the entire ecosystem.

Article
I will present a comparative analysis between Thomas Kuhn's The Copernican Revolution (CR), published in 1957, and The Structure of Scientific Revolutions (SSR), published in 1962, in order to identify divergences between the views contained in each work. I shall set forth a comparative analysis of the historiographical assumptions employed by Kuhn in each of his books. I will explore some proposals which have pointed out several discontinuities between the two books, and introduce some tools to widen this interpretative trend. I will argue that although Kuhn’s work of 1957 contains some concerns and problems which anticipate his later stances, these anticipations coexist with historiographical formulations and premises which are incompatible with the core of SSR. Therefore, I will assert that Kuhn adopts different historiographical frameworks in CR and in SSR. Finally, I will conclude that these differences express Kuhn's adoption of a more externalist view in the former and a more internalist frame in the latter.

Article
This essay engages with Bernard Stiegler’s discussion with Martin Heidegger in The Ordeal of Truth, published in Foundations of Science 2020 (this volume). It appreciates Stiegler’s progressive reading of Heidegger’s work but critically reflects on several elements of it. The first element is the methodological aspect of Heidegger’s being-historical thinking, which Stiegler misses and which confirms the indifference towards philosophical method found in the work of many contemporary philosophers. The second concerns Heidegger’s and Stiegler’s residual humanism and the necessity of moving beyond humanism and post-humanism in the era of global warming. The third concerns Stiegler’s idea of the obligation to make our being-in-default come true, which reveals a hidden metaphysical orientation in his work.

Article
Natural selection is generally considered a process exclusive to the domain of biotic systems. In this paper, a universal, five-phase set of dynamics is identified as a framework underpinning the natural selection that occurs in all processes, governing every interaction at every scale, from the quantum to the intergalactic. The theoretical model describes a two-tendency universe, where the tensions between syntropy and entropy provide the context for the functional synergies from which all matter and material systems emerge, supporting a theory of general, biotic and cognitive evolution.

Article
The postphenomenological framework of concepts—and especially the version utilized by the founder of this school of thought, Don Ihde—has proven useful for puncturing others’ totalizing or otherwise overgeneralizing claims about technology. However, does this specialization in deflating hype leave this perspective unable to identify the kinds of technological patterns necessary for contributing to activist interventions and political critique? Put differently, the postphenomenological perspective is committed to the study of concrete human-technology relations, and it eschews essentialist and fundamentalizing accounts of technology. Do these commitments render it incapable of providing general assessments of our contemporary technological situation? It is my contention that this perspective can indeed be useful for these kinds of critical projects. To realize this potential, we must go beyond Ihde’s personal tendency to utilize postphenomenology mostly for deflating others’ hype, and explore this perspective’s distinctive potential for identifying ways that technologies can become set within problematic patterns of usage and design. My suggestion is that the postphenomenological notion of “multistability” (i.e., the idea that technologies are always open to multiple uses and meanings) can play a helpful role in these efforts, especially when combined with a conception of local, rather than totalizing, stabilizations of human-technology relationships.

Article
This essay is a response to Robert Scharff’s “Before Empirical Turns and Transcendental Inquiry: Pre-philosophical Considerations”. Scharff digs beneath the empirico-transcendental debate between Ihde and Stiegler in order to critique this debate’s Cartesian presuppositions. He uses the work of Nietzsche and the early Heidegger to further his critique. There is much to like in Scharff’s rich and intricate analytic interpretation, but this is also the crux of my critique. The detour into Nietzsche’s and the early Heidegger’s work is ultimately unnecessary. If Scharff looks more closely, he will see that Feenberg’s revised theory of instrumentalization already accomplishes the return to technical experience and the critique of Cartesian technoscientific rationality that Scharff desires.

Article
This paper challenges the claim that Ihde’s and Stiegler’s approaches stand in radical opposition. It argues that ethos is prior to law, exposes a Heideggerian rift between technoscience and technics, and rejects the separation of theory from practice in favor of logics of poiêsis.

Article
I present a systematic interpretation of the foundational purpose of constructions in ancient Greek geometry. I argue that Greek geometers were committed to an operationalist foundational program, according to which all of mathematics—including its entire ontology and epistemology—is based entirely on concrete physical constructions. On this reading, key foundational aspects of Greek geometry are analogous to core tenets of 20th-century operationalist/positivist/constructivist/intuitionist philosophy of science and mathematics. Operationalism provides coherent answers to a range of traditional philosophical problems regarding classical mathematics, such as the epistemic warrant and generality of diagrammatic reasoning, superposition, and the relation between constructivism and proof by contradiction. Alleged logical flaws in Euclid (implicit diagrammatic reasoning, superposition) can be interpreted as sound operationalist reasoning. Operationalism also provides a compelling philosophical motivation for the otherwise inexplicable Greek obsession with cube duplication, angle trisection, and circle quadrature. Operationalism makes coherent sense of numerous specific choices made in this tradition, and suggests new interpretations of several solutions to these problems. In particular, I argue that: Archytas’s cube duplication was originally a single-motion machine; Diocles’s cissoid was originally traced by a linkage device; Greek conic section theory was thoroughly constructive, based on the conic compass; in a few cases, string-based constructions of conic sections were used instead; pointwise constructions of curves were rejected in foundational contexts by Greek mathematicians, with good reason. Operationalism enables us to view the classical geometrical tradition as a more unified and philosophically aware enterprise than has hitherto been recognised.

Article
This critical response to Dominic Smith’s ‘Taking Exception: Philosophy of Technology as a Multidimensional Problem Space’ begins by outlining the key contributions of his essay, namely his insightful approach to the transcendental, on the one hand, and his introduction of the topological problem space as an image for thought, on the other. The response then suggests ways of furthering this approach by addressing potential reservations about determinism. The response concludes by suggesting a way out of these questions of determinism by thinking the transcendental in concert with the agonistic.

Article
Heidegger has been blamed for being obsolete, irrelevant, ignorant and even dangerous in relation to contemporary philosophy of technology. Based mainly on two texts from Heidegger’s post-war production, “The Question Concerning Technology” (1953) and “Only a God Can Save Us” (1966/1976), this commentary on Don Ihde’s article tries to show how Heidegger actually makes sense for philosophy of technology. The sheer fact that many postmodern thinkers, among them Don Ihde, constantly ‘measure’ their lines of thought and use of concepts against Heidegger’s original thinking is proof of this relevance. I think that Heidegger, despite his despicable political and moral convictions, is needed when it comes to a critique and understanding of contemporary technological innovation and development in relation to what it means to be human. Most important is to stress that Heidegger was not a technodystopian, as a thorough reading of the two selected texts clearly shows. On the contrary, Heidegger points to the existential dialectical essence of technology, which means that both damnation and redemption can be mediated by technology.

Article
In ontology, quality determines beings. The quality-quantity bipolarity reveals that a conceptual logical comprehension that can include negation must be a dialectical logic. Quality is a precise characteristic of something (a predicate of a subject) capable of augmentation or diminution while remaining identical through differences or quantitative changes. Thus quality and, in opposition, quantity are inextricably linked, giving definition to each other and so constituting a logical bipolarity. The theory is that a magnitude G is never separated from secondary qualities α and β, and therefore a measure depends on a concrete quality Gα or Gβ, that is to say, on one pole of a logical bi-pole. However, the particular number, the unit, that expresses the result of a measure is the quality G alone. Examples drawn from physical and chemical experiments illustrate these ideas and elaborate the structure of the concept of opposition between the secondary qualities α and β of a magnitude G.

Article
I reply to two comments on my paper “Subjectivity and transcendental illusions in the Anthropocene,” by Johannes Schick and Melentie Pandilovski. Schick expands on the possibility that technical objects become “other” in a Levinasian sense, making use of Simondon’s three-layered structure of technical objects. His proposal is to free technical objects and install a different relationship between humankind and technology. I see two major difficulties in Schick's proposal, based on a number of features of current digital technology that make it difficult to enter the proposed ethical relationship with it. A first cluster of difficulties consists of the phenomena of blackboxing, the intimate interwovenness of inventing technologies and profit on all levels of the technical object, and the ownership of and control over technologies. A second cluster revolves around the impossibility of a symmetrical relationship with the hyperobject because of current technology’s hyperobject-like nature. Next, I discuss Pandilovski’s comments, where I point out that phenomenology is more encompassing than the study of having conscious experiences, and that phenomenology is essentially a method, rather than a collection of results.

Article
Albinism in Tanzania causes fierce health-related stigma. Little research has focused on the impact of strategies aiming to reduce albinism-related stigma. This research therefore assessed the impact of two short video interventions, a contact intervention (n = 95) and an education intervention (n = 97), on the attitudes of high school students in Tanzania towards people with albinism. A mixed-method design was used. Directly before and after the interventions, impact was measured among all participants through the Albinism Social Distance Scale for Adolescents (ASDS-A), knowledge items, and entertainment items. After these measurements, focus group discussions were conducted, 16 in total (n = 80). Both interventions produced a significant increase in levels of correct knowledge about albinism. The education intervention produced a significant positive change in attitude as measured by the ASDS-A, whereas the contact intervention did not have a significant effect. In terms of entertainment value, respondents were more enthusiastic about the contact intervention. The study suggests that education interventions on their own can be a successful tool for decreasing albinism-related stigma. Additionally, qualitative findings show many positive outcomes for the contact intervention. We would therefore recommend using a combination of the two interventions, which has also proved successful in the past. However, more research on the effect of combining the two strategies is recommended.

Article
A controlled-NOT variant of the standard quantum teleportation protocol affords a step-by-step analysis of what is, or can be said to be, achieved in the process at either location. Dominant interpretations of what quantum teleportation consists in and implies are reviewed in this light. Being mindful of the statistical significance of the terms and operations involved, as well as of classical analogies, can help sort out what is specifically quantum-mechanical, and what is not, in so-called teleportation. What the latter achieves appears to be the transmission, without the involvement of mysterious channels, of what a quantum state encapsulates: the multitiered probabilistic characterization of a given preparation, which might be all we can possibly know about a physical system in a given experimental situation.

Article
There is a wide range of realist but non-Platonist philosophies of mathematics—naturalist or Aristotelian realisms. Held by Aristotle and Mill, they played little part in twentieth-century philosophy of mathematics but have been revived recently. They assimilate mathematics to the rest of science. They hold that mathematics is the science of X, where X is some observable feature of the (physical or other non-abstract) world. Choices for X include quantity, structure, pattern, complexity, and relations. The article lays out and compares these options, including their accounts of what X is, the examples supporting each theory, and the reasons for identifying the science of X with (most or all of) mathematics. Some comparison of the options is undertaken, but the main aim is to display the spectrum of viable alternatives to Platonism and nominalism. It is explained how these views answer Frege’s widely accepted argument that arithmetic cannot be about real features of the physical world, and arguments that such mathematical objects as large infinities and perfect geometrical figures cannot be physically realized.

Article
In comparative policy analysis (CPA), a long-standing and generally acknowledged problem is that of identifying common variables. Coupled with this problem is the unanswered challenge of collaborative and interdisciplinary research, as well as the rare use of text-as-data in CPA, despite the potential demonstrated in other subfields. CPA is multi-disciplinary in nature, and this article explores and proposes a common variable candidate found in almost (if not) all policies, using the science of conceptual systems (SOCS) as a pathway to investigate the structure found in policy as a lynchpin of CPA. Furthermore, the article proposes a new, less expensive text-as-data approach, which could lead to a more accessible method for collaborative and interdisciplinary policy development. We find that the SOCS is uniquely positioned to serve, in an alliance fashion, in the larger qualitative comparative analysis that supports CPA. Because policies around the world are failing to reach their goals, this article is expected to open a new path of inquiry in CPA, which could be used to support interdisciplinary research for knowledge of and knowledge in policy analysis.

Article
Wellner’s article aims at changing an essential element within phenomenology by introducing the idea of digital imagination. Granting her thesis, I aim to raise two possible kinds of questions generated by the introduction of a technologically embedded imagination that is externalized.

Article
Helena De Preester’s “Subjectivity and Transcendental Illusions in the Anthropocene” aims to rethink fundamentally the human–technology relationship against the backdrop of the Anthropocene. Essentially, the essay is concerned with the current form of subjectivity that characterizes humankind in the Anthropocene, and analyzes how it embeds knowledge, desire and behavior. De Preester indeed succeeds in creating a potent and engaging reflection on the current form of human subjectivity characteristic of the age of the Anthropocene, by referring to Vilém Flusser’s apparatuses, McKenzie Wark’s vectoralist class, Stiegler’s savoir-vivre and object of addiction, Robert Pfaller’s notion of interpassivity, Williams’ attention economy, and so on. However, although De Preester thoroughly discusses these ideas of subjectivity and transcendental illusions in the Anthropocene (as much as the essay format allows), she falls short of her ambitious aspiration to rethink fundamentally the human–technology relationship against this backdrop. Although De Preester provides excellent examples, they do not seem cohesive enough to support this conclusion, primarily because the analysis of the Anthropocene’s triad, as pointed out in the introduction (Earth–Technology–Humankind), requires a thorough insight into the role that Technology plays in its constitution.

Article
In my comments, I address two issues that are important but not central to the paper under review here. First, I present a reading of the postphenomenological concept of multistability by going back to Merleau-Ponty’s notion of the primacy of perception. I conclude that assertions affirming the multistability of technologies should not be seen as merely empirical. Second, I address the adequacy of using the language of ‘empirical’ and ‘transcendental’ as a means to categorize exclusionary approaches in philosophy of technology.

Article
This commentary introduces the notion of “technical alterity” in order to address the following questions: is it possible for technical objects to become “others”, in analogy to Levinas’ ethics, and can this relation provide solutions for the subject in the Anthropocene? According to Levinas, the human subject’s only break from having to be itself is the consumption and enjoyment of things. Objects thus constitute an “other” that can be consumed, i.e., appropriated and made one’s own. But in times of the Anthropocene, where the entanglement of human and non-human actors becomes increasingly obvious and intricate, and where survival in the face of the climate crisis is at stake for human beings, it is necessary to develop a relation with non-human actors that does not reduce them to mere means to an end. This ethical relation with technical objects relies upon an epistemic act, since technical objects precisely do not have a “face” in the Levinasian sense. Technical objects as “technical others” therefore have, in light of Simondon’s philosophy of technology, to be invented.

Article
This response briefly argues that post-phenomenology has always cut across the transcendental–empirical divide and is able to cultivate a deep respect for technologies in their otherness, without denying their relation to humanity. It does this by revisiting Don Ihde’s genetic phenomenological variations and tracing their relation to Gilbert Simondon’s ontogenesis. Having established the historical nature of objects, the second part of the paper takes up Yoni Van Den Eede’s call for a more speculative approach.

Article
The purpose of this article is to establish a connection between modelling practices and interpretive approaches in quantum mechanics, taking as a starting point the literature on scientific representation. Different types of modalities (epistemic, practical, conceptual and natural) play different roles in scientific representation. I postulate that the way theoretical structures are interpreted in this respect affects the way models are constructed. In quantum mechanics, this would be the case in particular of initial conditions and observables. I examine two formulations of quantum mechanics, the standard wave-function formulation and the consistent histories formulation, and show that they correspond to opposite stances, which confirms my approach. Finally, I examine possible strategies for deciding between these stances.

Article
This paper uses the concept of causal stories to explore how death, sickness and misfortune lead to accusations of sorcery or witchcraft. Based on empirical research in Papua New Guinea, we propose a new analytical framework that shows how negative events may trigger particular narratives about the use of the supernatural by individuals and groups. These narratives then direct considerations about the cause of the misfortune, the agent who can heal it, and the appropriate response from those affected by the misfortune. We also categorise the factors that attract people towards magical causal narratives or towards competing non-magical causal narratives. We situate our analysis within a context of worldview pluralism, where individuals possess multiple worldviews, such as a magical worldview or a scientific worldview. We argue that causal stories operate to activate the dominance of one worldview or a combination of worldviews in given circumstances. Our theoretical contribution may be extended to discrimination and violence against those suffering from health-related stigma, as this too gives rise to competing causal stories that either take hold or are ignored depending upon diverse factors.

Article
In many parts of sub-Saharan Africa, mothers impacted by the genetic condition of albinism, whether as mothers of children with albinism or themselves with albinism, are disproportionately impacted by a constellation of health-related stigma, social determinants of health (SDH), and human rights violations. In a critical ethnographic study in Tanzania, we engaged with the voices of mothers impacted by albinism and key stakeholders to elucidate experiences of stigma. Their narratives revealed internalized subjective stigma, social stigma such as being ostracized by family and community, and structural stigma on account of lack of access to SDH. An analysis of health systems as SDH revealed stigmatizing attitudes and behaviours of healthcare providers, especially at the time of birth; a lack of access to timely quality health services, in particular skin and eye care; and a lack of health-related education about the cause and care of albinism. Gender inequality as another SDH featured prominently as an amplifier of stigma. The findings pose implications for research, policy, and practice. A concrete avenue to de-stigmatization of mothers impacted by albinism exists by the application of principles of human rights, particularly equality and non-discrimination; contextual analysis of cultural dynamics including relevant ontology; meaningful participation of rights-claimants, such as peer groups of mothers; and accountability of governments and their obligation to ensure access to health information as a key social determinant of the right to health.

Article
While stigmatisation is universal, stigma research in low- and middle-income countries (LMIC) is limited. LMIC stigma research predominantly concerns health-related stigma, primarily regarding HIV/AIDS or mental illness from an adult perspective. While there are commonalities in stigmatisation, there are also contextual differences. The aim of this study in DR Congo (DRC), as a formative part in the development of a common stigma reduction intervention, was to gain insight into the commonalities and differences of stigma drivers (triggers of stigmatisation), facilitators (factors positively or negatively influencing stigmatisation), and manifestations (practices and experiences of stigmatisation) with regard to three populations: unmarried mothers, children formerly associated with armed forces and groups (CAAFAG), and an indigenous population. Group exercises, in which participants reacted to statements and substantiated their reactions, were held with the ‘general population’ (15 exercises, n = 70) and ‘populations experiencing stigma’ (10 exercises, n = 48). Data were transcribed, translated, and coded in NVivo 12, and we conducted framework analysis. Two drivers were mentioned across the three populations: perceived danger was the most prominent, followed by perceived low value of the population experiencing stigma. There were five shared facilitators, with livelihood and personal benefit the most comparable across the populations. Connection to family or leaders received mixed reactions. If unmarried mothers and CAAFAG were perceived to have taken advice from the general population and changed their stereotyped behaviour, this also featured as a facilitator. Stigma manifested itself for the three populations at the family, community, leader and service levels, with participation restrictions, differential treatment, anticipated stigma and feelings of scapegoating. Stereotyping was common, with different stereotypes regarding the three populations. Although stigmatisation was persistent, positive interactions between the general population and populations experiencing stigma were reported as well. This study demonstrated the utility of a health-related stigma and discrimination framework and a participatory exercise for understanding non-health-related stigmatisation. Results are consistent with other studies regarding these populations in other contexts. This study identified commonalities between drivers, facilitators and manifestations, albeit with population-specific factors. Contextual information seems helpful in proposing strategy components for stigma reduction.

Article
When a boat disappears over the horizon, does a distant observer detect the last moment in which the boat is visible, or the first moment in which the boat is not visible? This apparently ludicrous way of reasoning, heritage of long-lasting medieval debates on decision limit problems, paves the way to sophisticated contemporary debates concerning the methodological core of mathematics, physics and biology. These ancient, logically framed conundrums throw us into the realm of bounded objects with fuzzy edges, where our mind fails to provide responses to plain questions such as: given a closed curve with a boundary (say, a cellular membrane), how do you recognize what is internal and what is external? We show that the choice of one alternative over another is not arbitrary but rather points towards entirely different ontological, philosophical and physical commitments. This opens the way to novel interpretations of, and operational approaches to, challenging issues such as black hole singularities, continuous time in quantum dynamics, chaotic nonlinear paths, logarithmic plots, and the demarcation of living beings. In the sceptical realm where judgements seem to be suspended forever, the contemporary scientist stands as a sort of God equipped with infinite power, utterly free to dictate the rules of the experimental settings.

Article
The aim of this contribution is to provide a rather general answer to Hume’s problem. To this end, induction is treated within a straightforward formal paradigm, i.e., several connected levels of abstraction. Within this setting, many concrete models are discussed. On the one hand, models from mathematics, statistics and information science demonstrate how induction might succeed. On the other hand, standard examples from philosophy highlight fundamental difficulties. Thus it transpires that the difference between unbounded and bounded inductive steps is crucial: while unbounded leaps of faith are never justified, there may well be reasonable bounded inductive steps. In this endeavour, the twin concepts of information and probability prove to be indispensable, pinning down the crucial arguments, and, at times, reducing them to calculations. Essentially, a precise study of boundedness settles Goodman’s challenge. Hume’s more profound claim of seemingly inevitable circularity is answered by obviously non-circular hierarchical structures.

Article
It is relatively easy to state that information retrieval (IR) is a scientific discipline, but rather difficult to explain why it is a science, because what counts as science is still under debate in the philosophy of science. The ability to explain why is crucial if we are to convince others that IR is a science. To explain why IR is a scientific discipline, we use a recently proposed theory and model of scientific study. The explanation involves mapping the knowledge structure of IR onto that of well-known scientific disciplines such as physics. In addition, it involves identifying the common aim, principles and assumptions in IR and in well-known scientific disciplines such as physics, showing that they constrain scientific investigation in IR in a similar way as in physics. There are therefore strong similarities between IR and scientific disciplines like physics, both in knowledge structure and in the constraints on scientific investigation. On the basis of these similarities, IR is considered a scientific discipline.

Article
In recent intervention campaigns sensitizing about harmful practices in eastern Africa, the beliefs and institutions of rural populations are marked out: culture is the culprit. This article concentrates on the most targeted region, the Sukuma-speaking communities of Tanzania, to verify the stigmatizing impact of institutions: whether bridewealth treats women as commodities, whether children with nsebu disorder are stigmatized, and why children living with albinism are stigmatized. Complementing the situational analysis of power relations, cultural analysis approaches institutions as established practices in a group and as generated from the palette of experiential frames constituting the cultural system prevailing in that group. The method’s sensitivity to intracultural diversity highlights the local capacity for applying cultural logics, ethically framing situations and creating new institutions, for instance female healers protecting their clients against stigmatization. The method permits the conclusion, for the cases studied, that institutions categorizing people prevent rather than cause the discreditable social status known as stigma.

Article
Alcohol consumption during pregnancy can lead to fetal alcohol spectrum disorders (FASD). FASD is a spectrum of structural, functional, and neurodevelopmental problems with often lifelong implications, affecting communities worldwide. It is a leading preventable cause of intellectual disability and therefore warrants effective prevention approaches. However, well-intended FASD prevention can increase stigmatization of individuals with FASD, women who consume or have consumed alcohol during pregnancy, and non-biological parents and guardians of individuals with FASD. This narrative review surveyed the literature on stigmatization related to FASD. Public stigma appears to be the most commonly studied form of stigma. Less is known about FASD-related self-stigma, stigma by association, and structural stigma. Accordingly, the current literature on FASD-related stigma does not appear to provide sufficient guidance for effectively reducing it. However, lessons can be learned from other related health topics and from the use of a systematic approach for the development of health promotion programs, namely Intervention Mapping.

Article
Different why-questions emerge in different contexts and require different information in order to be addressed. Hence a relevance relation can hardly be invariant across contexts. However, what is common to all possible contexts is that all explananda require scientific information in order to be explained. So no scientific information is in principle explanatorily irrelevant; it only becomes so in certain contexts. In view of this, scientific thought experiments can offer explanations, provided we analyze their representational strategies. Their representations involve empirical as well as hypothetical statements. I call this “representational mingling”, which bears scientific information that can explain events. Buchanan’s thought experiment from constitutional economics is examined to show how mingled representations explain.

Article
The purpose of this essay is to present and analyse the basic assumptions of Leszek Nowak’s conception of the unity of science. According to Nowak, the unity of science is manifested in the common application of the method of idealisation in scientific research. On his conception, regardless of the discipline they represent, researchers go through the same stages in building a theory. The two key stages are introducing idealising assumptions into the representation and then concretising them. In this view, idealisation is the basis of the scientific method, while other cognitive procedures complement it. Nowak’s conception has particular relevance to the dispute between naturalism and anti-naturalism and to the continuing rift between social scientists and natural scientists. It calls into question the anti-naturalist thesis of the ontological uniqueness of the social sciences and its methodological consequences. I argue that Nowak’s conception is a cognitively valuable contribution to the contemporary epistemology of science, but it also has weaknesses, mainly due to the limitations of applying the idealisation–concretisation scheme in research practice. For it turns out, as I point out in this essay, that many idealising assumptions are not subject to concretisation, and that concretisations do not always yield an increase in the explanatory and/or predictive power of the representations.

Article
According to van Fraassen, inference to the best explanation (IBE) is incompatible with Bayesianism. To argue to the contrary, many philosophers have suggested hybrid models of scientific reasoning with both explanationist and probabilistic elements. This paper offers another such model with two novel features. First, its Bayesian component is imprecise. Second, the domain of credence functions can be extended.

Article
The aim of this paper is to better qualify the problem of online trust, that is, the problem of evaluating whether online environments have the proper design to enable trust. The paper shows that there is no unique answer, only conditional considerations that depend on the conception of trust assumed and on the features included in the environments themselves. In fact, the major issue with traditional debates surrounding online trust is that they focus on specific definitions of trust and specific online environments. Ordinarily, a definition of trust is assumed and then the environmental conditions necessary for trust are evaluated with respect to that specific definition. However, this modus operandi fails to appreciate that trust is a rich concept with a multitude of meanings, and that there is still no strict consensus on which meaning should be taken as the proper one. Moreover, the fact that online environments are constantly evolving and that new design features might be implemented in them is completely ignored. In this paper, the richness of the philosophical discussions about trust is brought into the analysis of online trust. I first provide a set of conditions that depend on the definition of trust assumed, and then discuss those conditions with respect to the design of online environments in order to determine whether, and under which circumstances, they can enable trust.

Article
It is generally believed that two rival non-relativistic quantum theories, the realist interpretation of quantum mechanics and Bohmian mechanics, are empirically equivalent. In this paper, I use these two quantum theories to show that it is possible to offer a solution to underdetermination in some local cases, by specifying what counts as relevant empirical evidence in empirical equivalence and underdetermination. I argue for a domain-sensitive approach to underdetermination: domain sensitivity regarding theories’ predictions plays a role in determining whether two or more theories are empirically equivalent and underdetermined. To support my denial of the empirical equivalence between Bohmian mechanics and the realist interpretation of quantum mechanics, I argue that they are not empirically equivalent when we consider their predictions for domains outside their application, using the relativistic domain as an example.

Article
During the second half of the twentieth century, several philosophers of technology argued that their predecessors had reflected too abstractly and pessimistically on technology. In the view of these critics, one should study technologies empirically in order to fully understand them. They developed several strategies to empirically inform the philosophy of technology and called their new approach the empirical turn. However, they provide insufficient indications of what exactly is meant by empirical study in their work. This leads to the critical question of what counts as an empirically informed philosophy of technology in the empirical turn. In order to answer this question, we first elaborate on the problems that the empirical turn philosophers tried to address; secondly, we sketch their solutions, and, thirdly, we critically discuss their conceptions of empirical study. Our critical analysis of the empirical turn contributes to new efforts to engage in an empirically informed philosophy of technology.

Article
Authoritative appraisals have qualified Dirac’s book as an “axiomatic” theory. However, given that its essential content is no more than an analogy, its theoretical organization cannot be axiomatic. Indeed, in the first edition Dirac declares that he had avoided an axiomatic presentation. Moreover, I show that the text aims to solve a basic problem (how is quantum mechanics similar to classical mechanics?). A previous paper analyzed all past theories of physics, chemistry and mathematics presented by their authors non-axiomatically, and recognized four characteristic features of a new model of organizing a theory. A careful examination of Dirac’s text shows that it actually applied this kind of organization, confirming formally what Kronz and Lupher suggested through the intuitive categories of pragmatism and rigour: Dirac’s formulation of quantum mechanics represents a theoretical approach distinct from von Neumann’s axiomatic approach. However, from the second edition onwards Dirac changed his approach: although relying again on analogy, his theory appeals to the axiomatic method. Some considerations on the odd paths which led to the present formulation of QM are added. They suggest that a new foundation of this theory needs to be found.

Article
This paper presents and critically discusses the “logos approach to quantum mechanics” from the point of view of the current debates concerning the relation between metaphysics and science. Due to its alleged direct connection with quantum formalism, the logos approach presents itself as a better alternative for understanding quantum mechanics than other available views. However, we present metaphysical and methodological difficulties that seem to clearly point to a different conclusion: the logos approach stands on an equal epistemic footing with alternative realist approaches to quantum mechanics.
