Article

Representing and Intervening: Introductory Topics in the Philosophy of Natural Science.

... The globalists cited above all take a theory-centric approach. In contrast, for those who take an experiment-based approach, the central questions of the debate concern such issues as whether particular experimental results warrant belief in various theories, theoretical claims, and/or entities (Cartwright, 1983; Hacking, 1983; Eronen, 2019; Chen and Hricko, 2021). ...
... The second dimension of the debate concerns the opposition between experiment-based and theory-centric approaches. Hacking (1983) has famously developed an experiment-based approach which combines realism about entities with instrumentalism about theories. For Hacking, we can be realists about entities to the extent that we can manipulate those entities in experimental settings. ...
... To begin with, some criteria of reality may be susceptible to counterexamples. For example, Lyons (2006) argues that there are counterexamples to Psillos's (1999) theoretical criterion of predictive success, and Gelfert (2003) does the same for Hacking's (1983) experimental criterion of manipulation. Since we admit multiple criteria, there is a sense in which we are less susceptible to counterexamples than those who maintain that a single criterion is applicable to all cases. ...
Article
This paper develops an approach to the scientific realism debate that has three main features. First, our approach admits multiple criteria of reality, i.e., criteria that, if satisfied, warrant belief in the reality of hypothetical entities. Second, our approach is experiment-based in the sense that it focuses on criteria that are satisfied by experiments as opposed to theories. Third, our approach is local in the sense that it focuses on the reality of particular kinds of entities. We apply this approach to a case that many philosophers have debated, namely, Jean Perrin’s work on atoms and molecules. We provide a novel account by arguing that Perrin’s work warranted a minimal belief in the reality of atoms and molecules as unobservable, discrete particles by satisfying a criterion of reality that we call experimental determination of number per unit. By doing so, he confirmed Avogadro’s hypothesis, but he did not confirm other key constituents of the atomic theories involved. We argue that our account of Perrin’s work is preferable to several other accounts, and we use this as a reason in support of our approach to the realism debate more generally.
... It was a 'thing' that partially existed by virtue of the conceptions that had been made of how it should be performed and thereby yield desirable outcomes and changes. Yet, it was also still a 'thing' that had to be further constituted through situated contexts of creation (Hacking, 1983), as it only made up a somewhat diffuse part of the healthcare sector. Its ontology had fairly blurred contours, so to speak. ...
... But starting with the data is just the wrong place to start out. (Chief engineer, interview, Feb. 2020) In order to understand how to prevent unplanned admissions predictively in practice and make the experiment succeed (Hacking, 1983), the project manager and the managing director of the AI company made a guess that there was a need for means and methods centered more directly around users and working practices. "Where in the healthcare sector are the good AI use cases?", the project manager asked at a meeting, as if such cases merely had to be discovered. ...
... As the quote suggests, the developers had learned that they might be more successful in enacting predictive AI in the healthcare sector by drawing on designers, user-centric approaches, and ethnographic observations in addition to data, statistics, and engineering, and by exploring the working practices first rather than the digital data produced through such practices. As previously noted, Hacking (1983) contends that experiments are about learning how to use an apparatus or instrument in the right way, and knowing when the experiment succeeds. In our case, we might say that it was about learning how to perform an experiment in order for predictive AI, and thus automatized data-driven procedures, to succeed as a solution to specific healthcare challenges. ...
Article
Full-text available
Currently, a large number of AI projects are experimenting with the use of AI and big data for various purposes, especially in the public sector. In this article, we explore one such AI project. Specifically, we study a group of developers in Scandinavia and their efforts to enact predictive AI through the development of a clinical decision support system (CDSS) in pursuit of a future proactive healthcare sector. This yet-to-be system was envisioned to prevent unplanned hospitalizations by ‘turning’ what we term ‘potential patients’, i.e. through the effective management of patient trajectories. In the article, we investigate this particular project as an ‘experiment’ and conceptualize the developing CDSS as a ‘partially existing object’ with an uncertain ontological status. By studying the gradual enactment and emergence of the CDSS, we illuminate how this fuzzy data-driven object is performed and gradually attributed with solid reality: during its creation process, it advances from being a proactive device imagined to be used in primary healthcare to becoming a triage tool embedded in the prehospital emergency department. Along the way, the project developers are also transformed, learning what ‘moves’ and ‘actions’ to make and thereby becoming skillful CDSS operators. By using ‘experiment’ as our analytical lens, the article renders visible how persons, locations, and procedures have to be changed, revoked, and suspended in order for the AI project to succeed. Thus, the article contributes to showing how ‘social mangling’, along with developers’ learning and transformation, is an essential precondition for predictive AI to succeed as a prolific solution to specific healthcare challenges.
... Interestingly enough, the historical use of the term realism is linked to the opposition of both theses: 'before Kant, realism usually meant anti-nominalism. After Kant, it usually meant anti-idealism' [22] (p. 108). ...
... Nominalism can be linked to the negation of the existential dimension of realism, and idealism to the negation of the independence dimension [15], or the other way around: 'idealism is a thesis about existence. In its extreme form, it says that all that exists is mental, a production of the human spirit' [22] (p. 108, emphasis in original). ...
... It says that only our modes of thinking make us sort grass from straw, flesh from foliage. The world does not have to be sorted that way; it does not come wrapped up in "natural kinds"' [22] (p. 108, emphasis in original). ...
Article
Full-text available
Several authors emphasize the need for a change in classification theory due to the influence of a dogmatic and monistic ontology supported by an outdated essentialism. These claims tend to focus on the fallibility of knowledge, the need for a pluralistic view, and the theory-ladenness of observations. Whatever the legitimacy of these concerns, when pressed immoderately they risk falling into the opposite relativistic extreme. Based on a narrative review of the literature, we aim to discuss reflectively the theoretical foundations that can serve as a basis for a realist position supporting pluralistic ontological classifications. The goal is to show that, against rather conventional solutions, objective science-based approaches to natural classification are viable, allowing a proper distinction between ontological and taxonomic questions. Supported by critical scientific realism, we consider such an approach suitable for the development of ontological Knowledge Organization Systems (KOS). We believe that ontological perspectivism can provide the necessary adaptation to the different granularities of reality.
... [1][2][3][4] Problem block no. 2. Let us say a little about the so-called 'repository': the databases and informational 'storehouses' of centuries of human thought which today are commonly used without regard for criteria of reliability. [40,45] Databases and other bodies of information thus form a certain environment. In itself this environment is neutral; it has no qualitative characteristics along the lines of 'good or bad'. ...
... The same logical situation holds for the original sources: they may indeed be of high quality, yet work with sources, analytical activity, and the mechanisms of scientific research bear directly on the question of researchers' skills. [37][38][39][40][41] In fairness, it should be noted that the above-mentioned skills of scientific work are not transmitted 'with the mother's blood' or in any other automatic way. These skills must be acquired, which naturally requires the appropriate scientific schools, methodologies, and a professional scientific community interested in training the future scientific generation, the country's intellectual elite. ...
... If we take all existing sources of information and rank them conditionally by their degree of reliability and by the possibility of verifying the information against qualitative parameters, then in applied science we obtain exactly two categories: putative sources of information, which depend largely on the subject, and reliable sources of information. [15,31,40] PUTATIVE SOURCES OF INFORMATION: ...
Book
Full-text available
"Photography as a Source of Scientific Information" is a collective monograph produced by a group of authors, scholars and members of the Expeditionary Corps, and is the first thorough study of the development of source studies specifically as regards the role of photography as a comprehensive instrument and source in scientific research and in related professions. The camera is a scientific instrument that produces objective scientific information. Accordingly, when pursuing effective scientific research aimed at solving diverse scientific problems, keeping pace with progress and meeting the demands of the time, a scholar needs universal instruments that not only allow information to be 'extracted' from the bottomless quarries of the field of the unknown, but also allow the production of objective products of scientific activity. And photography today is an instrument that is not only reliable but also widely accessible. This monograph not only describes the genesis of the phenomenon of photography as a source of scientific information, but also offers practical and methodological recommendations for organizing effective scientific work by means of photography as an instrument of scientific proof, an object of study, and a basis for grounding scientific hypotheses.
... The hard core comprises the essential elements of a programme. It is what we might call the "lead idea" (Larvor, 2006) or "central principles" (Hacking, 1983). In Newton's theory of gravitation, the hard core contains his three laws of motion and his one law of gravitation. ...
... Heuristics are the implicit and explicit methodological rules and beliefs which constrain and guide the behavior of scientists working within the programme. The negative heuristic tells the scientist what not to do (e.g., "…not to tinker with the hard core…", Chalmers, 1999, p. 133), while the positive heuristic tells the scientist where to focus their attention, which provides what Hacking (1983) describes as a "…ranking of [scientific] problems" (p. 117) to work on. ...
... Rather, the MSRP, applied within the context of scholarly journals, is useful for overall appraisal, and this assessment can then be used to inform pursuit within scholarly communities. Put differently, for the individual it is primarily a descriptive and "backward-looking" (Hacking, 1983) framework, offering a pragmatic set of tools for future consideration rather than a purely prescriptive one. After all, there are valid reasons that an individual may provide for backing (i.e., publishing within, reviewing for, contributing to) a seemingly degenerating journal or for avoiding a progressive one (Feyerabend, 1970, 1980). ...
Article
Full-text available
Despite continued attention, finding adequate criteria for distinguishing “good” from “bad” scholarly journals remains an elusive goal. In this essay, I propose a solution informed by the work of Imre Lakatos and his methodology of scientific research programmes (MSRP). I begin by reviewing several notable attempts at appraising journal quality – focusing primarily on the impact factor and development of journal blacklists and whitelists. In doing so, I note their limitations and link their overarching goals to those found within the philosophy of science. I argue that Lakatos’s MSRP and specifically his classifications of “progressive” and “degenerative” research programmes can be analogized and repurposed for the evaluation of scholarly journals. I argue that this alternative framework resolves some of the limitations discussed above and offers a more considered evaluation of journal quality – one that helps account for the historical evolution of journal-level publication practices and attendant contributions to the growth (or stunting) of scholarly knowledge. By doing so, the seeming problem of journal demarcation is diminished. In the process I utilize two novel tools (the mistake index and scite index) to further illustrate and operationalize aspects of the MSRP.
... After being ignored for some time during the 20th century, EE has aroused significant attention during the past decades because of its contrast to the presumed theory guidance of experiments. Hacking (1983), and then Steinle (1997) and Burian (1997), who coined the term 'exploratory experimentation', discussed EE as a scientific procedure through historical examples in which experiments act to a large part autonomously from theory. Hacking (1983) used Herschel's exploration of radiant heat (pp. 177ff), Steinle the explorations of Ampère and Faraday on electromagnetic interaction, and Burian the contributions of Brachet to the understanding of RNA and DNA. Not surprisingly, by addressing different scientific fields and problems, some differing emphases exist. ...
... Conceptualization is thus based on results of measurements, which are ordered, classified and correlated. Conceptualization leads at best to phenomenological laws (see also Hacking 1983, p. 165) by finding empirically justified correlations between phenomena and their properties. Being pre-theoretical, conceptualization falls short of explaining the correlations, but by constraining the empirical range of correlations, it guides the development towards concepts and theories. ...
Article
Full-text available
Along three measurements at the Large Hadron Collider (LHC), a high energy particle accelerator, we analyze procedures and consequences of exploratory experimentation (EE). While all of these cases fulfill the requirements of EE: probing new parameter spaces, being void of a target theory and applying a broad range of experimental methods, we identify epistemic differences and suggest a classification of EE. We distinguish classes of EE according to their respective goals: exploration where an established global theory cannot provide the details of a local phenomenon, exploration of an astonishing discovery, and exploration to find a new entity. We find that these classes also differ with respect to the existence of an identifiable target and their impact on the background theory. These characteristics distinguish EE from other kinds of experimentation, even though the different kinds have not yet been systematically studied. The formal rigor and precision of LHC physics makes it possible to analyze concept formation in its early state. In particular, we emphasize the importance of nil-results for conceptualization and argue that conceptualization can also be achieved from nil-results alone.
... Ian Hacking is a central member of the Stanford School of philosophy of science (Suppes 1978; Cartwright 1983, 1999; Hacking 1983, 1992; Galison 1987; Dupré 1993), which is characterized by a commitment to scientific pluralism, disunity of science, and practice-based accounts of science (Ludwig and Ruphy 2021; Cat 2022). In their overarching commitment to disunity of science and rejection of universalist accounts, the Stanford School opposes both the logical empiricists' unity of science ideal and Kuhn's (monistic) attempt to present a global (or 'universal') account of scientific change (cf. ...
... SPSP and &HPS are contemporary societies that represent the pluralist/disunity of science tradition of HPS that emerged in the 1980s (Ludwig and Ruphy 2021). Hacking's particularism is inspired by J. L. Austin's ordinary language philosophy and Michel Foucault's 'history of the present' (Hacking 2002). Tsou (2015) distinguishes (1) 'ground level questions': specific questions about particular cases that have some bearing on the objectivity of science (e.g., 'Can we trust medical research when it is funded by pharmaceutical companies?') ...
... All experiments involve physical phenomena, that is, quantitative or qualitative properties of a system. It is standard in the philosophy of experimental science to distinguish between the 'source system' that is manipulated in the lab and the 'target system' about which the experimenter wishes to gain knowledge (Hacking, 1983; Galison, 1987; Franklin, 1989; Franklin and Perovic, 2019). With this in mind, we can think of the 'source physical phenomena' as the quantitative or qualitative properties of the system manipulated in the lab and the 'target physical phenomena' as the quantitative or qualitative properties of the system about which the experimenter is hoping to learn. ...
... For work on the epistemology of experiment see (Hacking, 1983; Galison, 1987; Franklin, 1989; Franklin and Perovic, 2019; Evans and Thébault, 2020a). ...
Preprint
In an analogue quantum simulation, an experimentally accessible quantum system is controlled and measured precisely in order to learn about the properties of another quantum system. As such, analogue quantum simulation is a novel tool of scientific inference standing between computational simulation and conventional experiment. In this book we undertake a comprehensive appraisal of the epistemology of analogue quantum simulation. In particular, we consider the types of understanding that analogue quantum simulation can yield. We draw a distinction between analogue quantum computations and analogue quantum emulations and argue that this distinction has important practical consequences for the types of validation an experimenter needs to perform in order to meet their epistemic goal. Our analysis is rooted in the contemporary scientific practice of analogue quantum simulation and draws upon detailed case studies of cold atoms, quantum photonics, and dispersive optical media platforms. Our goal is to provide a new framework for scientists and philosophers alike to understand the epistemic foundations of this exciting new area of scientific practice.
... Thus, it is possible to maintain a realist perspective about the targets of measurement-objects and their properties-while acknowledging that knowledge is constructed by humans and can be so constructed in multiple ways. To make the point in a slightly different way, borrowing terms from Nancy Cartwright (1983) and Ian Hacking (1983), one can subscribe to entity realism without necessarily subscribing to theory realism; that is, the entities that feature in scientific theories may be regarded as real, without requiring a judgment about the truth of the theories into which they figure. On the other hand, conceptual pluralism is not the same as relativism: responsible science requires awareness and acknowledgment of the roles that conceptual frameworks, methodological approaches, and statistical models play in shaping investigations and requires explication and empirical investigation of the hypothesized connections between the objects and processes under investigation and measurement results. ...
... Such a transduction effect may become the basis of a direct method of measurement (see Sect. 7.3): through the calibration of the transducer, the ... [Footnote 52: To be clear, we do not aim to provide a sufficient set of criteria for the justification of a claim about the existence of any given property, as this would surely involve issues specific to that property.] [Footnote 53: Our stance here is broadly consistent with Ian Hacking's (1983) perspective on entity realism, which entails that a claim about the existence of an entity is justified if it can be used to create effects that can be investigated and understood independently of their cause. As Hacking famously put it, in reference to experiments involving the spraying of electrons and positrons onto a superconducting metal sphere: "if you can spray them, then they are real" (p. ...
... The question of how management and organization theories come to matter to managers and organizations generates recurrent concerns that have been discussed under the umbrella of the "rigour-relevance debate" (Carton and Mouricou 2017; Kieser et al. 2015; Grossmann-Hensel and Seidl 2021) or the "academic-practitioner gap" or "divide" (Bartunek and Rynes 2014; Carton and Ungureanu 2018). One fruitful epistemological angle on this question is to approach the activity of organizational scholars not as "representing neutrally" an external reality, but rather as "intervening actively" within this reality (Hacking 1983; Pickering 1995). Research dedicated to the performativity of theory is predicated on this insight and analyses scientific statements as (co-)constitutive of empirical (social) reality. ...
... The field of research dedicated to the performativity of theories is nurtured by multiple intellectual influences such as actor-network theory (ANT) (Callon 1998, 2016), social studies of science (Hacking 1983), pragmatism (Muniesa 2014), Science Technology Society (STS) studies (Law 2008), or socio-materiality studies (Barad 2007). [Table 2, contrasting the poles of the performativity-as-a-mindset / performativity-as-a-social-mechanism continuum, is omitted here.] Performativity studies in management builds on four core common assumptions that capitalize on the long journey and successive translations of this notion in social sciences since Austin's (1962) publication. ...
Chapter
The purpose of this chapter is to explain how organization and management studies have built on and helped advance various streams of research dedicated to the performativity of theory – that is, how theory shapes the patterns of social interactions that constitute social reality. For that purpose, two ideal-type positions that form the poles of a continuum of scholarship about theory performativity are distinguished. These poles consist of approaching either performativity as a mindset – an onto-epistemic lens helpful to reconsider the nature or organizational phenomena and management concepts – or of analyzing performativity as a social mechanism involved in the production and transformation of social reality for actors. By relying on illustrations from recent research, the common core assumptions underlying both perspectives on performativity can be specified as well as these perspectives’ distinctive commitments to these assumptions, analytical foci, and contribution to organizational knowledge of performativity. Finally, the chapter discusses how the insights generated at each pole of this continuum complement each other and can advance organizational and management studies of theory performativity.
... While a continuously evolving landscape of problems and proposed solutions might seem to counter a notion of progress in science, scientific theories have been used to explain and control progressively more phenomena over the course of the scientific record (Laudan, 1978; Douglas, 2014). According to the pragmatic view, this progress results from community-maintained standards of explanation, under an overarching drive to better predict and control natural phenomena of potential relevance to society (Hacking, 1983; Douglas, 2014). ...
... Traditional views emphasize the use of experiments to test proposed theories (Popper, 1959), and even consider an interplay in which theories suggest new experiments and unexpected experimental results reveal the need for new theories (Laudan, 1978; Firestein, 2015). However, theories do not arise fully formed but are developed over time through an interaction with experimentation (Laudan, 1978; Hacking, 1983; Bechtel, 2013; Douglas, 2014; Firestein, 2015). We now consider two crucial pieces of that dialogue: the domain of a theory, or the phenomena it is intended to pertain to, and a translation function, which specifies how it should relate to phenomena in its domain. ...
Article
Full-text available
In recent years, the field of neuroscience has gone through rapid experimental advances and a significant increase in the use of quantitative and computational methods. This growth has created a need for clearer analyses of the theory and modeling approaches used in the field. This issue is particularly complex in neuroscience because the field studies phenomena that cross a wide range of scales and often require consideration at varying degrees of abstraction, from precise biophysical interactions to the computations they implement. We argue that a pragmatic perspective of science, in which descriptive, mechanistic, and normative models and theories each play a distinct role in defining and bridging levels of abstraction, will facilitate neuroscientific practice. This analysis leads to methodological suggestions, including selecting a level of abstraction appropriate for a given problem, identifying transfer functions to connect models and data, and using models themselves as a form of experiment.
... Anti-realism about theories says they should not be believed literally but are rather useful instruments for prediction. Anti-realism about entities says they are useful intellectual fictions (Hacking 1983). Here is not the place to argue for a particular stance in the realism debate. ...
... (Duhem [1906] 1981) Note that the theory-ladenness of experimental facts or observations should not be seen as a threat to scientific realism, as various authors have shown (see e.g. Popper [1934/1959] 2005; Shapere 1982; Hacking 1983). The theory-ladenness of experimental procedures is not a problem for theory testing, since the theoretical assumptions underlying experimental procedures are some of the best verified theories that science has produced. ...
Article
Full-text available
There are various conceptions of objectivity, a characteristic of the scientific enterprise, the most fundamental being objectivity as faithfulness to facts. A brute fact, which happens independently from us, becomes a scientific fact once we take cognisance of it through the means made available to us by science. Because of the complex, reciprocal relationship between scientific facts and scientific theory, the concept of objectivity as faithfulness to facts does not hold in the strict sense of an aperspectival faithfulness to brute facts. Nevertheless, it holds in the large sense of an underdetermined faithfulness to scientific facts, as long as we keep in mind the complexity of the notion of scientific fact (as theory-laden), and the role of non-factual elements in theory choice (as underdetermined by facts). Science remains our best way to separate our factual beliefs from our other kinds of beliefs.
... Hacking (1983), as quoted by Franklin (1989, p. 166). ...
Article
Current cosmological observations place few constraints on the nature of dark matter, allowing the development of a large number of models and various methods for probing their properties, which seem to provide ideal grounds for the employment of robustness arguments. In this article, the extent to which such arguments can be used to overcome various methodological and theoretical challenges is examined. The conclusion is that while robustness arguments have a limited scope in the context of dark matter research, they can still be used to increase scientists' confidence about the properties of specific models.
... Here it becomes clear: what is at stake is not the performance of algorithmic systems and their supposedly mimetic depiction of reality. Such a notion not only misses the operative character of the "representing and intervening" (Hacking 1983) of algorithmic procedures. Viewed through the lens of alterity theory, it can also be exposed as a chimera: attributions of otherness are always mediated and meaningfully constituted, never objectively given. ...
... It thus departs from the classical conception that posits the duality of the contexts of discovery and justification, implying a non-dichotomous relation between representation (the project) and reality (the problems of the world), and contradicting the naive view of the current theoretical conception, which still persists in assigning to reality an external validating dimension based on a supposedly independent ontic dimension, consistent with the onto-epistemic realism that pervades its entire perspective. This view is already threatened, from a philosophical standpoint, by the development of different positions, such as that of Hacking (1983), for whom (in line with Margenau's position) reality is '(...) the second human creation; the first is representation. Once there is a practice of representing, a second-order concept follows immediately. ...
Article
This article starts from a critique of the classical conception of design theory and of the onto-epistemic foundations that condition the possibility of a better understanding of its constitutive processes, as well as of the impacts it generates on the transformation of the built habitat. Its basic objectives are to review the most relevant assumptions of the current theoretical corpus and to reflect on the need to generate new foundations for the design disciplines in general and for architectural design in particular. Through methodological strategies based on the philosophical method, the elucidation of conceptual problems, and theoretical systematization, new consistent categories are investigated with the aim of renewing the image of design in the current context. The results show three new categories for advancing toward a new image of design theory: problematization, representation, and design research.
... Hacking (1983) asserts that many aspects of scientific practice, including experiments, cannot be interpreted as attempts at falsification or corroboration. ...
Book
Full-text available
Despite the criticisms of the theory of falsifiability proposed by Karl Popper for the demarcation between science and non-science, mainly pseudoscience, this criterion is still very useful, and perfectly valid after its refinement by Popper and his followers. Moreover, even in its initial version, which Lakatos considered "dogmatic", Popper did not claim that this methodology is an absolute criterion of demarcation: a single counter-example is not enough to falsify a theory; moreover, a theory can be legitimately saved from falsification by introducing an auxiliary hypothesis. Compared with Kuhn's theory of revolutions, which Kuhn himself later disavowed by turning it into a theory of "micro-revolutions", I consider that Popper's demarcation methodology, together with the further development he proposed, including corroboration and verisimilitude, although imperfect, is not only still valid today but remains the best demarcation methodology. For the argument, I have drawn on Popper's main works dealing with this problem, and on those of his principal critics and supporters. After a brief presentation of Karl Popper and an introduction to the demarcation problem and the methodology of falsifiability, I review the main criticisms raised and the arguments of his supporters, emphasizing the point that Popper never equated falsification with rejection. Finally, I present my own conclusions on this problem.
... Thus, it is possible to maintain a realist perspective about the targets of measurement—objects and their properties—while acknowledging that knowledge is constructed by humans and can be so constructed in multiple ways. To make the point in a slightly different way, borrowing terms from Nancy Cartwright (1983) and Ian Hacking (1983), one can subscribe to entity realism without necessarily subscribing to theory realism; that is, the entities that feature in scientific theories may be regarded as real, without requiring a judgment about the truth of the theories in which they figure. On the other hand, conceptual pluralism is not the same as relativism: responsible science requires awareness and acknowledgment of the roles that conceptual frameworks, methodological approaches, and statistical models play in shaping investigations and requires explication and empirical investigation of the hypothesized connections between the objects and processes under investigation and measurement results. ...
Chapter
Full-text available
This chapter aims to present a brief conceptual history of philosophical thinking about measurement, concentrating in particular on the issues of objectivity and subjectivity, realism and nonrealism, and the role of models in measurement, as well as a discussion of how these philosophical issues have shaped thinking and discourse about measurement in both the human and physical sciences. First, three perspectives on measurement and its epistemic status are discussed, grouped as (a) naive realism, (b) operationalism, and (c) representationalism. Following this, we discuss how these perspectives have informed thinking about the concept of validity in the human sciences, and how they have influenced the way in which measurement is characterized in different contexts as being dependent on empirical and/or mathematical constraints. We then attempt to synthesize these perspectives and propose a version of model-dependent realism which maintains some of the elements of each of these perspectives and at the same time rejects their most radical aspects, by acknowledging the fundamental role of models in measurement but also emphasizing that models are always models of something: the empirical components of measurement are designed and operated to guarantee that, via such models, measurement results convey information on the intended property. The analysis also provides a simple explanation of two of the most critical stereotypes that still affect measurement science: the hypotheses that (1) measurement is quantification, which hides the relevance of the empirical component of the process, and that (2) measurement is only a process of transmission and presentation of preexisting information, usually intended as the “true value” of the measurand, which instead neglects the role of models in the process.
... draws awareness to the fact that technological tools or instruments often play a fundamental role in a scientific knowledge production process (Boon, 2011; Lacey, 2012). Hacking (1983), for example, mentions the way in which the microscope is essential to the investigation of microscopic organisms. We can only see them with the microscope, never without. ...
Article
Full-text available
Recent developments in AI research suggest that an AI-driven science might not be that far off. The research of Melnikov et al. (2018) and that of Evans et al. (2018) shows that automated systems can already have a distinctive role in the design of experiments and in directing future research. Common practice in many of the papers devoted to the automation of basic research is to refer to these automated systems as 'agents'. What is this attribution of agency based on, and to what extent is this an important notion in the broader context of an AI-driven science? In an attempt to answer these questions, this paper proposes a new methodological framework, introduced as the Four-Fold Framework, that can be used to conceptualize artificial agency in basic research. It consists of four modeling strategies, three of which were already identified and used by Sarkia (2021) to conceptualize 'intentional agency'. The novelty of the framework is the inclusion of a fourth strategy, introduced as conceptual modeling, that adds a semantic dimension to the overall conceptualization. The strategy connects to the other strategies by modeling both the actual use of 'artificial agency' in basic research as well as what is meant by it in each of the other three strategies. This enables researchers to bridge the gap between theory and practice by comparing the meaning of artificial agency in both an academic and a practical context.
... Ian Hacking (1982, 1983), Nancy Cartwright (1983, essay 5), and Ronald Giere (1988, chapter 5) constitute the first generation of entity realists. Hacking first coined the term entity realism. ...
Article
This paper concerns the recent revival of entity realism. Having been started with the work of Ian Hacking, Nancy Cartwright and Ronald Giere, the project of entity realism has recently been developed by Matthias Egg, Markus Eronen, and Bence Nanay. The paper opens a dialogue among these recent views on entity realism and integrates them into a more advanced view. The result is an epistemological criterion for reality: the property-tokens of a certain type may be taken as real insofar as only they can be materially inferred from the evidence obtained in a variety of independent ways of detection.
... instrumentation, experiments in written arguments, representations of phenomena, experimentalists versus theorists. Practice-philosophical and STS studies have shown that experiments are sites of hard work, contingency and messiness (Latour and Woolgar 1986; Hacking 1983; Pickering 1995; Knorr Cetina 1999). They are sites in which 'worlds are raised', i.e. the knowledge produced through experiments has consequences beyond the confines of the lab. ...
Article
Full-text available
From the introduction: This special issue is dedicated to the exploration of experiments and experimentation. It follows a PhD course entitled "Exploring and performing experiments" that we organized at the Department of Digital Design and Information Studies in spring 2019. The course was attended by 12 PhD fellows, and during the course we and the participants decided to produce a special issue based on the participants' PhD research projects. The literature for the course included a variety of texts and research articles focusing on experiments, mainly from the field of Science and Technology Studies (STS). The readings included the work of Ian Hacking, Andy Pickering, Bruno Latour, Steven Shapin and Simon Schaffer, Isabelle Stengers, Shirley Strum and Brian Eno, among others. In the call for papers for this issue, authors were asked to draw on the literature in the field of STS in order to explore the role of experiments and experimentation in their own projects, and to consider their articles as vehicles for bringing insights from STS to their own fields. The spirit of this special issue is thus one of 'STS pollination' by bringing STS to other fields, rather than necessarily being contributions to STS itself. Hopefully it will generate novel insights and contributions, and perhaps cross-pollination.
... Even though experiments are set in a manipulated and 'designed' environment, laboratory work is not as smooth as one might expect. Hacking (1983) argues that 'laboratory work is not merely about representation but about invention' (Sismondo, 2010, p. 108). The scientist is actively engaged in manipulating the object because materials do not behave as one expects, apparatuses do not work, etc. ...
Article
Full-text available
This paper explores how the field of Science and Technology Studies (STS) can inform and help conceptualise a relatively new form of laboratory work in education: virtual laboratories. To date, STS have not addressed laboratory work in education. This paper focuses on the virtual educational laboratory by synthesising arguments from the STS literature on laboratory work and proposes research questions that can guide future ethnographic research on how virtual laboratories are applied and constructed locally in the classroom. I argue that the virtual laboratory, like the physical one, must be understood in a broader cultural, social and material context. Moreover, the virtual laboratory has both constraints and affordances, tied to the medium through which it is materialised. I conclude that the virtual laboratory can be understood as a hybrid between explorative and instructive learning.
... Discourses of science put down stakes into the world; they "intervene" or "experiment". Ian Hacking's book, Representing and Intervening (see Hacking, 1982), marks a contribution here. Nietzsche's focus upon "experimentation" is perhaps also important. ...
... On the one hand, the notions of objectivity and truth are historical constructions that have forged the two metaphysical entities of subject and object (Daston & Galison, 2007). On the other hand, scientific knowledge cannot be reduced to a face-to-face encounter between subject and object, since it involves a whole set of actors: instruments, publications, institutions, investments, and the objects themselves that participate in the construction of knowledge (Latour, 1987; Hacking, 1983). ...
Article
Full-text available
STS studies have sought to make science attend to everything that happens in a field of study rather than confining itself to theory, as evidenced in what is called the 'empirical turn' in the philosophy of technology. This study shows the possible link between the philosophy of technology and the STS perspective, and proposes that the concept of time-landscape is more appropriate than that of chronological time for understanding current technological change. In this sense, going beyond theoretical conjectures about technology, this research takes nuclear energy as a case study to question and weigh the traditional postulates of technical change, adopting the methodological principle of the demand for concreteness in empirical studies and the concept of the sociotechnical imaginary. The process of inquiry combined elements such as epistemology, ethics, and politics within an anthropological perspective. Finally, it is established that the use of nuclear energy, considered part of the atomic age, invites us to think about time in terms other than chronological ones. The study also helps to illustrate the mutual benefits of a partnership between STS studies and the philosophy of technology.
... In this context, the manipulation of models is essential. Following Ian Hacking (1983), it is experiments that allow scientists to test the surrogate hypotheses of a model that may be formulated about the existence and relations of the phenomena in question. For this reason, the success of representation has generally been associated with the achievement of successful explanations. ...
Article
Full-text available
Over the last two decades, a perspective has emerged in epistemology that recovers the notion of understanding and attempts to place it at the center of the discussion of scientific modeling, considering that it plays a fundamental role in the activity of knowing. Starting from a brief review of some ideas from American pragmatism and from Mauricio Suárez's inferentialism, the aim of this paper is to propose an inferential definition of scientific understanding. It is argued that understanding plays a central role in the practice of modeling, as a term of cognitive success, and that it can be introduced as an important notion for the discussion of the explanatory function of models.
... Although entity realism is defended by important philosophers of science (Hacking, 1983; Bunge, 2010), and some do not consider that the pessimistic meta-induction poses serious threats to critical forms of realism (Psillos, 1999), other authors have sought to assimilate the pessimistic meta-induction into realist views, owing to its important contributions to a philosophical appreciation of the meaning of theoretical entities. The defense of more selective forms of realism has become common. ...
Article
Full-text available
This article aims to present a study of a historical episode, namely the study of the optics of moving bodies in the nineteenth century, and to discuss its epistemological and educational implications. Through an analysis in historical epistemology, we reflect on the provisional character of scientific knowledge and the abandonment of theoretical entities of past science, with the aim of building an argument in favor of the reliability of scientific knowledge. Drawing on critical forms of scientific realism, we aim to defend views of science that are ontologically relativist while maintaining critical realist and objectivist conceptions of science.
... Knowledge is never naive and much less aseptic or pure. Every form of representation of the world, in addition to naming and creating identities, translates into forms of relationship, of appropriation, and in general into strategies of intervention in reality (Hacking, 1983). ...
Article
Full-text available
The present article aims — albeit briefly — to reflect on the theoretical origins and development of multi-species anthropology. Our brief "journey" has its starting point in the paradigm of human exceptionalism and the anthropocentric view of the relationship between human beings and the rest of the natural world. This gaze, which constituted the central paradigm at the origins of the anthropological discipline, is the result of profoundly Western ways of looking at and interpreting the world and the diversity it contains. Traditional dualisms such as nature-culture are based on it, which justified the distinct treatment of the non-Western "other". In turn, the end of this paradigm emerged as modernity raised questions such as the mediatization of environmental issues. In this context, a new area of research emerged, Human-Animal Studies (HAS), as coined by DeMello, despite other designations used by different research areas (e.g. anthrozoology). In this new area of investigation, relationships with other animals are seen as co-constructed, interdependent and relational, just like ecosystems themselves, and fall within a new line of thought: an Anthropology beyond humanity.
... While there is unanimity that much of current science requires technical mediation or, in Hacking's (1983) expression, that scientific practice has as its central characteristic the interaction between representation and intervention, not everyone grants relevance to the notion of technoscience. Bunge (2012), for instance, considers it no more than a confused neologism equivalent to applied science, which would therefore not require specific treatment. ...
Chapter
Full-text available
Current scientific knowledge develops strongly mediated by technology and conditioned by social and economic interests, objectives, and values. In this situation, we call technoscience the characteristic regime of contemporary science in which technology shares the leading role as context, driver, and end of research, with both becoming deeply coupled to each other and to society. Biotechnology is an example of technoscience whose relevance and impact were confirmed after the emergence and spread of the SARS-CoV-2 virus in 2019, which led to an unprecedented global crisis. Scientists and technologists took on an unusual prominence in the face of a multiple challenge that included the accelerated development of effective vaccines for planetary distribution. The achievement was possible, among other factors, thanks to previous disruptive biotechnological developments. This chapter characterizes contemporary technoscience from the epistemological, ontological, and axiological perspectives, comparing it with scientific activity in the theoreticist tradition. It also discusses and assesses, from the standpoint of technoscientific practice, the notions of technological and social determinism, illustrating all the above points with vaccine-related aspects of the COVID-19 pandemic.
... Validator (11), responsiveness to treatment, is about response to intervention. The other validator evidence is observational. Hacking (1983), Cartwright (1989), and Woodward (2005) have all argued that response to intervention is especially strong evidence. ...
Article
Full-text available
The concept of a “validator” as a unit of evidence for the validity of a psychiatric category has been important for more than fifty years. Validator evidence is aggregated by expert committees (for the Diagnostic and Statistical Manual of Mental Disorders (DSM), these are referred to as “workgroups”), which use the results to make nosological decisions. Through an examination of the recent history of psychiatric research, this paper argues that it is time to reassess this traditional practice. It concludes with specific suggestions for going forward.
... 3). It seems that here we have an interesting example of the primacy of the experimental over the theoretical, as mentioned by Ian Hacking, for it must be admitted that the reality of entities is tied to experimental devices that act as causes producing certain phenomena (Hacking 1983). ...
Article
Full-text available
In this article, we are interested in analyzing the biography of metallic titanium (Ti) and its dioxide (TiO2) from a historical, philosophical, and sociological point of view of some of its modes of existence. This biography does not suggest any anthropomorphization of material objects; rather, it is an attempt to reconcile the reality of science in the present with its history, in order to understand the particularities of materials in contemporary societies. We intend to investigate some properties and characteristics of the "natural" modes of existence, and the new properties and implications that arise when these materials gain a new mode of existence, the nanostructured one. This mode of existence refers to a new way of organizing scientific knowledge, an inflection which has been called technoscience, more concerned with what an object will become in the future than with what it essentially is. In this sense, the proper chemical identity of these substances is one among other modes of existence; to this mode of existence one must add others, which can capture their biological, geological, cultural, technological, economic or geopolitical behavior.
... Through Table 5, it can be seen that E5 and E12 make only a momentary and modest allusion to the field of art, without drawing closer relations with other historical events experienced by Joseph Wright, which make up a more favorable panorama for understanding the experiments and scientific demonstrations depicted by the painter in his two paintings. They also seem to reflect, in a general way, on the meaning of experiments in the construction of scientific knowledge, by considering experimental practice as a theory-laden tool for transforming reality, theory being in turn carried by experimental practices and fundamentally by artifacts (HACKING, 1983). E5, E7 and E12 thus debate the functions of the experiments in Wright's works in an unfamiliar scientific and artistic context which, although conveyed only implicitly, could contribute to richer and more meaningful discussions about the relations between art and HPS (history and philosophy of science), since "(...) both artistic and scientific work are ways (...) of broadening the perception of reality and of conceiving new readings of the world" (FERREIRA, 2010, p. 277). ...
Article
Full-text available
Starting from the assumption that art-science relations can help in understanding the historical and cultural construction of scientific knowledge, we present the results of the implementation of a teaching module — consisting of a comic strip and its associated texts — intended to discuss, through paintings, the non-neutrality of observation and the role of experiments in the scientific enterprise. The module was applied in a history of science course within a physics degree program. Based on Kathy Charmaz's constructivist grounded theory, we aimed to analyze the extent to which the module provides resources for students to reflect on pedagogical and scientific practice in a more critical and diversified way. We found that the teaching module has the potential to foster thinking about science together with art and about pluralities in science education.
... To this end, we need to turn our attention away from the standard view and focus on a Baconian variety of experiment as presented by Hacking (ch. 9, 24).
Article
Replication experiments purport to independently validate claims from previous research or provide some diagnostic evidence about their reliability. In practice, this value of replication experiments is often taken for granted. Our research shows that in replication experiments, practice often does not live up to theory. Most replication experiments in practice are confounded and their results multiply determined, hence uninterpretable. These results can be driven by the true data generating mechanism, issues present in the original experiment, discrepancies between the original and the replication experiment, new issues introduced in the replication experiment, or combinations of any of these factors. The answers we are looking for with regard to the true state of nature require a rigorous and meticulous investigative process of eliminating errors and singling out elementary or pure cases. In this paper, we introduce the idea of a minimum viable experiment that needs to be identified in practice for replication results to be clearly interpretable. Most experiments are not replication-ready and before striving to replicate a given result, we need theoretical precision or systematic exploration to discover empirical regularities.
... If one stretches the concept of technological artefact to a sufficient degree, all scientific endeavours turn out to be technoscientific. As discussed by several philosophers of science (chief among these is Hacking 1983), technological artefacts always shape the contexts in which natural phenomena are observed and studied. This problem will be partially sidestepped here, as the focus of the following discussion is restricted to a particular class of technological artefacts, namely, robotic systems. This restriction may enable one to formulate a reasonable distinction between research endeavours that qualify as technoscientific (where the object of inquiry is a robotic system or a phenomenon significantly influenced by robotic systems) and scientific endeavours whose object of inquiry is non-robotic or a system that is not significantly influenced by robotic systems. ...
Article
Full-text available
In this paper, we ask one fairly simple question: to what extent can biorobotics be sensibly qualified as science? The answer clearly depends on what "science" means and whether what is actually done in biorobotics corresponds to this meaning. To respond to this question, we will deploy the distinction between science and so-called technoscience, and isolate different kinds of objects of inquiry in biorobotics research. Capitalising on the distinction between "proximal" and "distal" biorobotic hypotheses, we will argue that technoscientific biorobotic studies address proximal hypotheses, whilst scientific biorobotic studies address distal hypotheses. As a result, we argue that bioroboticians can be considered both scientists and technoscientists, and that this is one of the main payoffs of biorobotics. Indeed, technoscientists play an extremely important role in 21st-century culture and in the current critical production of knowledge. Today's world is increasingly technological, or rather, it is a bio-hybrid system in which the biological and the technological are mixed. Therefore, studying the behaviour of robotic systems and the phenomena of animal-robot interaction means analysing, understanding, and shaping our world. Indeed, in the conclusion of the paper, we reflect broadly on the philosophical and disciplinary payoff of seeing biorobotics as a science and/or technoscience for the increasingly bio-hybrid and technical world of the 21st century.
... Instruments of this latter kind work by carrying out a reliable, theoretically informed procedure to arrive at an output. As I will show, what justifies our belief in the reliability of physically mediated instruments is distinct from what justifies our belief in the reliability of theoretically mediated instruments (a claim echoed in the broad literature on scientific instruments [Hac83, Bai04, Cha10]). So, when it comes to the reliability of scientific instruments in general, there exist two distinct epistemic categories to which we appeal in seeking justification for belief in their reliability. ...
Article
Full-text available
Deep learning (DL) has become increasingly central to science, primarily due to its capacity to quickly, efficiently, and accurately predict and classify phenomena of scientific interest. This paper seeks to understand the principles that underwrite scientists’ epistemic entitlement to rely on DL in the first place and argues that these principles are philosophically novel. The question of this paper is not whether scientists can be justified in trusting in the reliability of DL. While today's artificial intelligence exhibits characteristics common to both scientific instruments and scientific experts, this paper argues that the familiar epistemic categories that justify belief in the reliability of instruments and experts are distinct, and that belief in the reliability of DL cannot be reduced to either. Understanding what can justify belief in AI reliability represents an occasion and opportunity for exciting, new philosophy of science.
... Conducting tests also involves embodied work, with scientists "repositories of unconscious experience whose responsibility it is to develop an embodied sense for resolving certain problem situations" (Knorr-Cetina 1992, 119). Against approaches that distinguish between material models and conceptual models (Hacking 1983), Myers (2008, 165, 166) showed how molecular biologists literally get a "feeling for proper molecular configuration," with body-work central to "interpreting the specificities of protein forms and functions." Both conceptual and embodied knowing may be not only individual but collective (Vertesi 2012). ...
Article
Full-text available
This paper examines the relationship between conceptual and embodied reasoning in engineering work. In the last decade across multiple research projects on pipeline engineering, we have observed only a few times when engineers have expressed embodied or sensory aspects of their practice, as if the activity itself is disembodied. Yet, they also often speak about the importance of field experience. In this paper, we look at engineers’ accounts of the value of field experience showing how it works on their sense of what the technology that they are designing looks, feels, and sounds like in practice, and so what this means for construction and operation, and the management of risk. We show how office-based pipeline engineering work is an exercise in embodied imagination that humanizes the socio-technical system as it manifests in the technical artifacts that they work with. Engineers take the role of the other to reason through the practicability of their designs and risk acceptability.
Thesis
Full-text available
Although the social, economic, and cultural gap between men and women has tended to close over the past 50 years, the gender pay gap and the underrepresentation of women in institutional, academic, and political arenas remain ongoing problems even in industrialized countries. An emerging literature suggests that the willingness to speak in public is an essential determinant of leadership preferences. Although it is crucial for career prospects and leadership positions, research shows that two-thirds of the population fear public speaking to varying degrees. Women in particular report a higher fear of public speaking than men. Most existing literature has focused on the willingness to speak in public in traditional face-to-face settings; our knowledge of whether preferences change in the virtual environment is very limited. Are there gender differences in public speaking online as well? This study analyzes whether the preference for public speaking in the online environment differs according to gender. We conducted two field experiments to analyze whether men and women differ in their preferences for speaking in public on online platforms. In the first experiment, willingness to speak in public was defined as a binary preference; in the second, public-speaking aversion was defined as a continuous preference.
The first field experiment, on the Zoom platform, involved more than 500 students who could gain twenty extra points on the final exam if they presented their homework in front of the class while keeping their cameras on during the presentation. The second experiment, conducted on the Microsoft Teams platform, involved 521 students who could earn money by giving a five-minute presentation in front of their class. Although female participants reported more anxiety about public speaking than males, there was no statistically significant difference based on gender in either experiment. Increasing the number of institutional, academic, and political activities carried out on online platforms may give women a greater voice in society and, in this way, increase their likelihood of promotion to leadership positions.
Article
What does outer space smell like? On the one hand, space scientists have used scent as a hint to discover the molecular histories of the cosmos. On the other, Palestinian astronomers, who regularly encounter Israel's vertical military arsenal, joke that it smells like Israel. Based on three years of fieldwork in the occupied West Bank, this article follows these astronomers and reveals how colonial experience can emerge in faraway realms, even past our planet. I position the space sciences alongside these Palestinian perspectives, arguing that such encounters with a superterranean state reveal the increasingly extraplanetary contours of colonial struggle. The Israeli‐Palestinian conflict, in other words, orbits Earth. Scale thus emerges not as something that divides the (space) sciences, colonialism, and everyday sensory encounters into separable domains, but as a relational concept through which we can better grasp the dimensions of colonialism, politics, and power today.
Article
Full-text available
In this article, I explore how experiments with social robots enact and reconfigure more-than-human forms of sociality. I combine recent anthropological discussions of nonhuman sociality with Andy Pickering’s work on dances of agency (1993, 1995) and John Law’s method assemblages (2004) to show how human-robot interaction experiments enact open-ended and decentred configurations of entangling relations between humans and robots. I propose the concept of artificial sociality to capture both the ongoing enactments and multiple results of such experimental reconfigurations. Using these conceptual tools, I unpack the “curious robot experiment” from my ethnographic fieldwork in a Japanese robotics laboratory and compare the kinds of sociality produced in the two experimental conditions. I argue that the curious robot exemplifies what Pickering calls technologies of engagement (2018) by manifesting a form of artificial sociality that augments the unpredictability of dances of agency enacted in (re)configurations of entangling relations.
Article
Full-text available
This paper offers a practical argument for metaphysical emergence. The main message is that the growing reliance on so-called irrational scientific methods provides evidence that objects of science are indecomposable and as such, are better described by metaphysical emergence as opposed to the prevalent reductionistic metaphysics. I show that a potential counterargument that science will eventually reduce everything to physics has little weight given where science is heading with its current methodological trend. I substantiate my arguments by detailed examples from biological engineering, but the conclusions are extendable beyond that discipline.
Article
Full-text available
Identifying causal relationships is at the heart of all scientific inquiry, and a means to evidence-based practices and to guide policymaking. However, being aware of the complexities of interactions and relationships, scientists and academics are cautious about claiming causality. Researchers applying methods that deviate from the experimental design generally abstain from causal claims, reserving them for designs that adhere to the evidential ideals of empiricism (e.g., RCTs), motivated by Humean conceptions of causality. Accordingly, results from other designs are ascribed lower explanatory power and scientific status. We discuss the relevance of other perspectives on causality as well, such as dispositionalism and the power perspectives of various realist approaches, which emphasize intrinsic properties and contextual variations, and an inferentialist/epistemic approach that advocates causal explanations in terms of inferences and linguistic interaction. The discussion is illustrated by the current situation within psychotherapy research and the APA Policy Statement on Evidence-Based Practice. The distinction between difference-making and causal production is proposed as a possible means to evaluate the relevance of designs. We conclude that clarifying causal relationships is an ongoing process that requires the use of various designs and methods, and we advocate a stance of evidential pluralism.
Article
Recent scholars have called into question the categories “science” and “religion” because they bring metaphysical and theological assumptions that theologians should find problematic. The critique of the categories “science” and “religion” has above all been associated with Peter Harrison and his influential argument in The Territories of Science and Religion (2015). This article evaluates the philosophical conclusions that Harrison draws from his antiessentialist philosophy in the two volumes associated with his “After Science and Religion Project.” I argue that Harrison's project is too skeptical toward the categories “science” and “religion” and places too much emphasis on naturalism being incompatible with Christian theology. One can accept the lessons of antiessentialism—above all, how meanings of terms shift over time—and still use the terms “science” and “religion” in responsible ways. This article defends the basic impulse of most scholars in science and religion who promote dialogue and argues for a more moderate reading of the lesson of Territories.
Article
My paper examines the potential of Gaston Bachelard’s concept of phenomenotechnique through a critique of its historical reception and an analysis of its epistemological implications. Vis-à-vis constructivist and pragmatist interpretations, I argue that the core issue at stake in this concept is not just to regard scientific knowledge as ‘fabricated’ through technical means, but also to understand the radical difference that lies between the experience of a world given to and measured by scientists, and one that is ontologically expanded by the conceptual creations of science. In his reflections on the ‘phenomenotechnical realisation’ of noumena in microphysics, mathematics and chemistry, Bachelard not only anticipated a new epistemic modality which has become eminent now in computerised, data-centric approaches to nature and culture; he also became one of the first philosophers to explore the human experience of a reality inherently built from, and no longer just described by, mathematical tools and models.
Article
This paper draws on the phenomenological-hermeneutical approaches to philosophy of science to develop realist perspectivism, an integration of experimental realism and perspectivism. Specifically, the paper employs the distinction between “manifestation” and “phenomenon” and it advances the view that the evidence of a real entity is “explorable” in order to argue that instrumentally-mediated robust evidence indicates real entities. Furthermore, it underpins the phenomenological notion of the horizonal nature of scientific observation with perspectivism, so accounting for scientific pluralism even in the cases of inconsistent models. Overall, realist perspectivism is proposed as the way to go for (phenomenologically-hermeneutically minded) philosophers of science.
Article
Full-text available
Collecting in the field is a critical intersection between humans and the rest of the natural world. This afterword begins by suggesting what happens when practices of field collecting are downplayed or ignored, using Ian Hacking’s discussion of the fossil Glyptodon in Representing and Intervening (1983). It then surveys collecting practices in the nineteenth and early twentieth centuries, focusing on questions of the colonial power in relation to the development of new roles and practices. During this key period, naturalists engaged with a remarkably diverse range of geographical sites, established traditions, and the challenges of imperial bureaucracies.
Article
When patients who undergo awake arthroscopic surgery follow the surgery on a screen, medical image technologies enable a rare look inside one’s own body. Based on ethnographic fieldwork at an orthopedic surgery unit in Denmark, we investigate how patients experience their bodies during surgery. Patients see surgery as proof of their pain, experience an anatomical re-categorization, and contemplate the decay of the aging body. We argue that awake arthroscopic surgery constitutes a liminal setting transforming patients’ perceptions of their body and their sufferings. Furthermore, we discuss how awake arthroscopic surgery can be understood as a frame for producing new realities. It constitutes a particular way of seeing and understanding that highlights the seductiveness of the visual as an objective carrier of truth and reminds us to remain critical toward the power of certain frames of knowledge production in medical settings.
Article
Full-text available
The Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition (DSM-5) uses the conceptualization of psychopathology to make psychiatric diagnoses operational. The use of explicit operational criteria appears to be based on an implicit neo-positivist epistemology. Operationalism involves an excessive focus on quantitative descriptions of behavior manifestations, presupposing that psychopathology is understood as a deviation from the normal or the average in a given population. Consequently, the normal and the psychopathological become homogeneous. Our analysis investigates whether this neo-positivist epistemology narrows the conceptualization of psychopathology and endangers integration with the hybrid biopsychosocial model of psychiatry. Based on Georges Canguilhem’s theorization of a qualitative approach to the individual organism in a state of morbidity, we show that (psychiatric) pathology also contains differences in quality. Moreover, we show that humans are norm-producing organisms that actively respond to changes in their internal and external environment. In this regard, the operationalization of mental disorders could include the normativity in humans, i.e., the ability to produce norms. We argue this will mitigate the one-sided conceptualization of psychopathology and strengthen the relationship between psychiatric nosology and psychiatry’s hybrid biopsychosocial model.