... When it comes to genetic determinism, however, it is convenient to distinguish different meanings, as the expression is polysemic and it is not always clear which concept of genetic determinism is being used in a given context (Gayon, 2009; Kaplan, 2000). Part of the problem stems from the complexity of the concept of determinism itself and from its relationship to debates about human agency, the causality of the future, and prediction (Hoefer, 2016; Müller & Placek, 2018). ...
... These two extreme visions of what genetic determinism is, and of the responses that need to be articulated to resist genetic-deterministic claims, reflect the existence of at least two different concepts potentially meant by the expression. On the one hand, there is what Kaplan (2000) calls the "complete information" strand and Gayon (2009) names "Laplacian (scientific) determinism". This is an epistemological interpretation of genetic determinism, according to which the genetics of an organism provides complete information about its phenotype, to the point that if the details of the genotype were fully known, it would be possible to predict the complete phenotype that the organism would develop during its ontogeny. ...
... On the other hand, there is what Kaplan (2000) names the "intervention is useless" strand, which can roughly be equated with Gayon's (2009) "Laplacian (metaphysical/ontological) determinism". The main idea underlying this interpretation is that genes fix phenotypic traits to the point that the possession of a gene determines the expression of the corresponding trait, regardless of any potential intervention in the environment. ...
This paper addresses the topic of determinism in contemporary microbiome research. I distinguish two types of deterministic claims about the microbiome and show evidence that both types are present in the contemporary literature. The first is the idea that host genetics determines the composition of the microbiome, which I call "host-microbiome determinism". The second is the idea that the genetics of the holobiont (the individual unit composed of a host plus its microbiome) determines the expression of certain phenotypic traits, which I call "microbiome-phenotype determinism". Drawing on the stability-of-traits conception of individuality (Suárez in Hist Philos Life Sci 42:11, 2020), I argue that neither of these deterministic hypotheses is grounded in our current knowledge of how the holobiont is transgenerationally assembled or of how it expresses its phenotypic traits.
... Furthermore, new trends in biology promote the idea of a cellular Darwinism (Kupiec 2008; Kupiec et al. 2009): during embryonic development, each cell fluctuates randomly between different states and stabilizes according to its interactions with neighbouring cells, by natural selection. On the basis of a growing body of data in biological research showing the key role of stochastic processes, several authors have suggested "the end of determinism in biology" (Paldi & Coisne 2009), and philosophers are distinguishing several kinds of determinism (Gayon 2009). ...
... These results show that (1) teachers trained in biology, as was also the case for chance, are more aware of the importance of natural selection, even when they hold creationist conceptions (for comparisons related to some of these countries, see our published results in 2009 and below: Figures 6 & 7); and (2) teachers are more reluctant to accept the important role of chance than the important role of natural selection. Is this last reluctance linked to the topic of evolution, or is it more general? ...
... The large volume of Biohead-Citizen data requires successive complementary analyses and publications. Some analyses related to evolution have been further developed for a limited number of countries (12 to 19 countries: Quessada 2008; Quessada & Clément 2009, 2011), and we will soon publish a more complete presentation covering 28 countries. The present work is the first to focus, across 21 countries, on the analysis of teachers' conceptions of chance and determinism in evolution. ...
... Importantly, all three of these categories were tightly linked by what Bernard always emphasized as the first axiom of experimental science, the "principle of determinism" (Gayon, 2009): if certain conditions are set, a given phenomenon will necessarily occur according to a pre-established law. ...
During the period 1860-1880, a number of physicists and mathematicians, including Maxwell, Stewart, Cournot and Boussinesq, used theories formulated in terms of physics to argue that the mind, the soul or a vital principle could have an impact on the body. This paper shows that what was primarily at stake for these authors was a concern about the irreducibility of life and the mind to physics, and that their theories can be regarded as reactions to the law of conservation of energy, which was used by Helmholtz and Du Bois-Reymond, among others, as an argument against the possibility of vital and mental causes in physiology. In light of this development, Maxwell, Stewart, Cournot and Boussinesq showed that it was still possible to argue for the irreducibility of life and the mind to physics through an appeal to instability or indeterminism in physics: if the body is an unstable or physically indeterministic system, an immaterial principle can act by triggering or directing motions in the body, without violating the laws of physics.
Since the completion of the Human Genome Project (HGP), biomedical sciences have moved away from a gene-centred view and towards a multi-factorial one in which environment, broadly speaking, plays a central role in the determination of human health and disease. Environmental exposures have been shown to be highly prevalent in disease causation. They are considered complementary to genetic factors in the etiology of diseases, hence the introduction of the concept of the "exposome" as encompassing the totality of human environmental exposures, from conception onwards (Wild in Cancer Epidemiol Biomark Prev 14:1847–1850, 2005), and the launch of the Human Exposome Project (HEP), which aims to complement the HGP. At first sight, seen as complementary to the genome, the exposome could thus appear to contribute to the rise of novel postgenomic deterministic narratives which place the environment at their core. Is this really the case? If so, what sort of determinism is at work in exposomics research? Is it a case of environmental determinism, and if so, in what sense? Or is it a new sort of deterministic view? In this paper, we first show that causal narratives in exposomics are still very similar to gene-centred deterministic narratives. They correspond to a form of Laplacian determinism and, above all, to what Claude Bernard called the "determinism of a phenomenon". Second, we introduce the notion of "reversed heuristic determinism" to characterize the specific deterministic narratives present in exposomics. Indeed, the accepted sorts of external environmental exposures conceived as being at the origins of diseases are determined, methodologically speaking, by their identifiable internal and biological markers. We conclude by highlighting the most relevant implications of the presence of this heuristic determinism in exposomics research.
Heredity has been dismissed as an insignificant object in Claude Bernard's physiology, and the topic is usually ignored by historians. Yet, thirty years ago, Jean Gayon demonstrated that Bernard did elaborate on the subject. The present paper aims to reassess the issue of heredity in Claude Bernard's project of a "general physiology". My first claim is that Bernard's interest in heredity was linked to his ambitious goal of redefining general physiology in relation to morphology. In 1867, not only was morphology included within experimental physiology, but it also theoretically grounded physiological investigations. By 1878, morphology and physiology were considered completely independent sciences, and only the latter was perceived as suitable for experimentation. My second claim is that this reversal reflected the existence of two opposite attitudes towards heredity. In the late 1860s, Bernard was convinced that heredity would soon be accessible to experimental manipulation and that new species would be produced in the laboratory, just as organic chemistry had succeeded in doing for inorganic bodies. Ten years later, he concluded that this was impossible. My third claim is that Bernard was epistemologically ill-equipped to address the issue of heredity. Bernard was strongly committed to a general reasoning scheme that acknowledged only three categories: determining conditions, constant laws and phenomena. This scheme was a key factor in his successes as a physiologist able to capture new mechanisms in living bodies. Nonetheless, it also prevented him from understanding how time and history could be endowed with a causal action that cannot be reduced to timeless parameters.
A common and enduring early modern intuition is that materialists reduce organisms in general, and human beings in particular, to automata. Wasn't a famous book of the time (1748) entitled L'Homme-Machine? In fact, the machine was employed as an analogy, and there was a specifically materialist form of embodiment, in which the body is not reduced to an inanimate machine but is conceived as an affective, flesh-and-blood entity. This paper discusses how mechanist and vitalist models of the organism exist in a more complementary relation than hitherto imagined, with conceptions of embodiment resulting from experimental physiology. From La Mettrie to Bernard, mechanism, body and embodiment constantly overlapped, modified and overdetermined one another; embodiment came to be scientifically addressed under the successive figures of vie organique and then milieu intérieur, thereby overcoming the often lamented divide between scientific image and living experience.
In the 17th century, Descartes put forth the metaphor of the machine to explain the functioning of living beings. In the 18th century, La Mettrie extended the metaphor to man. The clock was then used as the paradigm of the machine. In the 20th century, this metaphor still held, but the clock was replaced by a computer. Nowadays, the organism is viewed as a robot obeying signals emanating from a computer program controlled by genetic information. This book shows that such a conception leads to contradictions not only in the theory of biology but also in its experimental research program, thereby impeding its development. The analysis of this problem is based on the most recent experimental data obtained in molecular biology as well as on the history and philosophy of biology. It shows that the machine theory did not succeed in breaking with Aristotle's finalism. The book presents a new approach to biological systems based on cellular Darwinism. Genes are ruled by probabilistic mechanisms allowing cells to differentiate stochastically. Embryo development is not governed by a deterministic genetic program but by natural selection occurring among cell populations inside the organism. This theory has considerable philosophical consequences. Man may be a machine, but he is a random one.
www.worldscientific.com/worldscibooks/10.1142/6359
Multiple ways of representing the emerging new genetic knowledge and its implications have resulted from recent research, including the Human Genome Project. In this chapter, we show that the presentation of human genetics is now less deterministic: it is formulated in a more systemic way, takes into account the interaction between genes and their environment (epigenetics), discusses the notion of biological determinism, and includes connections with ethical and social implications. How are these new genetic trends represented in today's biology textbooks? Do multiple representations exist across cultures, languages, and countries? Two complementary sets of data are presented and discussed: (1) the representation of human genetic diseases in French biology textbooks, showing a frequent absence of a systemic approach, with nevertheless some exceptions, and (2) a comparative analysis of biology textbooks in 16 countries, showing a common similarity in their use of an implicit message through the same clothes and hairstyle of identical twins, but strong differences in their use of the metaphor "genetic program", which depended on the sociocultural context of each country. We argue that the renewal of the taught representations of human genetics is correlated not only with the renewal of scientific knowledge but also with implicit values underlying each country's sociocultural context.
This paper studies pupils' problem building around the subject of embryo development during an in-class debate among 16-17 year olds specializing in the sciences. To better situate the teaching procedure behind the study, it takes a scientific approach to the understanding of embryo development as it pertains to preformism, to epigenetic theory, and to Darwin's theory of natural selection applied at the cellular level. In scientific circles, this latter theory is currently putting forth evolutionist accounts of embryo development based upon causal probability, and this nascent theory has deeply shaken the foundations of genetic pre-determinism. "Molecular preformism" is a fundamental obstacle to theorizing about embryo development in evolutionist terms, since it prevents any new problem building: it leaves the speaker over-confident in his or her assertions, questioning nothing and inhibiting thought. This scientific study led to the creation of reference points to facilitate a teaching analysis of the subject. The pupils' problem building allowed them to set limits and identify needs at varying levels of explanation. This article shows how the pupils' identification of contradictions in models allowed them to better understand the problem and how it pushed them closer to the Darwinian view of probability.