Book

From Matter to Life: Information and Causality

Abstract

Recent advances suggest that the concept of information might hold the key to unravelling the mystery of life’s nature and origin. Fresh insights from a broad and authoritative range of articulate and respected experts focus on the transition from matter to life, and hence reconcile the deep conceptual schism between the way we describe physical and biological systems. A unique cross-disciplinary perspective, drawing on expertise from philosophy, biology, chemistry, physics, and cognitive and social sciences, provides a new way to look at the deepest questions of our existence. This book addresses the role of information in life, and how it can make a difference to what we know about the world. Students, researchers, and all those interested in what life is and how it began will gain insights into the nature of life and its origins that touch on nearly every domain of science.
... Identifying the evolving degree to which agents make use of information sources, i.e. studying their information decomposition, can yield insights into the systems' internal organizational structure and makes it possible to examine interactions of living organisms even without precise observation of the underlying biochemical processes (see, e.g. [8,9]). ...
... Various computational characteristics of living systems have been recently studied. Using information theory, Flack identifies hierarchies in living systems [8]. Similarly, Krakauer et al. discuss the information theory of individuality [9]. ...
Article
Full-text available
Considering biological systems as information processing entities and analyzing their organizational structure via information-theoretic measures has become an established approach in the life sciences. We transfer this framework to a field of broad general interest, the human gut microbiome. We use BacArena, a software combining agent-based modelling and flux-balance analysis, to simulate a simplified human intestinal microbiome (SIHUMI). In a first step, we derive information-theoretic measures from the simulated abundance data, and, in a second step, relate them to the metabolic processes underlying the abundance data. Our study provides further evidence on the role of active information storage as an indicator of unexpected structural change in the observed system. In addition, we show that information transfer reflects coherent behavior in the microbial community, both as a reaction to environmental changes and as a result of direct effective interaction. In this sense, purely abundance-based information-theoretic measures can provide meaningful insight into metabolic interactions within bacterial communities. Furthermore, we shed light on the important but little-noticed technical aspect of distinguishing immediate and delayed effects in the interpretation of local information-theoretic measures.
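Since the abstract leans on local information-theoretic measures, a minimal sketch of how transfer entropy can be estimated from discretized abundance series may help orient readers; the plug-in estimator, lag-1 coupling, and toy data below are illustrative assumptions, not the BacArena/SIHUMI pipeline.

```python
import numpy as np
from collections import Counter

def transfer_entropy(source, target, k=1):
    """Plug-in estimate (in bits) of transfer entropy from `source` to
    `target` for discrete series, conditioning on k past target values."""
    n = len(target)
    joint, src_past, tgt_next, past = Counter(), Counter(), Counter(), Counter()
    for t in range(k, n):
        hist = tuple(target[t - k:t])
        joint[(target[t], hist, source[t - 1])] += 1
        src_past[(hist, source[t - 1])] += 1
        tgt_next[(target[t], hist)] += 1
        past[hist] += 1
    total = n - k
    te = 0.0
    for (nxt, hist, src), c in joint.items():
        p = c / total
        p_cond_full = c / src_past[(hist, src)]
        p_cond_self = tgt_next[(nxt, hist)] / past[hist]
        te += p * np.log2(p_cond_full / p_cond_self)
    return te

# Toy check: `follower` copies `driver` with a one-step lag and 10% noise,
# so information should flow driver -> follower but not back.
rng = np.random.default_rng(0)
driver = rng.integers(0, 2, 5000)
follower = np.roll(driver, 1) ^ (rng.random(5000) < 0.1).astype(int)
print(transfer_entropy(driver, follower))  # clearly positive
print(transfer_entropy(follower, driver))  # near zero
```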
... As a consequence, the world is entirely quantum, and the conventional or classical reality is just a moment in the dynamics of coherence-decoherence-recoherence. Information, i.e. information processing, is the very transformation, say, from plasma to energy to matter to life, in the history of the universe (Walker et al., 2017), so much so that it is never lost; rather, it is unceasingly transforming. The story of the transformation of information is the very story of the universe and of living beings, including human beings. ...
... There are no hyletic (or material) differences between biotic and abiotic stances (Walker et al., 2017). The differences can be said to be only qualitative, of degree or of organization. ...
Article
Full-text available
It is impossible to fully grasp reality and the universe without a sound understanding of quantum science, i.e. quantum theory. The aim of this paper is twofold, namely first presenting what quantum information processing consists of, and then consequently discussing the implications of quantum science for the understanding of reality. I shall claim that the world is fully quantum, and the classical world is but a limit case of the quantum world. The crux of the argument is that quantum information can be taken as a living phenomenon. Quantum information processing (QIP) has mainly been the subject of computational approaches. Here we take it as the way in which information allows for a non-dualistic explanation of the world. In this sense, quantum information processing consists in understanding how entanglement stands as the ground for a coherent yet highly dynamical, vibrant and vivid reality. Information, I argue, is a living phenomenon that creates itself out of nothing. Quantum information is a relational view of entities, systems, phenomena, and events (Auletta, 2005).
... 1997: A new perspective on ecosystems as ascendant information systems is introduced (Ulanowicz, 1997). 2000: Calls to consider the importance of information in biology and evolution are renewed (Maynard Smith, 2000; Szathmáry and Smith, 2002). 2015: Renewed interest in information as fundamental to the origin of life (Davies et al., 2017). 2017: Information theory is mainstream in molecular biology (Sherwin et al., 2017; Wagner, 2017). ...
Article
Full-text available
The persistence of ecological systems in changing environments requires energy, materials, and information. Although the importance of information to ecological function has been widely recognized, the fundamental principles of ecological science as commonly expressed do not reflect this central role of information processing. We articulate five fundamental principles of ecology that integrate information with energy and material constraints across scales of organization in living systems. We show how these principles outline new theoretical and empirical research challenges, and offer one novel attempt to incorporate them in a theoretical model. To provide adequate background for the principles, we review major concepts and identify common themes and key differences in information theories spanning physics, biology and semiotics. We structured our review around a series of questions about the role information may play in ecological systems: (i) what is information? (ii) how is information related to uncertainty? (iii) what is information processing? (iv) does information processing link ecological systems across scales? We highlight two aspects of information that capture its dual roles: syntactic information defining the processes that encode, filter and process information stored in biological structure and semiotic information associated with structures and their context. We argue that the principles of information in living systems promote a unified approach to understanding living systems in terms of first principles of biology and physics, and promote much needed theoretical and empirical advances in ecological research to unify understanding across disciplines and scales.
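Question (ii) has a standard Shannon reading: uncertainty is entropy, and information is its reduction. A minimal illustration (not taken from the paper):

```python
import numpy as np

def entropy(p):
    """Shannon entropy H = -sum p log2 p, in bits."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# A uniform distribution over 4 states is maximally uncertain (2 bits);
# a nearly deterministic one carries far less uncertainty. Observing the
# system 'provides information' exactly insofar as it reduces this number.
print(entropy([0.25, 0.25, 0.25, 0.25]))   # 2.0
print(entropy([0.97, 0.01, 0.01, 0.01]))   # ~0.24
```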
... This is, by definition, a mathematically intractable problem. It is equivalent to Chalmers' Hard Problem of consciousness [32]; Walker and Davies call it the Hard Problem of Life [33,34]. ...
Article
Full-text available
The known laws of nature in the physical sciences are well expressed in the language of mathematics, a fact that caused Eugene Wigner to wonder at the “unreasonable effectiveness” of mathematical concepts to explain physical phenomena. The biological sciences, in contrast, have resisted the formulation of precise mathematical laws that model the complexity of the living world. The limits of mathematics in biology are discussed as stemming from the impossibility of constructing a deterministic “Laplacian” model and the failure of set theory to capture the creative nature of evolutionary processes in the biosphere. Indeed, biology transcends the limits of computation. This leads to a necessity of finding new formalisms to describe biological reality, with or without strictly mathematical approaches. In the former case, mathematical expressions that do not demand numerical equivalence (equations) provide useful information without exact predictions. Examples of approximations without equal signs are given. The ineffectiveness of mathematics in biology is an invitation to expand the limits of science and to see that the creativity of nature transcends mathematical formalism.
... Several characteristics of early life have endured through time and are shared by all forms of life today, from the near universality of the genetic code to intermediary metabolism [32,51]. It is unlikely that these biological universalities represent only "frozen accidents" linked to a universal common ancestry; they may instead emerge as a consequence of fundamental principles associated with information and thermodynamics that affect the robustness and evolvability of biological systems [51,60]. In a similar vein, it is expected that early ecological systems faced important challenges to their persistence in fluctuating environments, where the number of co-existing entities increases, forming diverse communities, and that some fundamental principles may also be invoked to understand their persistence. ...
Preprint
Full-text available
We study the large-time behavior of an ensemble of entities obeying replicator-like stochastic dynamics with mean-field interactions as a model for a primordial ecology. We prove the propagation-of-chaos property and establish conditions for the strong persistence of the N-replicator system and the existence of invariant distributions for a class of associated McKean-Vlasov dynamics. In particular, our results show that, unlike typical models of neutral ecology, fitness equivalence does not need to be assumed but emerges as a condition for the persistence of the system. Further, neutrality is associated with a unique Dirichlet invariant probability measure. We illustrate our findings with some simple case studies, provide numerical results, and discuss our conclusions in the light of Neutral Theory in ecology.
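For a rough feel of how fitness equivalence relates to persistence, here is a crude Euler-Maruyama toy of noisy replicator dynamics; it is a sketch under arbitrary step-size, noise, and horizon assumptions, not the paper's McKean-Vlasov system.

```python
import numpy as np

def noisy_replicator(fitness, x0, dt=0.01, steps=5000, sigma=0.05, seed=1):
    """Euler-Maruyama integration of replicator dynamics with
    multiplicative noise, renormalized to the simplex each step:
    dx_i = x_i (f_i - fbar) dt + sigma x_i dW_i."""
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    for _ in range(steps):
        fbar = x @ fitness
        drift = x * (fitness - fbar)
        noise = sigma * x * rng.normal(0.0, np.sqrt(dt), size=x.size)
        x = np.clip(x + drift * dt + noise, 1e-12, None)
        x /= x.sum()
    return x

# Under fitness equivalence (neutrality) all three types persist over
# this horizon; a small fitness edge drives the others toward extinction.
print(noisy_replicator(np.array([1.0, 1.0, 1.0]), [1/3, 1/3, 1/3]))
print(noisy_replicator(np.array([1.2, 1.0, 1.0]), [1/3, 1/3, 1/3]))
```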
... The ball is in our court. Just how information was introduced early into the engines of metabolism is the 'hard problem of life' (Walker and Davies, 2017; Wong and Prabhu, 2023). From a crystallographic perspective, we might start with the size and shape, recalling Erwin Schrödinger's classification of the gene as an aperiodic crystal (Schrödinger 1944). ...
Article
Full-text available
The demonstration by Ivan Barnes et al. that the serpentinization of fresh Alpine-type ultramafic rocks results in the exhalation of hot alkaline fluids is foundational to the submarine alkaline vent theory (AVT) for life’s emergence to its ‘improbable’ thermodynamic state. In AVT, such alkaline fluids ≤ 150°C, bearing H2 > CH4 > HS⁻—generated and driven convectively by a serpentinizing exothermic mega-engine operating in the ultramafic crust—exhale into the iron-rich, CO2 ≫ NO3⁻-bearing Hadean ocean to result in hydrothermal precipitate mounds comprising macromolecular ferroferric-carbonate oxyhydroxide and minor sulfide. As the nanocrystalline minerals fougerite/green rust and mackinawite (FeS), they compose the spontaneously precipitated inorganic membranes that keep the highly contrasting solutions apart, thereby maintaining redox and pH disequilibria. They do so in the form of fine chimneys and chemical gardens. The same disequilibria drive the reduction of CO2 to HCOO⁻ or CO, and the oxidation of CH4 to a methyl group—the two products reacting to form acetate in a sequence antedating the ‘energy-producing’ acetyl coenzyme-A pathway. Fougerite is a 2D-layered mineral in which the hydrous interlayers themselves harbor 2D solutions, in effect constricted to ~ 1D by preferentially directed electron hopping/tunneling, and proton Grotthuss ‘bucket-brigading’ when subject to charge. As a redox-driven nanoengine or peristaltic pump, fougerite forces the ordered reduction of nitrate to ammonium, the amination of pyruvate and oxalate to alanine and glycine, and their condensation to short peptides. In turn, these peptides have the flexibility to sequester the founding inorganic iron oxyhydroxide, sulfide, and pyrophosphate clusters, to produce metal- and phosphate-dosed organic films and cells. As the feed to the hydrothermal mound fails, the only equivalent sustenance on offer to the first autotrophs is the still mildly serpentinizing upper crust beneath. While the conditions here are very much less bountiful, they do offer the similar feed and disequilibria the survivors are accustomed to. Sometime during this transition, a replicating non-ribosomal guidance system is discovered to provide the rules to take on the incrementally changing surroundings. The details of how these replicating apparatuses emerged are the hard problem, but by doing so the progenote archaea and bacteria could begin to colonize what would become the deep biosphere. Indeed, that the anaerobic nitrate-respiring methanotrophic archaea and the deep-branching Acetothermia presently comprise a portion of that microbiome occupying serpentinizing rocks offers circumstantial support for this notion. However, the inescapable, if jarring conclusion is drawn that, absent fougerite/green rust, there would be no structured channelway to life.
... Grounded in the empirical and theoretical work on cognition and its evolution in nature (Walker et al. 2017; Dodig-Crnkovic 2017a), from basal/basic/primitive/elementary/cellular to complex forms of human cognition (Dodig-Crnkovic 2014; Levin et al. 2021; Manicka and Levin 2019; Stewart 1996), with natural information processing (natural computation) as a basis, the info-computational approach can be used to identify several topics in the research of cognition that need more study. ...
Chapter
Full-text available
At the time when the first models of cognitive architectures were proposed, some forty years ago, the understanding of cognition, embodiment and evolution was substantially different from today's. So was the state of the art of information physics, information chemistry, bioinformatics, neuroinformatics, computational neuroscience, complexity theory, self-organization, theory of evolution, as well as the basic concepts of information and computation. Novel developments support a constructive interdisciplinary framework for cognitive architectures based on natural morphological computing, where interactions between constituents at different levels of organization of matter-energy, and their corresponding time-dependent dynamics, lead to complexification of agency and increased cognitive capacities of living organisms that unfold through evolution. The proposed info-computational framework for naturalizing cognition considers present updates (generalizations) of the concepts of information, computation, cognition, and evolution in order to attain alignment with the current state of the art in the corresponding research fields. Some important open questions are suggested for future research, with implications for the further development of cognitive and intelligent technologies.
... However, "cognitive operations we usually ascribe to brains-sensing, information processing, memory, valence, decision making, learning, anticipation, problem solving, generalization and goal directedness-are all observed in living forms that don't have brains or even neurons." Based on empirical and theoretical insights about cognition and its evolution and development in nature (Walker, Davies, and Ellis 2017) , from basal/ basic/ primitive/ elementary/ cellular to complex form of human cognition, (Manicka and Levin 2019;Levin et al. 2021;Dodig-Crnkovic 2020;Stewart 1996; Dodig-Crnkovic 2014a) modelled on natural information processing (natural computation), we identify several cognitive architecture topics that deserve more study. ...
Preprint
Full-text available
A recent comprehensive overview of 40 years of research in cognitive architectures (Kotseruba and Tsotsos 2020) evaluates modelling of the core cognitive abilities in humans, but only marginally addresses biologically plausible approaches based on natural computation. This mini review presents a set of perspectives and approaches which have shaped the development of biologically inspired computational models in the recent past and that can lead to the development of biologically more realistic cognitive architectures. For describing the continuum of natural cognitive architectures, from basal cellular to human-level cognition, we use an evolutionary info-computational framework, where natural/physical/morphological computation leads to the evolution of increasingly complex cognitive systems. Forty years ago, when the first cognitive architectures were proposed, the understanding of cognition, embodiment and evolution was different. So was the state of the art of information physics, bioinformatics, information chemistry, computational neuroscience, complexity theory, self-organization, the theory of evolution, and of information and computation. Novel developments support a constructive interdisciplinary framework for cognitive architectures in the context of computing nature, where interactions between constituents at different levels of organization lead to complexification of agency and increased cognitive capacities. We identify several important research questions for further investigation that can increase understanding of cognition in nature and inspire new developments of cognitive technologies. Recently, basal cell cognition has attracted a lot of interest for its possible applications in medicine, new computing technologies, and micro- and nanorobotics.
... Understanding the mechanisms of life as information processing is an early idea [25] which eventually led to the discovery of DNA, and is currently [26] used widely for astrobiology and SETI research. The idea of using the fixed-point, called the Y-combinator in lambda calculus λf.(λx.f ...
Preprint
Full-text available
In this article we present the motivation and the core thesis towards the implementation of a Quantum Knowledge Seeking Agent (QKSA). QKSA is a general reinforcement learning agent that can be used to model classical and quantum dynamics. It merges ideas from universal artificial general intelligence, constructor theory and genetic programming to build a robust and general framework for testing the capabilities of the agent in a variety of environments. It takes the artificial life (or animat) path to artificial general intelligence, where a population of intelligent agents is instantiated to explore valid ways of modelling the perceptions. The multiplicity and survivability of the agents are defined by their fitness, with respect to the explainability and predictability of a resource-bounded computational model of the environment. This general learning approach is then employed to model the physics of an environment based on subjective observer states of the agents. A specific case of quantum process tomography as a general modelling principle is presented. The various background ideas and a baseline formalism are discussed in this article, which sets the groundwork for the implementations of the QKSA that are currently in active development.
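The snippet accompanying this entry truncates the lambda term; the full fixed-point combinator is Y = λf.(λx.f (x x)) (λx.f (x x)). A small Python sketch of the idea, using the Z combinator (the call-by-value variant, since plain Y diverges under Python's eager evaluation):

```python
# Z combinator: the call-by-value variant of the Y fixed-point combinator
# Y = λf.(λx.f (x x)) (λx.f (x x)); plain Y loops forever in an eager
# language, so each self-application is hidden behind an extra lambda.
Z = lambda f: (lambda x: f(lambda v: x(x)(v)))(lambda x: f(lambda v: x(x)(v)))

# Anonymous recursion: factorial defined without naming itself.
fact = Z(lambda rec: lambda n: 1 if n == 0 else n * rec(n - 1))
print(fact(5))  # 120
```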
... In contrast, most biologists are focused on a detailed description of diverse bio-phenomena and think less about unified physical laws. Nevertheless, more biologists are thinking about the importance of information (Nurse, 2008), and more physicists are discussing the value of searching for new physical laws that can both explain biology better and advance physical theory (Prigogine and Stengers, 1984;Gell-Mann, 1994;Walker et al., 2017). ...
Article
The mechanism of biological information flow is of vital importance. However, traditional research surrounding the genetic code that follows the central dogma to a phenotype faces challenges, including missing heritability and two-phased evolution. Here, we propose the karyotype code, which by organizing genes along chromosomes at once preserves species genome information and provides a platform for other genetic and nongenetic information to develop and accumulate. This specific genome-level code, which exists in all living systems, is compared to the genetic code and other organic codes in the context of information management, leading to the concept of hierarchical biological codes and an ‘extended’ definition of adaptor, where the adaptors of a code can be not only molecular structures but also, more commonly, biological processes. Notably, different levels of a biosystem have their own mechanisms of information management, and gene-coded parts inheritance preserves “parts information” while karyotype-coded system inheritance preserves the “system information” which organizes parts information. The karyotype code prompts many questions regarding the flow of biological information, including the distinction between information creation, maintenance, modification, and usage, along with differences between living and non-living systems. How do biological systems exist, reproduce, and self-evolve for increased complexity and diversity? Inheritance is mediated by organic codes which function as informational tools to organize chemical reactions, create new information, and preserve frozen accidents, transforming historical miracles into biological routines.
... Closely related measures have been proposed by Ay and Polani (2008) and Janzing et al. (2013). Transfer entropy is another 'information as choice' measure, although correlational in character, not causal (Lizier and Prokopenko 2010), which has been applied to the study of the origins of life (Walker et al. 2017). ...
Article
Full-text available
A causal approach to biological information is outlined. There are two aspects to this approach: information as determining a choice between alternative objects, and information as determining the construction of a single object. The first aspect has been developed in earlier work to yield a quantitative measure of biological information that can be used to analyse biological networks. This paper explores the prospects for a measure based on the second aspect, and suggests some applications for such a measure. These two aspects are not suggested to exhaust all the facets of biological information.
... Alternatively, perhaps there is something distinctive about living systems that can be expressed in terms of information, but a different approach is needed to characterize this (Walker et al. 2017). Numerous biologists and philosophers have proposed definitions of information intended to vindicate the idea that biology is fundamentally a science of biological information. ...
Preprint
Full-text available
The idea that biological information is created by evolution, passed on in heredity, and expressed during development is an attractive gloss on what has been revealed by the last century of advances in biology. But on closer examination it is hard to see what scientific substance corresponds to this vision. Several biologists and philosophers of biology have suggested that 'biological information' is no more than a collection of loose metaphors. Others have offered their own theories of biological information, but most of these have been oddly unrelated to actual biological practice. Here we argue that the conception of information used by Francis Crick in his 'sequence hypothesis' and 'central dogma', a conception closely related to the older idea of 'biological specificity', is adequate to state a substantial, general theory of biological information. There are two aspects to this account, corresponding to a fundamental duality in information theory between Shannon and Kolmogorov measures.
Chapter
Full-text available
Despite the various criteria presented in the literature, most authors engaged in the debate about emergence agree on a fundamental distinction between strong/ontologically robust cases of emergence and weak/metaphysically innocent ones. The former typically involve entities that exhibit new causal capacities, while the latter are primarily associated with deductive unpredictability, conceptual novelty, and other qualities that highlight our epistemic limitations in understanding them. In this paper, I initially examine a paradigmatic example of weak emergence, namely the higher-level patterns generated by virtual cellular automata (CAs) as analyzed by Mark Bedau. Then, I demonstrate that the same mechanism can be observed in real biological systems, such as the dynamics governing the pigmentation ontogeny of the ocellated lizard (Timon lepidus). Unlike virtual CAs, however, real CAs produce patterns that seem to perform non-reducible functions. Therefore, I propose that despite the similarities between virtual and real CAs, the pigmentation pattern of the ocellated lizard should be regarded as a case of strong emergence. Moreover, I suggest that this analysis may shed light on the nature of biological emergent entities in general. Finally, the paper includes an addendum introducing an issue that, while not exhaustively addressed here, is highly relevant: how to metaphysically conceptualize the causal efficacy exhibited by the pigmentation patterns.
Article
Full-text available
This work presents some ambitious perspectives on how Systems Chemistry can contribute to developing the quite new research line of Chemical Artificial Intelligence (CAI). CAI refers to the efforts of devising liquid chemical systems mimicking some performances of biological and human intelligence, which ultimately emerge from wetware. The CAI systems implemented so far assist humans in making decisions. However, such CAI systems lack autonomy and cannot substitute humans. The development of autonomous chemical systems will allow the colonization of the molecular world with remarkable repercussions on human well‐being. As a beneficial side effect, this research line will help establish a deeper comprehension of the mesmerizing phenomenon of the origin of life on Earth and how cognitive capabilities emerge at a basic physico‐chemical level.
Article
Full-text available
This work highlights the relevant contribution of conformational stereoisomers to the complexity and functions of any molecular compound. Conformers have the same molecular and structural formulas but different orientations of the atoms in the three-dimensional space. Moving from one conformer to another is possible without breaking covalent bonds. The interconversion is usually feasible through the thermal energy available in ordinary conditions. The behavior of most biopolymers, such as enzymes, antibodies, RNA, and DNA, is understandable if we consider that each exists as an ensemble of conformers. Each conformational collection confers multi-functionality and adaptability to the single biopolymers. The conformational distribution of any biopolymer has the features of a fuzzy set. Hence, every compound that exists as an ensemble of conformers allows the molecular implementation of a fuzzy set. Since proteins, DNA, and RNA work as fuzzy sets, it is fair to say that life’s logic is fuzzy. The power of processing fuzzy logic makes living beings capable of swift decisions in environments dominated by uncertainty and vagueness. These performances can be implemented in chemical robots, which are confined molecular assemblies mimicking unicellular organisms: they are supposed to help humans “colonise” the molecular world to defeat diseases in living beings and fight pollution in the environment.
Article
Recent efforts in soft robotics and Artificial Life are attempting to construct homeostatically functioning machines with ‘feeling’ analogues. Such robots are designed to be ‘vulnerable’ and, thus, depart from traditional approaches to machine design and construction. In this paper, I explore a representative proposal by Antonio Damasio and Kingson Man, and ask how we can understand the deconstruction of ‘life’ in Derrida, Stiegler, Malabou and Wills to relate to such efforts. I argue that the adoption of biological and phenomenological principles in machine design calls for attention to the precise extent that it may not result in robots that adhere either to biological or to mechanical models. It is with this admission, of the essentially unknowable character of what may result from these efforts, that deconstruction can assist roboticists and synthetic biologists today.
Chapter
In complex systems that host evolutionary processes, in which entirely new entities may enter the scene, some variables can sometimes show a “hockey-stick” behavior, that is, a long period of slow growth followed by an “explosive” increase. The TAP equation was proposed with the aim of describing the growth of the number of different types of entities in systems where new entities (e.g., artifacts) can be created, supposing that they derive from transformations of pre-existing ones. It shows a very interesting divergence in finite time, different from the usual exponential growth where divergence takes place in the infinite time limit. The TAP equation does not deal with the growth of the number of actual types, but rather with the number of the possible ones (the members of the so-called set of Adjacent Possible), and it can therefore overestimate the actual rate of growth. In this paper, we introduce a model (called BPSM, focused on systems that may be relevant for the origin of life) that takes into account the difference between the Adjacent Possible and the set of types that are actually created. Using simulations, it has been observed that the growth of the number of chemical species in the system resembles that of the corresponding TAP equation. Since in this case only combinations of at most two entities can be considered at each time, the TAP equation can be analytically integrated. Its behavior can then be compared to the (necessarily finite) behavior of model simulations; their behaviors turn out to be quite similar, and proper tests are introduced, which show that they differ from the familiar exponential growth. Therefore, the BPSM model provides a description of the rapid increase of diversity which resembles TAP, while based upon the growth of the actual entities rather than on the Adjacent Possible. Keywords: Innovation, Binary polymer model, Origin of life, Adjacent possible, TAP equation
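For readers new to the TAP equation, a sketch of the pairs-only recursion M_{t+1} = M_t + alpha * M_t (M_t - 1) / 2 that the chapter integrates analytically; the values of alpha and M_0 below are illustrative assumptions, not the chapter's parameters.

```python
# Pairs-only TAP recursion: new types arise from combinations of two
# existing types, a fraction alpha of which are viable:
#   M_{t+1} = M_t + alpha * M_t * (M_t - 1) / 2
def tap_pairs(m0=10.0, alpha=1e-3, max_steps=300):
    history = [m0]
    m = m0
    for _ in range(max_steps):
        m = m + alpha * m * (m - 1) / 2.0
        history.append(m)
        if m > 1e12:               # proxy for the finite-time divergence
            break
    return history

hist = tap_pairs()
for t in range(0, len(hist), 25):   # long plateau, then the hockey stick
    print(t, f"{hist[t]:.1f}")
print(f"diverged after {len(hist) - 1} steps")
```

Unlike exponential growth, which diverges only as t goes to infinity, this recursion blows up at a finite step count, the "hockey-stick" behavior the chapter describes.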
Chapter
A key question concerning the origin of life is whether polymers, such as nucleic acids and proteins, can spontaneously form in prebiotic conditions. Several studies have shown that, by alternating (i) a phase in which a system is in a water-rich condition and (ii) one in which there is a relatively small amount of water, it is possible to achieve polymerization. It can be argued that such “wet-dry” cycles might have actually taken place on the primordial Earth, for example in volcanic lakes. In this paper, using a version of the binary polymer model without catalysis, we have simulated wet and dry cycles to determine the effectiveness of polymerization under these conditions. By observing the behavior of some key variables (e.g., the number of different chemical species which appeared at least once and the maximum length of the species currently present in the system), it is possible to see that the alternation of wet and dry conditions can indeed allow a wider exploration of different chemical species when compared to constant conditions. Keywords: Protocells, Origin of life, Gillespie algorithm, Semipermeable membrane
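A minimal sketch of how wet-dry cycling can be layered onto a Gillespie simulation of a non-catalytic binary polymer pool; the reaction set, rate constants, and phase lengths are illustrative assumptions rather than the chapter's model.

```python
import random

def gillespie_wet_dry(n_monomers=200, phase_time=40.0, n_phases=6, seed=0):
    """Alternate condensation-favoring ('dry') and hydrolysis-favoring
    ('wet') phases in a Gillespie simulation of binary polymers."""
    rng = random.Random(seed)
    pool = ["0"] * (n_monomers // 2) + ["1"] * (n_monomers // 2)
    t, seen = 0.0, set(pool)
    for phase in range(n_phases):
        dry = phase % 2 == 0
        k_cond, k_hyd = (1.0, 0.05) if dry else (0.05, 1.0)
        t_end = t + phase_time
        while t < t_end and len(pool) > 1:
            bonds = sum(len(s) - 1 for s in pool)
            a_cond = k_cond * len(pool) * (len(pool) - 1) / 2  # ligate any pair
            a_hyd = k_hyd * bonds                              # break any bond
            t += rng.expovariate(a_cond + a_hyd)
            if rng.random() < a_cond / (a_cond + a_hyd):       # condensation
                i, j = rng.sample(range(len(pool)), 2)
                new = pool[i] + pool[j]
                for idx in sorted((i, j), reverse=True):
                    pool.pop(idx)
                pool.append(new)
            else:                                              # hydrolysis
                r = rng.randrange(bonds)                       # pick a bond
                for idx, s in enumerate(pool):
                    if r < len(s) - 1:
                        pool[idx], rest = s[:r + 1], s[r + 1:]
                        pool.append(rest)
                        break
                    r -= len(s) - 1
            seen.update(pool)
        print(f"{'dry' if dry else 'wet'} phase {phase}: "
              f"max length {max(len(s) for s in pool)}, species seen {len(seen)}")

gillespie_wet_dry()
```

In runs of this toy, maximum polymer length climbs during dry phases, and the cumulative count of distinct species keeps growing across cycles, which is the qualitative effect the chapter reports.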
Chapter
Complexity cannot be dealt with without a proper organization of information. Information reduces entropy while maintaining the state of complexity. This organization of information requires a good understanding of what information is, that is, a theory of information. The mathematical definition of information, based on Shannon’s mathematical theory of communication, is too simple to deal with the various types of complexity. This chapter presents a more differentiated model of information elaborating on the concept of control as defined in cybernetics. Various types of information are introduced, such as discursive information, goal-information, axiological information, material information, eidetic information, allelopathic information, effect information, pragmatic information, and reproductive information. With that, a model of types of information is presented which undergirds the concept of the information-based organization. This model is also a meta-model through which well-known but somewhat intuitive management concepts can now be logically interlinked, replacing the Weberian bureaucratic hierarchy with a hierarchy of types of information, with goal information at the top and reproductive information at the bottom. This also lays the foundations for a model of governance and administrative information space beyond conventional management information. Also, the relation between data and information becomes clear in operational, programmable ways, as relevant for AI and machine learning. This differentiated model of governance and administrative information explains why in the contemporary economy the organization of information precedes the structure, in the Weberian sense, of an organization design.
Article
Full-text available
The space of possible human cultures is vast, but some cultural configurations are more consistent with cognitive and social constraints than others. This leads to a “landscape” of possibilities that our species has explored over millennia of cultural evolution. However, what does this fitness landscape, which constrains and guides cultural evolution, look like? The machine-learning algorithms that can answer these questions are typically developed for large-scale datasets. Applications to the sparse, inconsistent, and incomplete data found in the historical record have received less attention, and standard recommendations can lead to bias against marginalized, under-studied, or minority cultures. We show how to adapt the minimum probability flow algorithm and the Inverse Ising model, a physics-inspired workhorse of machine learning, to the challenge. A series of natural extensions—including dynamical estimation of missing data, and cross-validation with regularization—enables reliable reconstruction of the underlying constraints. We demonstrate our methods on a curated subset of the Database of Religious History: records from 407 religious groups throughout human history, ranging from the Bronze Age to the present day. This reveals a complex, rugged landscape, with both sharp, well-defined peaks where state-endorsed religions tend to concentrate, and diffuse cultural floodplains where evangelical religions, non-state spiritual practices, and mystery religions can be found.
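The machinery named here is standard even where the application is novel; below is a minimal sketch of a pairwise Ising model fitted by minimum probability flow with single-spin-flip neighborhoods. The toy data, learning rate, and L2 regularization are assumptions for illustration, and the paper's extensions (dynamical estimation of missing data, cross-validated regularization) are not shown.

```python
import numpy as np

def mpf_fit(S, lr=0.05, epochs=300, lam=0.01):
    """Fit a pairwise Ising model (spins +/-1) to data S (samples x n)
    by minimizing the MPF objective with single-spin-flip neighborhoods:
    K = mean over samples of sum_i exp(-s_i * (h_i + (J s)_i))."""
    n = S.shape[1]
    h, J = np.zeros(n), np.zeros((n, n))
    for _ in range(epochs):
        F = S * (h + S @ J)            # F[m, i] = s_i * local field on i
        G = np.exp(-F)                 # one MPF term per candidate flip
        dh = -(S * G).mean(axis=0) + lam * h
        dJ = -((S * G).T @ S + S.T @ (S * G)) / len(S) + lam * J
        np.fill_diagonal(dJ, 0.0)
        h, J = h - lr * dh, J - lr * dJ
        J = (J + J.T) / 2              # keep couplings symmetric
    return h, J

# Toy data: spins 0 and 1 strongly anticorrelated, spin 2 independent.
rng = np.random.default_rng(0)
s0 = rng.choice([-1, 1], size=(2000, 1))
s1 = -s0 * rng.choice([1, -1], p=[0.9, 0.1], size=(2000, 1))
s2 = rng.choice([-1, 1], size=(2000, 1))
h, J = mpf_fit(np.hstack([s0, s1, s2]).astype(float))
print(np.round(J, 2))  # J[0,1] clearly negative; couplings to spin 2 near zero
```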
Chapter
In Chapter 2, we look at the purpose of life from the scientific viewpoint. The approach involves studying how this concept can be demarcated for scientific and pseudo-scientific purposes. Many kinds of appraisals can be made, such as the falsification and paradigmatic approaches to research programming and empirical validation. We are concerned with whether knowledge conforms to the object, or the object conforms to knowledge, as in Kant's Copernican turn. Research is not afraid to look at the hard sciences of physics, mathematics, logic, syntax and semantics, and psychology, as well as at religion and spirituality, for answers and purpose. These methods of the varied disciplines help us to see how the materialistic side of economics has been degenerating, and suggest ways to arrest and turn those trends around. By looking at things from a higher dimension, alternative solutions are possible. Keywords: Falsification, Paradigm, Research Program, Quantum, Behaviorism
Article
Full-text available
New findings: What is the topic of this review? To revisit the 2013 article "Physiology is rocking the foundations of evolutionary biology." What advances does it highlight? The discovery that the genome is not isolated from the soma and the environment, and that there is no barrier preventing somatic characteristics being transmitted to the germline, means that Darwin's pangenetic ideas become relevant again. Abstract: Charles Darwin spent the last decade of his life collaborating with physiologists in search of the biological processes of evolution. He viewed physiology as the way forward in answering fundamental questions about inheritance, acquired characteristics, and the mechanisms by which organisms could achieve their ends and survival. He collaborated with 19th century physiologists, notably John Burdon-Sanderson and George Romanes, in his search for the mechanisms of trans-generational inheritance. The discovery that the genome is not isolated from the soma and the environment, and that there is no barrier preventing somatic characteristics being transmitted to the germline, means that Darwin's pangenetic ideas become relevant again. It is time for 21st century physiology to come to the rescue of evolutionary biology. The article outlines research lines by which this could be achieved.
Article
Full-text available
The approach the majority of neuroscientists take to the question of how consciousness is generated, it is probably fair to say, is to ignore it. Although there are active research programs looking at correlates of consciousness, and explorations of informational properties of what might be relevant neural ensembles, the tacitly implied mechanism of consciousness in these approaches is that it somehow just happens. This reliance on a “magical emergence” of consciousness does not address the “objectively unreasonable” proposition that elements that have no attributes or properties that can be said to relate to consciousness somehow aggregate to produce it. Neuroscience has furnished evidence that neurons are fundamental to consciousness; at the fine and gross scale, aspects of our conscious experience depend on specific patterns of neural activity – in some way, the connectivity of neurons computes the features of our experience. So how do we get from knowing that some specific configurations of cells produce consciousness to understanding why this would be the case? Behind the voltages and currents electrophysiologists measure is a staggeringly complex system of electromagnetic fields – these are the fundamental physics of neurons and glia in the brain. The brain is entirely made of electromagnetism (EM) phenomena from the level of the atoms up. The EM field literally manifests the computations, or signaling, or information processing/activities performed by connected cellular ensembles that generate a 1st-person perspective. An investigation into the EM field at the cellular scale provides the possibility of identifying the outward signs of a mechanism in fundamental terms (physics), as opposed to merely describing the correlates of our mental abstractions of it.
Article
Full-text available
Employing concepts from physics, chemistry and bioengineering, 'learning-by-building' approaches are becoming increasingly popular in the life sciences, especially with researchers who are attempting to engineer cellular life from scratch. The SynCell2020/21 conference brought together researchers from different disciplines to highlight progress in this field, including areas where synthetic cells are having socioeconomic and technological impact. Conference participants also identified the challenges involved in designing, manipulating and creating synthetic cells with hierarchical organization and function. A key conclusion is the need to build an international and interdisciplinary research community through enhanced communication, resource-sharing, and educational initiatives.
Book
Debates concerning the units and levels of selection have persisted for over fifty years. One major question in this literature is whether units and levels of selection are genuine, in the sense that they are objective features of the world, or merely reflect the interests and goals of an observer. Scientists and philosophers have proposed a range of answers to this question. This Element introduces this literature and proposes a novel contribution. It defends a realist stance and offers a way of delineating genuine levels of selection by invoking the notion of a functional unit.
Article
Full-text available
Cognition (sensing and responding to the environment) is the unifying principle behind the genetic code, origin of life, evolution, consciousness, artificial intelligence, and cancer. However, the conventional model of biology seems to mistake cause and effect. According to the reductionist view, the causal chain in biology is chemicals → code → cognition. Despite this prevailing view, there are no examples in the literature to show that the laws of physics and chemistry can produce codes, or that codes produce cognition. Chemicals are just the physical layer of any information system. In contrast, although examples of cognition generating codes and codes controlling chemicals are ubiquitous in biology and technology, cognition remains a mystery. Thus, the central question in biology is: What is the nature and origin of cognition? In order to elucidate this pivotal question, we must cultivate a deeper understanding of information flows. Through this lens, we see that biological cognition is volitional (i.e., deliberate, intentional, or knowing), and while technology is constrained by deductive logic, living things make choices and generate novel information using inductive logic. Information has been called "the hard problem of life" and cannot be fully explained by known physical principles (Walker et al., 2017). The present paper uses information theory (the mathematical foundation of our digital age) and Turing machines (computers) to highlight inaccuracies in prevailing reductionist models of biology, and proposes that the correct causation sequence is cognition → code → chemicals.
Article
Full-text available
The goals and targets included in the 2030 Agenda compiled by the United Nations are intended to stimulate action in areas of critical importance for humanity and the Earth. These goals and targets concern everyone on Earth, from the health as well as the economic and social perspectives. Reaching these goals means dealing with Complex Systems. Therefore, Complexity Science is undoubtedly valuable. However, it needs to extend its scope and focus on some specific objectives. This article proposes a development of Complexity Science that will bring benefits for achieving the United Nations’ aims. It presents a list of the features shared by all the Complex Systems involved in the 2030 Agenda. It shows the reasons why there are certain limitations in the prediction of Complex Systems’ behaviors. It highlights that such limitations raise ethical issues whenever new technologies interfere with the dynamics of Complex Systems, such as human beings and the environment. Finally, new methodological approaches and promising research lines to face the Complexity Challenges included in the 2030 Agenda are put forward.
Article
Full-text available
Whether or not viruses are alive remains unsettled. Discoveries of giant viruses with translational genes and large genomes have kept the debate active. Here, a fresh approach is introduced, based on the organisational definition of life from within systems biology. It views living as a circular process of self-organisation and self-construction which is ‘closed to efficient causation’. How information combines with force to fabricate and organise environmentally obtained materials, given an energy source, is here explained as a physical embodiment of informational constraint. Comparing a general virus replication cycle with Rosen’s (M,R)-system shows it to be linear, rather than closed. Some viruses contribute considerable organisational information, but so far none is known to supply all required, nor the material nor energy necessary to complete their replication cycle. As a result, no known virus replication cycle is closed to efficient causation: unlike cellular obligate parasites, viruses do not match the causal structure of an (M,R)-system. Analysis based on identifying a Markov blanket in the causal structure proved inconclusive, but using Integrated Information Theory on a Boolean representation, it was possible to show that the causal structure of a virocell is not different from that of the host cell.
Article
Full-text available
A collection of essays on the foundations of biology and its connection to other sciences. Its lengthy and profound foreword by Stuart Kauffman, a major figure in the quantitative analysis of biological regulation at the system level, summarizes the intended main point: “we live not only in a world of webs of cause and effect, but webs of opportunities that enable, but do not cause, often in unforeseeable ways, the possibilities of becoming of the biosphere, let alone human life. But most importantly, I seek in this new worldview a re-enchantment of humanity”. (p. 1). To some extent, it’s meant as a reaction to what some perceive as demoralizing aspects of the mechanistic paradigm that is driven by recent advances in the molecular bio-sciences: “I believe we are partially lost in modernity, seeking half-articulated, a pathway forward. Re-enchantment may be an essential part of this transformation.” (p. 1). I do not agree that the current situation is as bleak as many critics of the scientific mainstream suggest, but this book should appeal to anyone interested in the larger questions of biology no matter where they stand on this issue. All in all, it is an extremely enjoyable and valuable tour of important concepts and controversies.
Article
Full-text available
JSE 34:1 Spring 2020 whole issue PDF
Article
This article reviews William Dembski's recent monograph entitled Being as Communion: A Metaphysics of Information, in which he establishes an entire information-centric metaphysics. This viewpoint is compared with the perspective of al-Ghazālī, a Muslim philosophical theologian of the medieval period. It is concluded that what Dembski defines as information, which for him is the ontological basis of the natural world, seems remarkably close to al-Ghazālī’s notion of God's will and omnipotence. This article is an explorative comparison of their metaphysical frameworks, which are discussed in light of modern scientific developments, highlighting their differences and similarities.
Article
Full-text available
The growth of a population of protocells requires that the two key processes of replication of the protogenetic material and reproduction of the whole protocell take place at the same rate. While in many ODE-based models such synchronization spontaneously develops, this does not happen in the important case of quadratic growth terms. Here we show that spontaneous synchronization can be recovered (i) by requiring that the transmembrane diffusion of precursors takes place at a finite rate, or (ii) by introducing a finite lifetime of the molecular complexes. We then consider reaction networks that grow by the addition of newly synthesized chemicals in a binary polymer model, and analyze their behaviors in growing and dividing protocells, thereby confirming the importance of (i) and (ii) for synchronization. We describe some interesting phenomena (like long-term oscillations of duplication times) and show that the presence of food-generated autocatalytic cycles is not sufficient to guarantee synchronization: in the case of cycles with a complex structure, it is often observed that only some subcycles survive and synchronize, while others die out. This shows the importance of truly dynamic models that can uncover effects that cannot be detected by static graph theoretical analyses.
Chapter
Full-text available
This paper offers a general systems definition of the phrase “evolutionary development” and an introduction to its application to autopoietic (self-reproducing) complex systems, including the universe as a system. Evolutionary development, evo devo or ED, is a term that can be used by philosophers, scientists, historians, and others as a replacement for the more general term “evolution,” whenever a scholar thinks experimental, selectionist, stochastic, and contingent or “evolutionary” processes, and also convergent, statistically deterministic (probabilistically predictable), or “developmental” processes, including replication, may be simultaneously contributing to selection and adaptation in any complex system, including the universe as a system. Like living systems, our universe broadly exhibits both contingent and deterministic components, in all historical epochs and at all levels of scale. It has a definite birth and it is inevitably senescing toward heat death. The idea that we live in an “evo devo universe,” one that has self-organized over past replications both to generate multilocal evolutionary variation (experimental diversity) and to convergently develop and pass to future generations selected aspects of its accumulated complexity (“intelligence”), is an obvious hypothesis. Yet today, few cosmologists or physicists, even among theorists of universal replication and the multiverse, have entertained the hypothesis that our universe may be both evolving and developing (engaging in both unpredictable experimentation and goal-driven, teleological, directional change and a replicative life cycle), as in living systems. Our models of universal replication, like Lee Smolin’s cosmological natural selection (CNS), do not yet use the concept of universal development, or refer to development literature. I will argue that some variety of evo devo universe models must emerge in coming years, including models of CNS with Intelligence (CNSI), which explore the ways emergent intelligence can be expected to constrain and direct “natural” selection, as it does in living systems. Evo devo models are one of several early approaches to an Extended Evolutionary Synthesis (EES), one that explores adaptation in both living and nonliving replicators. They have much to offer as a general approach to adaptive complexity, and may be required to understand several important phenomena under current research, including galaxy formation, the origin of life, the fine-tuned universe hypothesis, possible Earthlike and life fecundity in astrobiology, convergent evolution, the future of artificial intelligence, and our own apparent history of unreasonably smooth and resilient acceleration of both total and “leading edge” adapted complexity and intelligence growth, even under frequent and occasionally extreme past catastrophic selection events. If they are to become better validated in living systems and in nonliving adaptive replicators, including stars, prebiotic chemistry, and the universe as a system, they will require both better simulation capacity and advances in a variety of theories, which I shall briefly review.
Article
The difficulty of obtaining appreciable quantities of biologically important molecules in thermodynamic equilibrium has long been identified as an obstacle to life's emergence, and determining the specific nonequilibrium conditions that might have given rise to life is challenging. To address these issues, we investigate how the concentrations of life's building blocks change as a function of the distance from equilibrium on average, in two example settings: (i) the synthesis of heavy amino acids and (ii) their polymerization into peptides. We find that relative concentrations of the heaviest amino acids can be boosted by four orders of magnitude, and concentrations of the longest peptide chains can be increased by hundreds of orders of magnitude. The average nonequilibrium distribution does not depend on the details of how the system was driven from equilibrium, indicating that environments might not have to be fine-tuned to support life.
Article
Full-text available
Biology differs fundamentally from the physics that underlies it. This paper proposes that the essential difference is that while physics at its fundamental level is Hamiltonian, in biology, once life has come into existence, causation of a contextual branching nature occurs at every level of the hierarchy of emergence at each time. The key feature allowing this to happen is the way biomolecules such as voltage-gated ion channels can act to enable branching logic to arise from the underlying physics, despite that physics per se being of a deterministic nature. Much randomness occurs at the molecular level, which enables higher level functions to select lower level outcomes according to higher level needs. Intelligent causation occurs when organisms engage in deduction, enabling prediction and planning. This is possible because ion channels enable action potentials to propagate in axons. The further key feature is that such branching biological behavior acts down to cause the underlying physical interactions to also exhibit a contextual branching behavior.
Article
Full-text available
Two broad features are jointly necessary for autonomous agency: organisational closure and the embodiment of an objective-function providing a ‘goal’; so far only organisms demonstrate both. Organisational closure has been studied (mostly in abstract), especially as cell autopoiesis and the cybernetic principles of autonomy, but the role of an internalised ‘goal’ and how it is instantiated by cell signalling and the functioning of nervous systems has received less attention. Here I add some biological ‘flesh’ to the cybernetic theory and trace the evolutionary development of step-changes in autonomy: (1) homeostasis of organisationally closed systems; (2) perception-action systems; (3) action selection systems; (4) cognitive systems; (5) memory supporting a self-model able to anticipate and evaluate actions and consequences. Each stage is characterised by the number of nested goal-directed control-loops embodied by the organism, summarised as will-nestedness N. Organism tegument, receptor/transducer system, mechanisms of cellular and whole-organism re-programming and organisational integration, all contribute to causal independence. Conclusion: organisms are cybernetic phenomena whose identity is created by the information structure of the highest level of causal closure (maximum N), which has increased through evolution, leading to increased causal independence, which might be quantifiable by ‘Integrated Information Theory’ measures.
Article
Full-text available
In this paper, I posit that from a research point of view, Data Science is a language. More precisely, Data Science is doing science using computer science as a language for datafied sciences; much as mathematics is the language of, e.g., physics. From this viewpoint, three (classes of) challenges for computer science are identified, complementing the challenges the closely related Big Data problem already poses to computer science. I discuss the challenges with references to, in my opinion, related, interesting directions in computer science research; note that I claim neither that these directions are the most appropriate to solve the challenges nor that the cited references represent the best work in their field; they are simply inspirational to me. So, what are these challenges? Firstly, if computer science is to be a language, what should that language look like? While our traditional specifications such as pseudocode are an excellent way to convey what has been done, they fail for more mathematics-like reasoning about computations. Secondly, if computer science is to function as a foundation of other, datafied, sciences, its own foundations should be in order. While we have excellent foundations for supervised learning—e.g., by having loss functions to optimize and, more generally, by PAC learning (Valiant in Commun ACM 27(11):1134–1142, 1984)—this is far less true for unsupervised learning. Kolmogorov complexity—or, more generally, Algorithmic Information Theory—provides a solid base (Li and Vitányi in An introduction to Kolmogorov complexity and its applications, Springer, Berlin, 1993). It provides an objective criterion to choose between competing hypotheses, but it lacks, e.g., an objective measure of the uncertainty of a discovery that datafied sciences need. Thirdly, datafied sciences come with new conceptual challenges. Data-driven scientists come up with data analysis questions that sometimes do and sometimes don’t fit our conceptual toolkit. Clearly, computer science does not suffer from a lack of interesting, deep, research problems. However, the challenges posed by data science point to a large reservoir of untapped problems. Interesting, stimulating problems, not in the least because they are posed by our colleagues in datafied sciences. It is an exciting time to be a computer scientist.
Article
The organic code concept and its operationalization by molecular codes have been introduced to study the semiotic nature of living systems. This contribution develops further the idea that the semantic capacity of a physical medium can be measured by assessing its ability to implement a code as a contingent mapping. For demonstration and evaluation, the approach is applied to a formal medium: elementary cellular automata (ECA). The semantic capacity is measured by counting the number of ways codes can be implemented. Additionally, a link to information theory is established by using multivariate mutual information to quantify contingency. It is shown how ECAs differ in their semantic capacities, how this is related to various ECA classifications, and how this depends on how a meaning is defined. Interestingly, if the meaning should persist for a certain time, the highest semantic capacity is found in CAs with apparently simple behavior, i.e., the fixed-point and two-cycle class. Synergy as a predictor of a CA's ability to implement codes can only be used if contexts implementing codes are common. For large context spaces with sparse coding contexts, synergy is a weak predictor. Concluding, the approach presented here can distinguish CA-like systems with respect to their ability to implement contingent mappings. Applying this to physical systems appears straightforward and might lead to a novel physical property indicating how suitable a physical medium is for implementing a semiotic system.
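For readers unfamiliar with the substrate, a minimal elementary-CA update (Wolfram rule numbering); the semantic-capacity measurement itself, counting implementable contingent mappings, is not reproduced here.

```python
import numpy as np

def eca_step(state, rule):
    """One synchronous update of an elementary CA with periodic
    boundaries; `rule` is the Wolfram rule number (0..255)."""
    table = np.array([(rule >> i) & 1 for i in range(8)], dtype=np.uint8)
    idx = 4 * np.roll(state, 1) + 2 * state + np.roll(state, -1)
    return table[idx]

# Rule 110 (complex behavior) vs. rule 8 (collapses to a fixed point):
# the paper asks which such behavioral classes can host contingent codes.
rng = np.random.default_rng(0)
init = rng.integers(0, 2, 64, dtype=np.uint8)
for rule in (110, 8):
    s = init.copy()
    for _ in range(32):
        s = eca_step(s, rule)
    print(f"rule {rule}: {s.sum()} live cells after 32 steps")
```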
Article
Full-text available
Biological organisms must perform computation as they grow, reproduce and evolve. Moreover, ever since Landauer’s bound was proposed, it has been known that all computation has some thermodynamic cost—and that the same computation can be achieved with greater or smaller thermodynamic cost depending on how it is implemented. Accordingly, an important issue concerning the evolution of life is assessing the thermodynamic efficiency of the computations performed by organisms. This issue is interesting both from the perspective of how close life has come to maximally efficient computation (presumably under the pressure of natural selection), and from the practical perspective of what efficiencies we might hope engineered biological computers to achieve, especially in comparison with current computational systems. Here we show that the computational efficiency of translation, defined as free energy expended per amino acid operation, outperforms the best supercomputers by several orders of magnitude, and is only about an order of magnitude worse than the Landauer bound. However, this efficiency depends strongly on the size and architecture of the cell in question. In particular, we show that the useful efficiency of an amino acid operation, defined as the bulk energy per amino acid polymerization, decreases for increasing bacterial size and converges to the polymerization cost of the ribosome. This cost of the largest bacteria does not change in cells as we progress through the major evolutionary shifts to both single- and multicellular eukaryotes. However, the rates of total computation per unit mass are non-monotonic in bacteria with increasing cell size, and also change across different biological architectures, including the shift from unicellular to multicellular eukaryotes. This article is part of the themed issue ‘Reconceptualizing the origins of life’.
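The comparison invites a back-of-envelope check. The figures below (roughly 4 ATP-equivalents per peptide bond, roughly 20 kT per ATP hydrolysis) are common order-of-magnitude assumptions, not the paper's careful accounting:

```python
import math

kT = 1.380649e-23 * 310            # J at ~310 K (physiological)
landauer = kT * math.log(2)        # minimum cost to erase one bit
per_amino_acid = 4 * 20 * kT       # ~4 ATP/bond, ~20 kT/ATP (assumed)
bits_per_residue = math.log2(20)   # selecting 1 of 20 amino acids

print(f"Landauer bound: {landauer:.2e} J/bit")
print(f"Translation:    {per_amino_acid:.2e} J per amino acid")
print(f"Ratio per op:   ~{per_amino_acid / landauer:.0f}x the bound")
print(f"Ratio per bit:  ~{per_amino_acid / bits_per_residue / landauer:.0f}x")
```

On these rough numbers, translation sits within one to two orders of magnitude of the Landauer bound, consistent with the abstract's claim once the several bits specified per amino acid choice are counted.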
Article
Full-text available
Life is so remarkable, and so unlike any other physical system, that it is tempting to attribute special factors to it. Physics is founded on the assumption that universal laws and principles underlie all natural phenomena, but it is far from clear that there are 'laws of life' with serious descriptive or predictive power analogous to the laws of physics. Nor is there (yet) a 'theoretical biology' in the same sense as theoretical physics. Part of the obstacle in developing a universal theory of biological organization concerns the daunting complexity of living organisms. However, many attempts have been made to glimpse simplicity lurking within this complexity, and to capture this simplicity mathematically. In this paper we review a promising new line of inquiry to bring coherence and order to the realm of biology by focusing on 'information' as a unifying concept.