Article

Evolvable Self-Replicating Molecules in an Artificial Chemistry


Abstract

This paper gives details of Squirm3, a new artificial environment based on a simple physics and chemistry that supports self-replicating molecules somewhat similar to DNA. The self-replicators emerge spontaneously from a random soup given the right conditions. Interactions between the replicators can result in mutated versions that can outperform their parents. We show how artificial chemistries such as this one can be implemented as a cellular automaton. We concur with Dittrich, Ziegler, and Banzhaf that artificial chemistries are a good medium in which to study early evolution.
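The cellular-automaton implementation mentioned in the abstract can be pictured as a grid in which each cell holds at most one atom, identified by a type letter and an integer state. The following sketch is illustrative only; the class layout, atom alphabet, and initialization are our assumptions, not Hutton's actual code:

```python
import random

# Illustrative sketch of a Squirm3-style grid: each cell holds at most
# one atom (type letter + integer state). Not Hutton's exact data layout.
class Atom:
    def __init__(self, kind, state):
        self.kind = kind      # one of 'a'..'f'
        self.state = state    # integer state, changed by reactions
        self.bonds = []       # references to bonded neighbouring atoms

    def __repr__(self):
        return f"{self.kind}{self.state}"

def random_soup(width, height, n_atoms, rng=random):
    """Scatter unbonded atoms in state 0 over an otherwise empty grid."""
    grid = [[None] * width for _ in range(height)]
    cells = [(x, y) for y in range(height) for x in range(width)]
    for x, y in rng.sample(cells, n_atoms):
        grid[y][x] = Atom(rng.choice("abcdef"), 0)
    return grid

grid = random_soup(20, 20, 50)
print(sum(1 for row in grid for cell in row if cell is not None))
```

A "random soup" of this kind is the starting condition from which the paper reports that self-replicators emerge.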


... Some of the research involves actual mechanical devices and some is based on organic chemistry, but we restrict our discussion here to computer simulations of self-replication. We briefly review von Neumann's universal constructor [21], self-replicating loops [6], [10], [11], [12], [13], [18], [19], and artificial chemistry [4], [5]. ...
... Hutton introduced self-replication in an artificial chemistry simulation, using a template-based approach [4]. A chain of molecules forms a template against which other molecules bond, similar in concept to JohnnyVon 1.0 [17]. ...
... Hutton's first approach was a cellular automaton model [4], but the discrete space constrained the mobility of the simulated molecules, hence Hutton's second approach used a continuous space [5], like JohnnyVon 1.0 [17]. In Hutton's second model, molecules move in a continuous two-dimensional space, following linear trajectories until an obstacle (e.g., the container wall or another molecule) is encountered (i.e., the motion is a billiard ball model). ...
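The billiard-ball motion described above amounts to a linear position update with reflection at the container walls. A minimal sketch follows; the `step` function and the square container are our simplification, and real collision handling between molecules is more involved:

```python
def step(pos, vel, size):
    """Advance one particle along a linear trajectory, reflecting off
    the walls of a square container of side `size`. Illustrative only;
    molecule-molecule collisions are not handled."""
    x, y = pos[0] + vel[0], pos[1] + vel[1]
    vx, vy = vel
    if not 0.0 <= x <= size:        # bounce off left/right wall
        x = min(max(x, 0.0), size)  # clamp back inside the container
        vx = -vx
    if not 0.0 <= y <= size:        # bounce off top/bottom wall
        y = min(max(y, 0.0), size)
        vy = -vy
    return (x, y), (vx, vy)

pos, vel = (9.5, 5.0), (1.0, 0.0)
pos, vel = step(pos, vel, 10.0)   # hits the right wall and reflects
print(pos, vel)
```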
Preprint
It has been argued that a central objective of nanotechnology is to make products inexpensively, and that self-replication is an effective approach to very low-cost manufacturing. The research presented here is intended to be a step towards this vision. We describe a computational simulation of nanoscale machines floating in a virtual liquid. The machines can bond together to form strands (chains) that self-replicate and self-assemble into user-specified meshes. There are four types of machines and the sequence of machine types in a strand determines the shape of the mesh they will build. A strand may be in an unfolded state, in which the bonds are straight, or in a folded state, in which the bond angles depend on the types of machines. By choosing the sequence of machine types in a strand, the user can specify a variety of polygonal shapes. A simulation typically begins with an initial unfolded seed strand in a soup of unbonded machines. The seed strand replicates by bonding with free machines in the soup. The child strands fold into the encoded polygonal shape, and then the polygons drift together and bond to form a mesh. We demonstrate that a variety of polygonal meshes can be manufactured in the simulation, by simply changing the sequence of machine types in the seed.
... A significant difference is that the automata in JohnnyVon are mobile. There is some related work on mobile automata that move in a two-dimensional space, including Tim Hutton's Squirm3 [6], which has many similarities to JohnnyVon. The work of Lionel and Roger Penrose, who created self-replicating machines using pieces of plywood, is also relevant [11]. ...
... Turing machines move from cell to cell in an N-dimensional grid space, changing the states of the cells and possibly interacting with each other. However, as far as we know, the only investigation of self-replicating mobile automata, other than JohnnyVon, is Hutton's Squirm3 [6]. ...
... For example, in Hutton's system, atoms move around in a discrete space and they can form self-replicating chains, but he found it necessary to force an atom to stop moving in certain situations [6]. An atom cannot move if it is bonded to another atom and moving would take it out of the neighborhood of the atom to which it is bonded. ...
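The mobility constraint described above can be expressed as a simple validity test on a proposed grid move. This sketch assumes a Moore (8-cell) neighbourhood, which is an assumption on our part:

```python
def neighbours(cell):
    """Moore neighbourhood of a grid cell: the 8 surrounding cells.
    (Assumed here; the actual neighbourhood definition may differ.)"""
    x, y = cell
    return {(x + dx, y + dy)
            for dx in (-1, 0, 1) for dy in (-1, 0, 1)
            if (dx, dy) != (0, 0)}

def move_allowed(atom_cell, target_cell, bonded_cells):
    """An atom may move only if every atom it is bonded to remains
    within the neighbourhood of its new position."""
    return all(b in neighbours(target_cell) for b in bonded_cells)

# An atom at (5, 5) bonded to an atom at (6, 5):
print(move_allowed((5, 5), (4, 5), [(6, 5)]))  # bond would be stretched
print(move_allowed((5, 5), (5, 6), [(6, 5)]))  # neighbour is kept
```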
Preprint
JohnnyVon is an implementation of self-replicating machines in continuous two-dimensional space. Two types of particles drift about in a virtual liquid. The particles are automata with discrete internal states but continuous external relationships. Their internal states are governed by finite state machines but their external relationships are governed by a simulated physics that includes Brownian motion, viscosity, and spring-like attractive and repulsive forces. The particles can be assembled into patterns that can encode arbitrary strings of bits. We demonstrate that, if an arbitrary "seed" pattern is put in a "soup" of separate individual particles, the pattern will replicate by assembling the individual particles into copies of itself. We also show that, given sufficient time, a soup of separate individual particles will eventually spontaneously form self-replicating patterns. We discuss the implications of JohnnyVon for research in nanotechnology, theoretical biology, and artificial life.
... The driving hypothesis is that complex organizations emerge thanks to self-organising attractors in chemical networks, which preserve their structure in time (Walker and Ashby, 1966; Wuensche et al., 1992; Kauffman, 1993). While some Artificial Chemistries seek to mimic as closely as possible the properties of the chemistry that gave rise to life on Earth (Flamm et al., 2010; Högerl, 2010; Young and Neshatian, 2013), others abstract away from the particularities of natural chemistry to focus only on their hypothesized core computational properties (Fontana and Buss, 1994; di Fenizio and Banzhaf, 2000; Hutton, 2002; Tominaga et al., 2007; Buliga and Kauffman, 2014; Sayama, 2018). ...
... Interestingly, a few Artificial Chemistries have shown promising results by simulating the emergence of metabolic structures (Bagley and Farmer, 1992), some of which can also self-reproduce (Hutton, 2002; Young and Neshatian, 2013). However, it is not obvious which properties of these chemical systems enable the emergence of these structures. ...
... Other closely related Artificial Chemistries are based on graph rewriting systems. Squirm3 (Hutton, 2002) is a chemistry in which atoms are placed on a 2D grid where they can react with each other, creating or breaking bonds. Interestingly, Hutton (2002) shows that self-reproducing evolvable chains can emerge in this environment when the right set of reactions is used; like the AC introduced here, these reactions have intrinsic conservation laws. ...
Preprint
Researching the conditions for the emergence of life -- not necessarily as it is, but as it could be -- is one of the main goals of Artificial Life. Artificial Chemistries are one of the most important tools in this endeavour, as they allow us to investigate the process by which metabolisms capable of self-reproduction and -- ultimately -- of evolving, might have emerged. While previous work has shown promising results in this direction, it is still unclear which fundamental properties of a chemical system enable emergent structures to arise. To this end, here we present an Artificial Chemistry based on Combinatory Logic, a Turing-complete rewriting system, which relies on a minimal set of possible reactions. Our experiments show that a single run of this chemistry starting from a tabula rasa state discovers with no external intervention a wide range of emergent structures, including autopoietic structures that maintain their organisation unchanged, others that grow recursively, and most notably, patterns that reproduce themselves, duplicating their number on each cycle. All of these structures take the form of recursive algorithms that acquire basic constituents from the environment and decompose them in a process that is remarkably similar to biological metabolisms.
... This includes: physical self-replicating systems of Penrose [102], and Griffith [58]. Following are descriptions of molecular self-replicating systems, that include: the catalytic model by Kiedrowski [134], physics simulation of self-replicating molecules [138,41], and self-replication model based on artificial chemistry [67]. At the end of this chapter, we present the state-of-the-art in DNA tile pattern self-replicators that are closely related to the tile pattern self-replicator discussed in Chapter 7. ...
... Then we briefly illustrate physical self-replication systems inspired by Penrose's model that are recently demonstrated by Saul Griffith [58]. Following is the review of molecular self-replicating systems: 1) JohnnyVon [138,41], a kinematic self-replicating model that uses simulated physics of molecular components as movable CA; 2) Squirm3 [67], a self-replication model using artificial chemistry; 3) Non-enzymatic self-replication systems of Nucleic Acid molecules; and 4) tile pattern self-replicating systems recently studied in the tile self-assembly framework. ...
... Recently, molecular self-replicating systems have been studied in simulations and real implementations: simulated physics models of molecules [138,41], artificial chemistry models [91,67], and catalytic models of real chemical molecules [134,72]. A review of the work in molecular self-replicating systems is given below. ...
... Using genetic algorithms and neural networks, they created virtual organisms that reproduce intelligent behaviors and execute various tasks. Other researchers are interested in the phenomenon of self-replication (Hutton, 2002; Tominaga, 2005; Hutton, 2007). Using artificial chemistry, these researchers were able to simulate simplified cells that reproduce themselves (Hutton, 2007). ...
... One of the key properties of all living organisms is their ability to reproduce. Research shows that it is possible to obtain simple self-replicating molecules or organisms from artificial chemistry (Tominaga, 2005; Hutton, 2002, 2003, 2005). Dittrich et al. (2001) define such an artificial chemistry as a triple <S, R, A>, where S is the set of particles, R is the set of reactions, and A is the algorithm that applies the reactions. ...
... Dittrich et al. (2001) define such an artificial chemistry as a triple <S, R, A>, where S is the set of particles, R is the set of reactions, and A is the algorithm that applies the reactions. Using this definition of an artificial chemistry, it was demonstrated that rules can be specified that allow the replication of some molecules and simple cells (Hutton, 2002, 2007). However, as explained in the third experiment of Hutton (2007), it is necessary to randomly modify the state of the atoms to eventually obtain a cell that can use the defined set of chemical rules to replicate. ...
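Dittrich et al.'s <S, R, A> definition can be made concrete with a toy example. The particle names and the two reactions below are invented for illustration and are not taken from any of the cited systems:

```python
import random

# S: the set of particles; R: reactions, mapping sorted reactant tuples
# to product tuples; A: an algorithm that applies the reactions.
S = {"a", "b", "ab"}
R = {("a", "b"): ("ab",),      # condensation: a + b -> ab
     ("ab",): ("a", "b")}      # cleavage:     ab -> a + b

def A(soup, steps, rng=random):
    """Stochastic algorithm: repeatedly pick one or two particles at
    random and apply a matching reaction, if any. Mass is conserved."""
    for _ in range(steps):
        k = rng.choice([1, 2])
        if k > len(soup):
            continue
        picked = rng.sample(range(len(soup)), k)
        reactants = tuple(sorted(soup[i] for i in picked))
        if reactants in R:
            for i in sorted(picked, reverse=True):
                soup.pop(i)                 # consume the reactants
            soup.extend(R[reactants])       # emit the products
    return soup

soup = A(["a", "b"] * 10, steps=100)
print(sorted(soup))  # some mix of 'a', 'b', and 'ab' molecules
```

Because the two reactions only rearrange bonds, the total number of `a` and `b` units is conserved, mirroring the conservation laws mentioned for Squirm3.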
Conference Paper
Full-text available
We present a new artificial chemistry simulator based on simple physical and chemical rules. The simulator relies on a simplification of bonding and internal energy concepts found in chemistry to model simple, large scale, chemical reactions without delay between computation and visualization. Energy introduction and removal can be controlled in the simulations in order to modulate reaction rates. The simulations demonstrate that with this simplified model of artificial chemistry coupled with the concept of energy, it is possible to see the emergence of specific types of compounds, similar to real molecules.
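The energy-controlled reaction rates described in this abstract could be modelled, for example, with a Boltzmann-style acceptance probability. The rate law below is our assumption, since the excerpt does not specify one:

```python
import math, random

def reaction_occurs(activation_energy, temperature, rng=random):
    """Accept a reaction with Boltzmann-like probability exp(-Ea / T):
    raising T (introducing energy) speeds reactions up, lowering it
    slows them down. Assumed rate law, for illustration only."""
    p = math.exp(-activation_energy / temperature)
    return rng.random() < p

# Over many trials, a higher temperature yields more reactions.
rng = random.Random(0)
cold = sum(reaction_occurs(1.0, 0.5, rng) for _ in range(10000))
hot = sum(reaction_occurs(1.0, 5.0, rng) for _ in range(10000))
print(cold < hot)
```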
... In the search for an artificial system that recreates the creativity of biological evolution, many different approaches have been investigated. The two key concepts of information-replication and competition have been implemented in machine-code systems [14, 13, 22, 10], cellular automata (CA) [15], artificial chemistries [7] and individual-based models [16, 20, 4]. While there is not a clear consensus on what creativity means in the context of evolution [3], a working definition is that the system should pass the test proposed in [2]. ...
... In [7] we introduced Squirm3, a simple artificial chemistry environment in which molecules that could make copies of themselves were possible. These molecules could exist in different forms and this led to a rapid evolution from longer molecules to shorter ones, since the shorter ones could replicate faster and thus use up the available atoms. ...
... However, at around 50000 iterations mutations start appearing, and the shorter ones come to dominate because they can replicate faster, using the d182454 enzymes that are floating around. However, as they drive the population of ebaaacbacbaccdf molecules down, the number of enzymes present also drops as a result of the periodic floods [7] with T_flood = 20000. The eventual outcome is a global extinction event. ...
Article
We present an artificial chemistry that supports molecules that make copies of themselves and also produce specific enzymes. As part of the replication process the sequence of bases is first transcribed onto a non-replicating molecule which then transforms itself into one or more enzymes using a base 3 encoding. The aim of the system is to give replicators control over their environment with the intention that complexity might evolve; however, this is not yet achieved. Simulation runs show that the naked replicators are highly vulnerable to parasites, and thus obtain no survival benefit from their enzymes. We speculate that membranes together with an enzyme production mechanism might be necessary for replicating molecules to evolve towards greater complexity.
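The base-3 encoding step might look like the following sketch, in which groups of base-3 digits are decoded into enzyme indices. The codon length and digit convention are hypothetical; the paper's actual encoding is not reproduced here:

```python
def decode_enzymes(sequence, codon_len=3):
    """Hypothetical base-3 decoding: interpret each group of `codon_len`
    bases (digits 0-2) as one enzyme index. Any trailing incomplete
    group is discarded. Illustrative only."""
    enzymes = []
    for i in range(0, len(sequence) - codon_len + 1, codon_len):
        codon = sequence[i:i + codon_len]
        index = 0
        for digit in codon:          # most-significant digit first
            index = index * 3 + digit
        enzymes.append(index)
    return enzymes

print(decode_enzymes([1, 0, 2, 0, 2, 1]))  # -> [11, 7]
```

With a codon length of 3, each codon selects one of 3^3 = 27 possible enzymes.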
... In this third definition, only novelties that increase the complexity of the evolving entities are considered. This definition of OE as a continual production of complexity is popular (Hutton 2002; Bentley 2003; Fernando et al. 2011; Heylighen 2012; Schulman et al. 2012; Ruiz-Mirazo and Moreno 2012; Lehman and Stanley 2012). However, this definition, which implicitly requires the existence of an "arrow of complexity" in biological evolution, is rejected by some authors, who state that open-ended evolution does not imply an increase of complexity but, simply, creates the possibility for it (Taylor 1999; Ruiz-Mirazo et al. 2004; Markovitch et al. 2012). ...
... Our argument here applies to any physical simulation, not just to computational ones. Hutton (2002)] or code in genetic programming [for example, Koza (1992)], to give a few examples. ...
Article
Full-text available
The open-endedness of a system is often defined as a continual production of novelty. Here we pin down this concept more fully by defining several types of novelty that a system may exhibit, classified as variation, innovation, and emergence. We then provide a meta-model for including levels of structure in a system’s model. From there, we define an architecture suitable for building simulations of open-ended novelty-generating systems and discuss how previously proposed systems fit into this framework. We discuss the design principles applicable to those systems and close with some challenges for the community.
... Since the invention of cellular automata in the 1940s (von Neumann 1966), researchers have been investigating the computational nature of replication (Langton 1984). Originally this work was intended to study abstract systems, but computers have allowed more detailed simulations of replicating virtual organisms (Ray 1991, Wilke et al. 2001, Pargellis 1996), virtual chemicals (Hutton 2002), and virtual nucleotides (Smith 2002). Spontaneously emerging, self-replicating structures appear in many of these systems, but are robotic systems limited by macroscopic mechanical rules capable of supporting similar behavior? ...
... To better understand the limits and potential of replicating robots, we propose and implement a simulation of modular robotic "molecubes" in a nonuniform cellular automata variant (Sipper 1995). Inspired by other artificial life and cellular automata simulations which observe spontaneous self-replication (Chou and Reggia 1997, Pargellis 1996, Hutton 2002, Smith 2002), we created an undirected evolutionary environment in which mechanical self-replication could evolve naturally over time. The rules of the simulation are based on the discrete mechanics and capabilities of real molecubes, which have been shown capable of preprogrammed replication in other research (Mytilinaios et al. 2004). ...
Article
We propose and implement a discrete, two-dimensional evolutionary simulation of large numbers of interacting robotic modules called "molecubes," which were shown empirically to have self-replicating ability (Zykov et al. 2005). In this simulation, the spontaneous and continuous emergence of large numbers of simple self-replicating molecube structures is observed without any explicit selection for this behavior. A quantitative comparison of structures' self-replicability is investigated using a universal self-replication metric (Bryant and Lipson 2003), as well as the existence of interacting structure groups. Because of the discrete nature of the simulation and simple interactions, it bears strong resemblance to nonuniform cellular automata (Sipper 1995).
... In [6] a system of eight reaction rules was presented that allowed molecules in an artificial chemistry to replicate. The reaction rules could not be modified through evolution and in experiments the smallest molecule always dominated because it could replicate fastest. ...
... Our ancestor molecule is a string of atoms e8b1b1b1b1b1b1b1b1f1, where each b atom is an enzyme, carrying one of the reactions shown in Table 1. These rules are the same as in [6] with the addition of the enzyme-copying flag. A soup of atoms in state 0 is required for replication to continue. ...
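The atom notation used here, a type letter followed by a state number, can be parsed mechanically. Here is a small helper for readers experimenting with similar string representations; the regular expression assumes single-letter types a-f and decimal states:

```python
import re

def parse_molecule(s):
    """Split a molecule string such as 'e8b1b1f1' into (type, state)
    pairs; multi-digit states are handled by the \\d+ group."""
    return [(m.group(1), int(m.group(2)))
            for m in re.finditer(r"([a-f])(\d+)", s)]

ancestor = "e8b1b1b1b1b1b1b1b1f1"
atoms = parse_molecule(ancestor)
print(len(atoms), atoms[0], atoms[-1])  # 10 atoms: e8 ... f1
```

Applied to the ancestor above, this yields one e atom in state 8, eight b atoms (the enzymes) in state 1, and a terminating f atom in state 1.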
Article
We present a novel form of self-replicator in an artificial chemistry, inspired by DNA and designed to facilitate studies of evolution and creativity: an Enzyme Artificial Chemistry (EAC). A string of 'atoms' moves in a two-dimensional lattice space, and interacts with other atoms through local reactions. Each atom can act as an enzyme, catalysing a reaction when the correct reactants are nearby. We show how a string of such enzymes is able to self-replicate using free-floating atoms as 'food'. Mutations cause atoms to be removed from a string, or added with a random enzyme, allowing the very method by which the strings replicate to alter. Evolution in the system is demonstrated.
... Given the large number of uncertainties concerning the possible biochemistry that would lead to the origin of self-replication and life, either on Earth or other planets, researchers have begun to study the process of emergence in an abstract manner. Tools from computer science [6][7][8][9][10][11], information theory [12][13][14][15], and statistical physics [16,17] have been used in an attempt to understand life and its origins at a fundamental level, removed from the peculiarities of any particular chemistry. Investigations along those lines may reveal to us general laws governing the emergence of life that are obscured by the n = 1 nature of our current evidence, point us to experiments that probe such putative laws, and get us closer to understanding the inevitability, or perhaps the elusiveness, of life itself [18]. ...
Preprint
While all organisms on Earth descend from a common ancestor, there is no consensus on whether the origin of this ancestral self-replicator was a one-off event or whether it was only the final survivor of multiple origins. Here we use the digital evolution system Avida to study the origin of self-replicating computer programs. By using a computational system, we avoid many of the uncertainties inherent in any biochemical system of self-replicators (while running the risk of ignoring a fundamental aspect of biochemistry). We generated the exhaustive set of minimal-genome self-replicators and analyzed the network structure of this fitness landscape. We further examined the evolvability of these self-replicators and found that the evolvability of a self-replicator is dependent on its genomic architecture. We studied the differential ability of replicators to take over the population when competed against each other (akin to a primordial-soup model of biogenesis) and found that the probability of a self-replicator out-competing the others is not uniform. Instead, progenitor (most-recent common ancestor) genotypes are clustered in a small region of the replicator space. Our results demonstrate how computational systems can be used as test systems for hypotheses concerning the origin of life.
... After all, many of humanity's recent achievements involve mathematical reasoning and technical prowess. It may be the case that low-level control and perception, aspects that many simulations aim to reproduce (Dittrich et al., 2001; Hutton, 2002), are not necessary to evolve these capabilities. Indeed, even evolving morphologies, as many simulations do (Sims, 1994; Silveira and Massad, 1998; Spector et al., 2007; Bessonov et al., 2015; Pathak et al., 2019; Heinemann, 2024), may not be necessary for the evolution of advanced reasoning. ...
Preprint
Full-text available
Human intelligence emerged through the process of natural selection and evolution on Earth. We investigate what it would take to re-create this process in silico. While past work has often focused on low-level processes (such as simulating physics or chemistry), we instead take a more targeted approach, aiming to evolve agents that can accumulate open-ended culture and technologies across generations. Towards this, we present JaxLife: an artificial life simulator in which embodied agents, parameterized by deep neural networks, must learn to survive in an expressive world containing programmable systems. First, we describe the environment and show that it can facilitate meaningful Turing-complete computation. We then analyze the evolved emergent agents' behavior, such as rudimentary communication protocols, agriculture, and tool use. Finally, we investigate how complexity scales with the amount of compute used. We believe JaxLife takes a step towards studying evolved behavior in more open-ended simulations. Our code is available at https://github.com/luchris429/JaxLife
... Moreover, autocatalytic networks arise inevitably with sufficiently distinctive catalysts in the prebiotic "soup" [26]. These have also been simulated in computational experiments [8,27,20,28]. ...
Preprint
Full-text available
The fields of Origin of Life and Artificial Life both question what life is and how it emerges from a distinct set of "pre-life" dynamics. One common feature of most substrates where life emerges is a marked shift in dynamics when self-replication appears. While there are some hypotheses regarding how self-replicators arose in nature, we know very little about the general dynamics, computational principles, and necessary conditions for self-replicators to emerge. This is especially true on "computational substrates" where interactions involve logical, mathematical, or programming rules. In this paper we take a step towards understanding how self-replicators arise by studying several computational substrates based on various simple programming languages and machine instruction sets. We show that when random, non self-replicating programs are placed in an environment lacking any explicit fitness landscape, self-replicators tend to arise. We demonstrate how this occurs due to random interactions and self-modification, and can happen with and without background random mutations. We also show how increasingly complex dynamics continue to emerge following the rise of self-replicators. Finally, we show a counterexample of a minimalistic programming language where self-replicators are possible, but so far have not been observed to arise.
... More formally, an AC can be denoted as a triple (S, R, A) in which S is a set of available molecules, R is a set of all possible interaction rules, and A is an algorithm that describes the system and how the molecules or objects interact with each other [18]. When molecules can move, an AC allows richer and more complex interactions to emerge in the system [19], which is in line with the goals of this paper. ACs have been previously used to model neural networks [20], self-organizing systems [21] and self-replicating systems [22]. ...
Preprint
Full-text available
Gene Regulatory Networks are networks of interactions in biological organisms responsible for determining the production levels of proteins and peptides. Proteins are workers of a cell factory, and their production defines the goal of a cell and its development. Various attempts have been made to model such networks both to understand these biological systems better and to use inspiration from understanding them to solve computational problems. In this work, a biologically more realistic model for gene regulatory networks is proposed, which incorporates Cellular Automata and Artificial Chemistry to model the interactions between regulatory proteins called Transcription Factors and the regulatory sites of genes. The results of this work show complex dynamics close to what can be observed in nature. Here, an analysis of the impact of the initial states of the system on the produced dynamics is performed, showing that such evolvable models can be directed towards producing desired protein dynamics.
... In previous studies, researchers used to explore the parameter space manually (typically not presented explicitly in their papers, perhaps because it is difficult to characterize the route of the manual exploration), which is an arduous job, especially for complicated models comprising many parameters (notably, the parameter space grows exponentially with the number of parameters). As a consequence, researchers in this area tend to adopt quite abstract modeling systems (thus involving fewer parameters); some studies even resorted to oversimplified models (e.g., the so-called "toy models" or even "artificial chemistry" [36][37][38]), whose relevance to the actual origin of life appeared doubtful. This awkward situation raises an urgent question: can we do the parameter exploration in a more automatic way? ...
Article
Full-text available
The origin of life involved complicated evolutionary processes. Computer modeling is a promising way to reveal relevant mechanisms. However, due to the limitation of our knowledge on prebiotic chemistry, it is usually difficult to justify parameter-setting for the modeling. Thus, typically, the studies were conducted in a reverse way: the parameter-space was explored to find those parameter values "supporting" a hypothetical scene (that is, leaving the parameter-justification a later job when sufficient knowledge is available). Exploring the parameter-space manually is an arduous job (especially when the modeling becomes complicated) and is, additionally, difficult to characterize as a regular "Methods" section in a paper. Here we show that a machine-learning-like approach may be adopted, automatically optimizing the parameters. With this efficient parameter-exploring approach, the evolutionary modeling on the origin of life would become much more powerful. In particular, based on this, it is expected that more near-reality (complex) models could be introduced, and thereby theoretical research would be more tightly associated with experimental investigation in this field, hopefully leading to significant steps forward with respect to our understanding of the origin of life.
... One such AC environment is CoreWorld, which can evolve cooperative structures from only 10 basic operating instructions [86]. Squirm3 is another AC environment, in which self-replicators based on template-based catalysis emerge spontaneously from a random configuration [87]. Squirm3 comprises atoms that can form bonds with one another; these bonds are made and broken by reactions. ...
Article
Full-text available
We present work in 3D printing electric motors from basic materials as the key to building a self-replicating machine to colonise the Moon. First, we explore the nature of the biological realm to ascertain its essence, particularly in relation to the origin of life when the inanimate became animate. We take an expansive view of this to ascertain parallels between the biological and the manufactured worlds. Life must have emerged from the available raw material on Earth and, similarly, a self-replicating machine must exploit and leverage the available resources on the Moon. We then examine these lessons to explore the construction of a self-replicating machine using a universal constructor. It is through the universal constructor that the actuator emerges as critical. We propose that 3D printing constitutes an analogue of the biological ribosome and that 3D printing may constitute a universal construction mechanism. Following a description of our progress in 3D printing motors, we suggest that this engineering effort can inform biology, that motors are a key facet of living organisms and illustrate the importance of motors in biology viewed from the perspective of engineering (in the Feynman spirit of “what I cannot create, I cannot understand”).
... The AC uses swarm techniques combined with Cellular Automata rules to allow proteins to exist and interact with their 26 nearest neighbours (NNs) in a 3D voxellated environment, partially inspired by artificial chemistry modelling techniques (Hutton, 2002). The AC resides within a membrane-bound 'cell' and receptors in the membrane relay external signals to the AC via a pathway of proteins: the Transduction Pathway (TP). ...
Conference Paper
The Artificial Cytoskeleton (AC) is introduced as a new model for generating adaptive growth of an artificial cell’s morphology throughout its lifetime in response to environmental cues. The AC utilizes swarm and cellular automata techniques. It is closely modelled on the eukaryotic cytoskeleton which is responsible for giving the cell dynamic structure and function. The AC is tested in a simple chemotaxis experiment and is shown to effect morphological adaptation during the cell’s lifetime.
... Different AChems take different approaches in terms of determinism. Some systems, such as Hutton (2002), always link particles that encounter each other and match a linking rule. Young and Neshatian (2015) investigate different approaches by which reactants are chosen for linking, but the linking is then deterministic. ...
Conference Paper
Full-text available
Natural chemistry deals with non-deterministic processes, and this is reflected in some artificial chemistries. We can tune these artificial systems by manipulating the functions that define their probabilistic processes. In this work we consider different probabilistic functions for particle linking, applied to our Jordan Algebra Artificial Chemistry. We use five base functions and their variations to investigate the possible behaviours of the system, and try to connect those behaviours to different traits of the functions. We find that, while some correlations can be seen, there are unexpected behaviours that we cannot account for in our current analysis. While we can set and manipulate the probabilities in our system, it is still complex and still displays emergent behaviour that we cannot fully control.
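The idea of swapping base functions for particle linking can be sketched as follows; the function forms below are illustrative guesses, not the ones used in the Jordan Algebra Artificial Chemistry:

```python
import math, random

# Candidate link-probability functions of a particle "strength" s in [0, 1].
# These forms are invented for illustration.
link_functions = {
    "deterministic": lambda s: 1.0,   # always link when rules match
    "linear":        lambda s: s,
    "quadratic":     lambda s: s * s,
    "sigmoid":       lambda s: 1 / (1 + math.exp(-10 * (s - 0.5))),
}

def link_rate(fn, trials=10000, rng=None):
    """Empirical linking frequency for uniformly random strengths."""
    rng = rng or random.Random(0)
    return sum(rng.random() < fn(rng.random()) for _ in range(trials)) / trials

for name, fn in link_functions.items():
    print(name, round(link_rate(fn), 2))
```

Comparing the empirical rates makes it easy to see how the choice of base function changes how often matching particles actually link.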
Article
Full-text available
While all organisms on Earth share a common descent, there is no consensus on whether the origin of the ancestral self-replicator was a one-off event or whether it only represented the final survivor of multiple origins. Here, we use the digital evolution system Avida to study the origin of self-replicating computer programs. By using a computational system, we avoid many of the uncertainties inherent in any biochemical system of self-replicators (while running the risk of ignoring a fundamental aspect of biochemistry). We generated the exhaustive set of minimal-genome self-replicators and analysed the network structure of this fitness landscape. We further examined the evolvability of these self-replicators and found that the evolvability of a self-replicator is dependent on its genomic architecture. We also studied the differential ability of replicators to take over the population when competed against each other, akin to a primordial-soup model of biogenesis, and found that the probability of a self-replicator outcompeting the others is not uniform. Instead, progenitor (most-recent common ancestor) genotypes are clustered in a small region of the replicator space. Our results demonstrate how computational systems can be used as test systems for hypotheses concerning the origin of life. This article is part of the themed issue ‘Reconceptualizing the origins of life’.
... It is clear from Figure 14.2 that most CA-based models make use of a 2D [114,209,210,211,212,213,214] tessellation. 1D [215,216,217] and 3D [218,219,220] tessellations also exist, as do combinations of tessellations of different dimensions [10,221] discussed in the same article. ...
... The applications of artificial chemistries go beyond ALife, reaching biology, information processing (in the form of natural and artificial chemical computing models) and evolutionary algorithms for optimization, among other domains. Chemical models have also been used to express replication, reproduction and variation mechanisms (Dittrich & Banzhaf, 1998;Dittrich et al., 2001;Hutton, 2002;Teuscher, 2007;Yamamoto, Schreckling, & Meyer, 2007). ...
Article
Living systems reached a good balance between competition and co-operation in order to prevail – as individuals or as a group. Artificial systems either work in isolation or are manually designed to cooperate, which is of paramount importance in networking applications. Recently, research considered bio-inspired approaches to increase the robustness of distributed algorithms. However, when mimicking natural rules such as applying natural selection, the resulting systems often compete rather than cooperate in the struggle for existence. We recently presented an execution model for networking protocols inspired by chemical reactions in which we organized networking software as self-rewriting sets of "molecules". If memory is limited, our protocol software exhibits remarkable robustness to faults and is able to run on unreliable hardware, because healthy software is able to replicate while faulty elements die out. In this report we study the competitive nature of this environment and propose a methodology to design complex self-healing software that is able to cooperate therein. We resort to the study of self-organization in nature and adapt concepts like Eigen's Hypercycle to our software. As an application case, we demonstrate how the competitive and cooperative forces can be exploited for a controlled update of software in a network.
... Artificial chemistries use the multiset formalism to specify the substrates, catalysts and products of chemical reactions (e.g. [203]) and can be applied to biochemical reactions [125,378]. Sophisticated models of biological processes can be built out of these reaction specifications. ...
... The applications of artificial chemistries go beyond ALife, reaching biology, information processing (in the form of natural and artificial chemical computing models) and evolutionary algorithms for optimization, among other domains. Chemical models have also been used to express replication, reproduction and variation mechanisms (Dittrich & Banzhaf, 1998;Dittrich et al., 2001;Hutton, 2002;Teuscher, 2007;Yamamoto, Schreckling, & Meyer, 2007). ...
Article
Fault tolerant systems are usually built around redundant elements controlled by a central observer that decides which of the elements provide the correct results and thus are healthy. Nature lacks such dedicated controllers; instead, proliferation of "good" results is an emergent phenomenon achieved through homeostasis – the intrinsic self-regulation in order to maintain a stable, healthy state. We report about a methodology to design distributed homeostatic software systems that are dynamically stable and robust in execution. By continuously replicating its own code base, our software is able to thwart unreliable execution and even accidental code changes. The crucial part is to build the system such that it regulates its replication and thus organizes its own redundancy. This is achieved in an execution environment that mimics chemical kinetics in which we implement self-rewriting programs (Quines). We analyze the robustness of these self-healing programs using phase-type distribution and apply other tools, developed for chemical reaction network analysis, which can now be applied to software execution and networking protocol analysis. We also demonstrate the practical use of our model with a link load balancing protocol.
... Artificial Chemistries (AChems) can be explored from a computational viewpoint, for example, as tools for implementing evolutionary algorithms [9] and controlling robots [6]. They can also be used to model biological systems [10] such as replication [12] and membrane formation [13]. These varied applications of AChems lead to varied ways of defining them, and consequently to AChems defined on different levels of abstraction, with different properties. ...
Article
We introduce multi-level Artificial Chemistries as a way of tackling difficult problems in the evolution of complexity. We present two algorithms for moving between levels of abstraction in a multi-level Artificial Chemistry. (1) Moving upwards from a low-level description to a high-level description involves making approximations. We discuss these, and provide an algorithm to perform the approximations. (2) Moving downwards is more problematic. We discuss the issues involved in moving down, including conservation of mass. We present an algorithm to generate constraints that any low-level implementation of the system must satisfy. These constraints can be used to: obtain information about the system; automatically generate a low-level implementation of the system; guide a search for suitable low-level implementations of the system.
... When agglomerated, it is changed into a nineteen-node chain and finally folded into a control-flow cluster shown in Fig. 8. The algorithm of the replication is based upon what Hutton proposed [8]. The nineteen constituent nodes are classified into five groups according to their functions. ...
Article
Under the framework of Network Artificial Chemistry (NAC), a method to construct a phenotypic machine from genotypic information is proposed. The genotype is expressed as a sequence of nodes (a node chain) with symbol data. According to an implemented algorithm, a chain is agglomerated, tangled, and finally folded into a node cluster that works as a control-flow machine in the network. Two examples of control-flow clusters, splitase and replicase, are presented and their functionality is demonstrated. Several meanings and future research directions are discussed.
... The AC uses swarm techniques combined with Cellular Automata rules to allow proteins to exist and interact with their 26 nearest neighbours (NNs) in a 3D voxellated environment , partially inspired by artificial chemistry modelling techniques (Hutton, 2002). The AC resides within a membrane-bound 'cell' and receptors in the membrane relay external signals to the AC via a pathway of proteins: the Transduction Pathway (TP). ...
Article
Full-text available
The Artificial Cytoskeleton (AC) is introduced as a new model for generating adaptive growth of an artificial cell's morphology throughout its lifetime in response to environmental cues. The AC utilizes swarm and cellular automata techniques. It is closely modelled on the eukaryotic cytoskeleton which is responsible for giving the cell dynamic structure and function. The AC is tested in a simple chemotaxis experiment and is shown to effect morphological adaptation during the cell's lifetime.
Article
Gene regulatory networks are networks of interactions in organisms responsible for determining the production levels of proteins and peptides. Mathematical and computational models of gene regulatory networks have been proposed, some of them rather abstract and called artificial regulatory networks. In this contribution, a spatial model for gene regulatory networks is proposed that is biologically more realistic and incorporates an artificial chemistry to realize the interaction between regulatory proteins called the transcription factors and the regulatory sites of simulated genes. The result is a system that is quite robust while able to produce complex dynamics similar to what can be observed in nature. Here an analysis of the impact of the initial states of the system on the produced dynamics is performed, showing that such models are evolvable and can be directed toward producing desired protein dynamics.
Chapter
Full-text available
Proceedings from the ninth International Conference on Artificial Life; papers by scientists of many disciplines focusing on the principles of organization and applications of complex, life-like systems. Artificial Life is an interdisciplinary effort to investigate the fundamental properties of living systems through the simulation and synthesis of life-like processes. The young field brings a powerful set of tools to the study of how high-level behavior can arise in systems governed by simple rules of interaction. Some of the fundamental questions include: What are the principles of evolution, learning, and growth that can be understood well enough to simulate as an information process? Can robots be built faster and more cheaply by mimicking biology than by the product design process used for automobiles and airplanes? How can we unify theories from dynamical systems, game theory, evolution, computing, geophysics, and cognition? The field has contributed fundamentally to our understanding of life itself through computer models, and has led to novel solutions to complex real-world problems across high technology and human society. This elite biennial meeting has grown from a small workshop in Santa Fe to a major international conference. This ninth volume of the proceedings of the international A-life conference reflects the growing quality and impact of this interdisciplinary scientific community. Bradford Books imprint
Article
The article presents the DigiHive system, an artificial chemistry simulation environment, and the results of preliminary simulation experiments leading toward building a self-replicating system resembling a living cell. The two-dimensional environment is populated by particles that can bond together and form complexes of particles. Some complexes can recognize and change the structures of surrounding complexes, where the functions they perform are encoded in their structure in the form of Prolog-like language expressions. After introducing the DigiHive environment, we present the results of simulations of two fundamental parts of a self-replicating system, the work of a universal constructor and a copying machine, and the growth and division of a cell-like wall. At the end of the article, the limitations and arising difficulties of modeling in the DigiHive environment are presented, along with a discussion of possible future experiments and applications of this type of modeling.
Article
This article advocates further examination of the role decay aesthetics can play in artificial life (ALife or AL) and art. Opening with the poetics of decay and the shadow that the decay taboo has cast over Western culture, firstly, we reframe decay as a constructive process of transformation. Secondly, we perform a brief historical survey of early artistic developments in the field of ALife, assessing how these early works addressed decay. We follow with a deeper analysis of contemporary artists through a lens of decay and decomposition, identifying new tendencies of ALife art (deep time simulation, slime intelligence, molecular agents, techno resurrection and ecohybridized computation). Finally, we look to the peripheries of ALife to see how decay is rendered in current technical research and examine these projects with an eye for turbulent production in the form of ‘decaying’ matter. We conclude with a number of open questions on decomposition and decay aesthetics, both within the artistic and technical realms of ALife.
Article
Full-text available
One of the main goals of Artificial Life is to research the conditions for the emergence of life, not necessarily as it is, but as it could be. Artificial chemistries are one of the most important tools for this purpose because they provide us with a basic framework to investigate under which conditions metabolisms capable of reproducing themselves, and ultimately, of evolving, can emerge. While there have been successful attempts at producing examples of emergent self-reproducing metabolisms, the set of rules involved remain too complex to shed much light on the underlying principles at work. In this article, we hypothesize that the key property needed for self-reproducing metabolisms to emerge is the existence of an autocatalyzed subset of Turing-complete reactions. We validate this hypothesis with a minimalistic artificial chemistry with conservation laws, which is based on a Turing-complete rewriting system called combinatory logic. Our experiments show that a single run of this chemistry, starting from a tabula rasa state, discovers, with no external intervention, a wide range of emergent structures including ones that self-reproduce in each cycle. All of these structures take the form of recursive algorithms that acquire basic constituents from the environment and decompose them in a process that is remarkably similar to biological metabolisms.
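The Turing-complete rewriting system named above, combinatory logic, can be sketched in a few lines. The term representation here (atoms `'S'`/`'K'` plus 2-tuples `(f, x)` for application) is our own illustrative choice, not the paper's chemistry encoding:

```python
# Minimal sketch of combinatory logic reduction over the S and K
# combinators. Reduction rules: K x y -> x, and S x y z -> x z (y z).

def reduce_once(term):
    """Apply one head-reduction step; return (new_term, changed)."""
    spine, head = [], term
    while isinstance(head, tuple):        # unwind the application spine
        spine.append(head[1])
        head = head[0]
    args = spine[::-1]                    # arguments, leftmost first
    if head == "K" and len(args) >= 2:    # K x y -> x
        new, rest = args[0], args[2:]
    elif head == "S" and len(args) >= 3:  # S x y z -> x z (y z)
        x, y, z = args[:3]
        new, rest = ((x, z), (y, z)), args[3:]
    else:
        return term, False
    for a in rest:                        # re-apply leftover arguments
        new = (new, a)
    return new, True

def normalize(term, max_steps=100):
    """Reduce until no rule applies (or the step budget runs out)."""
    for _ in range(max_steps):
        term, changed = reduce_once(term)
        if not changed:
            break
    return term
```

For example, the classic identity combinator S K K applied to any argument reduces back to that argument, which is the kind of closed recursive behaviour the abstract's emergent structures exploit.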
Chapter
Nature does not know the concept of a dedicated controlling instance; instead, “control” is an emergent phenomenon. This is in stark contrast with computer networking where protocol control loops are (seemingly) in charge: while the functional aspect of a networking service can be well mastered, the dynamic behavior is still difficult to understand and even control. In this chapter, we present a methodology for designing distributed software systems that are dynamically stable and robust in execution. It is based on continuously replicating a system’s own code base in order to thwart unreliable execution and even accidental code changes. The crucial part is to design the system such that it regulates its own replication. This can be achieved by an execution environment inspired by chemistry to which we add the concept of self-rewriting programs (Quines). With a link load balancing example we show how to exploit competition and cooperation in a self-rewriting service implementation.
Chapter
We present a new tool which simulates the development of Artificial Chemistries (AChems) to produce real-time imagery for artistic/entertainment purposes. There have been many such usages of complex systems (CSs) for artistic purposes, but deciding which parameters to use for such unpredictable systems can lead to a feeling of lack of control. For our purposes, we struggled to gain enough control over the AChem real-time image generation tool to accompany music in a video-jockeying application. To overcome this difficulty, we developed a general-purpose clustering approach that attempts to produce sets of parameter configurations which lead to maximally distinct visualisations, thus ensuring users feel that they have influence over the AChem when controlled with a suitable GUI. We present this approach and its application to controlling the development of AChems, along with the results from experiments with different clustering approaches, aided by both machine vision analysis and human curation. We conclude by advocating an overfitting approach supplemented by a final check by a designer, and discuss potential applications of this in artistic and entertainment settings.
Conference Paper
Full-text available
This paper instantiates an architecture for an artificial chemistry featuring continuous physics and discrete chemical reactions. A system of bonded complementary molecular strands replicates in the presence of a catalyst. The catalyst causes the strands to disengage; each strand subsequently replicates its missing complement by bonding to free atoms.
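The replication scheme described above, where a freed strand rebuilds its missing complement from free atoms, can be sketched minimally. The complement table and string encoding are our own assumptions; the actual model works on bonded atoms moving in continuous space:

```python
# Toy sketch of template-based strand replication: each template site
# recruits one complementary free atom from the surrounding "soup".

COMPLEMENT = {"a": "A", "A": "a", "b": "B", "B": "b"}  # assumed pairing

def replicate(template, free_atoms):
    """Bond one complementary free atom against each template site;
    return the new complementary strand, or None if material runs out."""
    strand = []
    for site in template:
        needed = COMPLEMENT[site]
        if needed not in free_atoms:
            return None          # not enough material in the soup
        free_atoms.remove(needed)  # the atom leaves the soup
        strand.append(needed)
    return "".join(strand)
```

The catalyst's role in the paper's system would correspond to the step that separates the finished double strand so both halves can serve as templates again.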
Article
Recent work in developing self-replicating machines has approached the problem as an engineering problem, using engineering materials and methods to implement an engineering analogue of a hitherto uniquely biological function. The question is: can anything be learned that might be relevant to an astrobiological context, in which the problem is to determine the general form of biology independent of the Earth? Compared with other non-terrestrial biology disciplines, engineered life is more demanding. Engineering a self-replicating machine tackles real environments unlike artificial life which avoids the problem of physical instantiation altogether by examining software models. Engineering a self-replicating machine is also more demanding than synthetic biology as no library of functional components exists. Everything must be constructed de novo. Biological systems already have the capacity to self-replicate but no engineered machine has yet been constructed with the same ability – this is our primary goal. On the basis of the von Neumann analysis of self-replication, self-replication is a by-product of universal construction capability – a universal constructor is a machine that can construct anything (in a functional sense) given the appropriate instructions (DNA/RNA), energy (ATP) and materials (food). In the biological cell, the universal construction mechanism is the ribosome. The ribosome is a biological assembly line for constructing proteins while DNA constitutes a design specification. For a photoautotroph, the energy source is ambient and the food is inorganic. We submit that engineering a self-replicating machine opens up new areas of astrobiology to be explored in the limits of life.
Chapter
We wish to use Artificial Chemistries to build and investigate open-ended systems. As such, we wish to minimise the number of explicit rules and properties needed. We describe here the concept of sub-symbolic Artificial Chemistries (ssAChems), where reaction properties are emergent properties of the internal structure and dynamics of the component particles. We define the components of a ssAChem, and illustrate it with two examples: RBN-world, where the particles are Random Boolean Networks, the emergent properties come from the dynamics on an attractor cycle, and composition is through rewiring the components to form a larger RBN; and SMAC, where the particles are Hermitian matrices, the emergent properties are eigenvalues and eigenvectors, and composition is through the non-associative Jordan product. We conclude with some ssAChem design guidelines.
Conference Paper
The traditional way of problem solving delivers data to a program. But when the problem's complexity increases exponentially with the data scale, obtaining a solution is difficult. The group cooperation computing model works in the inverse way, by delivering the program to the data. It first models each single datum as an individual and a data unit as a group of individuals. Then, different cooperation rules are designed for individuals to cooperate with each other. Finally, the solution of the problem emerges through the individuals' cooperation process. This study applies the group cooperation computing model to solve the Hamilton Path problem, which is NP-complete. Experimental results show that the cooperation model works much better than a genetic algorithm. More importantly, the following properties of group cooperation computing are found, which may differ from traditional computing theory. (1) By using different cooperation rules, the same problem at the same scale may exhibit different complexities, such as linear or exponential. (2) By using the same cooperation rule, when the problem scale is less than a specific threshold, the problem's time complexity is linear. Otherwise, the complexity may be exponential.
Conference Paper
The molecules within an Artificial Chemistry form an evolutionary system, capable under certain conditions of displaying interesting emergent behaviours. We investigate experimentally the effect on emergence of the combinations of selected strategies for choosing reactants (Uniform and Kinetic selection) and products (Uniform and Least Energy selection) as measured by three measures of reaction cycle formation. Emergence is maximised by a Kinetic reactant selection strategy; the choice of product selection strategy has minimal effect.
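The two reactant-selection strategies named above can be sketched under a simplified encoding of our own, in which each molecule is scored by a rate function and Kinetic selection weights the draw by that score:

```python
import random

# Sketch of Uniform vs Kinetic reactant selection. The rate function
# and molecule encoding are illustrative assumptions, not the paper's.

def select_uniform(molecules, rng=random):
    """Uniform selection: every molecule is equally likely."""
    return rng.choice(molecules)

def select_kinetic(molecules, rate, rng=random):
    """Kinetic selection: probability proportional to rate(molecule)."""
    weights = [rate(m) for m in molecules]
    return rng.choices(molecules, weights=weights, k=1)[0]
```

Uniform selection is the special case of Kinetic selection with a constant rate function, which is why the paper can compare the two as points on one axis.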
Conference Paper
Artificial chemistry is a man-made system similar to a real chemical system. It represents a good starting point for simulating cell processes at the biochemistry level. In this article, an artificial chemistry system is proposed that strikes a balance among closeness to reality, fast simulation speed, and high flexibility. Preliminary results have shown that the model can simulate a general reversible reaction well.
Chapter
Glossary Definition of the Subject Introduction Basic Building Blocks of an Artificial Chemistry Structure-to‐Function Mapping Space Theory Evolution Information Processing Future Directions Bibliography
Chapter
Survival analysis has received a great deal of attention as a subfield of Bayesian nonparametrics over the last 50 years. In particular, the fitting of survival models that allow for sophisticated correlation structures has become common due to computational advances in the 1990s, in particular Markov chain Monte Carlo techniques. Very large, complex spatial datasets can now be analyzed accurately including the quantification of spatiotemporal trends and risk factors. This chapter reviews four nonparametric priors on baseline survival distributions in common use, followed by a catalogue of semiparametric and nonparametric models for survival data. Generalizations of these models allowing for spatial dependence are then discussed and broadly illustrated. Throughout, practical implementation through existing software is emphasized.
Article
Pre-mRNA splicing is a key process in the gene expression of eukaryotic cells, which includes two major procedures: the cutting of introns and the union of exons. This paper proposes an algorithm to simulate the pre-mRNA splicing process in the Analog-Cell, which was developed independently. The results produced by the algorithm are consistent with biological principles. Through simulation of the process, the study found that the algorithm can make the introns form the lariat structure efficiently and accurately, while uniting the exons selectively. In this way, the Analog-Cell can complete the formation of the mature mRNA.
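The two procedures described above, cutting introns and uniting exons, can be sketched in string form. Marking introns with brackets is our own assumption; the Analog-Cell's actual representation and lariat mechanics differ:

```python
# Toy sketch of splicing: excise every marked intron span and join
# (unite) the flanking exons into the mature sequence.

def splice(pre_mrna):
    """Drop every [ ... ] intron span; concatenate the exons."""
    exons, depth = [], 0
    for ch in pre_mrna:
        if ch == "[":
            depth += 1
        elif ch == "]":
            depth -= 1
        elif depth == 0:     # only exon characters are kept
            exons.append(ch)
    return "".join(exons)
```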
Article
Full-text available
For millennia people have wondered what makes the living different from the non-living. Beginning in the mid-1980s, artificial life has studied living systems using a synthetic approach: build life in order to understand it better, be it by means of software, hardware, or wetware. This review provides a summary of the advances that led to the development of artificial life, its current research topics, and open problems and opportunities. We classify artificial life research into 14 themes: origins of life, autonomy, self-organization, adaptation (including evolution, development, and learning), ecology, artificial societies, behavior, computational biology, artificial chemistries, information, living technology, art, and philosophy. Being interdisciplinary, artificial life seems to be losing its boundaries and merging with other fields.
Article
This chapter focuses on artificial chemistry, a research approach for constructing life-like systems in artificial environments, and presents its fundamental concepts and system design requirements. Based on this discussion, we move on to evaluate typical artificial chemistry systems: We propose 13 conditions necessary for emergent evolution and three topological conditions that must be met by the rules for transport of symbols. We also introduce the concept of a molecular network, which emulates the spatial relationship of the molecules. With hard-sphere random-walk simulations, we show seven topological conditions that the molecular network must satisfy. In the latter half of this chapter, we feature the example of network artificial chemistry (NAC), in which a molecular network is used for spatial representation, and present some of the research results derived since this was first proposed. We begin by introducing a model wherein a cluster formed by folding a node chain functions as an active machine. We then overview studies on rewiring rules for weak edges, which form the basis for network dynamics. We discuss the merits and demerits of the method expressing spatial constraints derived from the network energy, then introduce models devised to circumvent the difficulties: a model that implements active functions (programs) in the nodes and an improved model that implements the programs in agents to allow them to move within the network. The improved model, called “program-flow computing,” is expected to be refined into a new computing model.
Conference Paper
Full-text available
This paper describes an artificial chemistry featuring atoms and molecules moving and colliding in a continuous manner in a viscous fluid filling a 2D cellular space. Chemical reactions are mappings of discrete cellular configurations to parameterized actions on atoms. Actions allow atom creation and destruction, bonding and unbonding to make and break molecules, orientation, type change, and propulsion. Actions are easily added in this extensible model. An example involving a complex "foraging" reaction is provided as a demonstration of the capabilities of the framework. The reaction rules can be evolved by a genetic algorithm to exhibit a desired set of reactions. A portion of the foraging reaction was evolved to demonstrate this.
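The "reactions as mappings" idea above can be sketched as a lookup from a discrete local configuration (here just the sorted pair of colliding atom types) to a parameterized action. All names and the rule table are illustrative assumptions, not the paper's rule set:

```python
# Sketch: a reaction table maps a collision configuration to an action
# tagged with parameters, mirroring the extensible action vocabulary
# (bond, type change, propulsion, ...) described in the abstract.

ACTIONS = {
    ("a", "b"): ("bond", {}),                  # make a molecule
    ("b", "c"): ("type_change", {"to": "d"}),  # rewrite an atom's type
    ("c", "c"): ("propel", {"impulse": 1.0}),  # push the pair apart
}

def reaction_for(atom1, atom2):
    """Look up the action triggered when two atom types collide."""
    key = tuple(sorted((atom1, atom2)))
    return ACTIONS.get(key, ("none", {}))
```

Because new entries can simply be added to the table, this structure also suggests how a genetic algorithm could evolve the rule set, as the paper demonstrates.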
Chapter
Although spatial structures can play a crucial role in chemical systems and can drastically alter the outcome of reactions, the traditional framework of artificial chemistry is a well-stirred tank reactor with no spatial representation in mind. Advanced method development in physical chemistry has made a class of models accessible to the realms of artificial chemistry that represent reacting molecules in a coarse-grained fashion in continuous space. This chapter introduces the mathematical models of Brownian dynamics (BD) and dissipative particle dynamics (DPD) for molecular motion and reaction. It reviews calibration procedures, outlines the computational algorithms, and summarizes examplary applications. Four different platforms for BD and DPD simulations are presented that differ in their focus, features, and complexity.
Article
Self-replication is a process critical to natural and artificial life, but has been investigated to date mostly in simulation and in abstract systems. The near absence of physical demonstrations of self-replication is due primarily to the lack of a physical substrate in which self-replication can be implemented. This paper proposes a substrate composed of simple modular units, in which both simple and complex machines can construct and be constructed by other machines in the same substrate. A number of designs, both hand crafted and evolved, are proposed.
Article
Full-text available
This article lists fourteen open problems in artificial life, each of which is a grand challenge requiring a major advance on a fundamental issue for its solution. Each problem is briefly explained, and, where deemed helpful, some promising paths to its solution are indicated.
Article
Full-text available
Advances in directed evolution and membrane biophysics make the synthesis of simple living cells, if not yet foreseeable reality, an imaginable goal. Overcoming the many scientific challenges along the way will deepen our understanding of the essence of cellular life and its origin on Earth.
Article
Full-text available
In the late 1940s John von Neumann began to work on what he intended as a comprehensive "theory of [complex] automata." He started to develop a book length manuscript on the subject in 1952. However, he put it aside in 1953, apparently due to pressure of other work. Due to his tragically early death in 1957, he was never to return to it. The draft manuscript was eventually edited, and combined for publication with some related lecture transcripts, by Burks in 1966. It is clear from the time and effort that von Neumann invested in it that he considered this to be a very significant and substantial piece of work. However, subsequent commentators (beginning even with Burks) have found it surprisingly difficult to articulate this substance. Indeed, it has since been suggested that von Neumann's results in this area either are trivial, or, at the very least, could have been achieved by much simpler means. It is an enigma. In this paper I review the history of this debate (briefly) and then present my own attempt at resolving the issue by focusing on an analysis of von Neumann's problem situation. I claim that this reveals the true depth of von Neumann's achievement and influence on the subsequent development of this field, and further that it generates a whole family of new consequent problems, which can still serve to inform - if not actually define - the field of artificial life for many years to come.
Article
Full-text available
Biological experience and intuition suggest that self-replication is an inherently complex phenomenon, and early cellular automata models support that conception. More recently, simpler computational models of self-directed replication called sheathed loops have been developed. It is shown here that "unsheathing" these structures and altering certain assumptions about the symmetry of their components leads to a family of nontrivial self-replicating structures, some substantially smaller and simpler than those previously reported. The dependence of replication time and transition function complexity on initial structure size, cell state symmetry, and neighborhood are examined. These results support the view that self-replication is not an inherently complex phenomenon but rather an emergent property arising from local interactions in systems that can be much simpler than is generally believed.
Conference Paper
Full-text available
Acquisition of self-maintenance of cell membranes is an essential step to evolve from molecular to cellular reproduction. In this report, we present a model of artificial chemistry that simulates metabolic reactions, diffusion and repulsion of abstract chemicals in a two-dimensional space to realize the organization of proto-cell structures. It demonstrates that proto-cell structures that maintain and reproduce themselves autonomously emerge from a non-organized initial configuration. The results also suggest that a metabolic system that produces membranes can be selected in the chemical evolution of a pre-cellular stage.
Article
Full-text available
We present a new tierra-inspired artificial life system with local interactions and two-dimensional geometry, based on an update mechanism akin to that of 2D cellular automata. We find that the spatial geometry is conducive to the development of diversity and thus improves adaptive capabilities. We also demonstrate the adaptive strength of the system by breeding cells with simple computational abilities, and study the dependence of this adaptability on mutation rate and population size. 1 Introduction Artificial systems such as Tom Ray's tierra have opened the possibility of studying open-ended evolution in strictly controlled circumstances, allowing experiments that were previously unthinkable as the only alternative was "wetware". The study of evolution in an information-rich artificial environment requires ever larger and faster systems, and present systems are largely restricted by such limits. Distributing tierra simulations over multiple processors is not practical on a large s...
Conference Paper
Full-text available
Von Neumann's architecture for self-reproducing, evolvable machines is described. From this starting point, a number of issues relating to self-reproduction and evolution are discussed. A summary is given of various arguments which have been put forward regarding the superiority of genetic reproduction over self-inspection methods. It is argued that programs in artificial life platforms such as Tierra reproduce genetically rather than by self-inspection (as has previously been claimed). However, the distinction is blurred because significant parts of the reproduction process in Tierran programs are implicitly encoded in the Tierran operating system. The desirable features of a structure suitable for acting as a seed for an open-ended evolutionary process are discussed. It is found that the properties of such a structure are somewhat different to those of programs in Tierra-like platforms. These analyses suggest ways in which the evolvability of individuals in artificial ...
Thesis
Full-text available
This work addresses the question: What are the basic design considerations for creating a synthetic model of the evolution of living systems (i.e. an 'artificial life' system)? It can also be viewed as an attempt to elucidate the logical structure (in a very general sense) of biological evolution. However, with no adequate definition of life, the experimental portion of the work concentrates on more specific issues, and primarily on the issue of open-ended evolution. An artificial evolutionary system called Cosmos, which provides a virtual operating system capable of simulating the parallel processing and evolution of a population of several thousand self-reproducing computer programs, is introduced. Cosmos is related to Ray's established Tierra system [Ray 91], but there are a number of significant differences. A wide variety of experiments with Cosmos, which were designed to investigate its evolutionary dynamics, are reported. An analysis of the results is presented, with particular ...
Article
Full-text available
In 1998, Bedau et al. defined a set of metrics for characterizing the long-term evolutionary dynamics of a system. They argued that no known artificial system demonstrates the unbounded evolutionary activity observed in the fossil record. In response we have developed a series of toy models that approach and eventually succeed in demonstrating unbounded evolutionary activity. The underwhelming success of the models suggests that there must be more to open-ended evolution than just unbounded evolutionary activity as the term is currently defined. We derive some potential extensions to the metrics and requirements for developing open-ended evolution in an artificial system. 1 INTRODUCTION The richness of biological life has never been replicated in our artificial models of evolution. Yet, the challenge to understand the basis of open-ended evolution and to construct an artificial system that demonstrates such a capacity stands as one of the fundamental unsolved problems ...
Conference Paper
Full-text available
We have constructed a simple model of a proto-cell that simulates stochastic dynamics of abstract chemicals on a two-dimensional lattice. We have assumed that chemicals catalyze their reproduction through interaction with each other, and that repulsion occurs between some chemicals. We have shown that chemicals organize themselves into a cell-like structure that maintains its membranes dynamically. Further, we have obtained cells that can divide themselves automatically into daughter cells.
Article
Full-text available
JohnnyVon is an implementation of self-replicating automata in continuous two-dimensional space. Two types of particles drift about in a virtual liquid. The particles are automata with discrete internal states but continuous external relationships. Their internal states are governed by finite state machines but their external relationships are governed by a simulated physics that includes brownian motion, viscosity, and spring-like attractive and repulsive forces. The particles can be assembled into patterns that can encode arbitrary strings of bits. We demonstrate that, if an arbitrary "seed" pattern is put in a "soup" of separate individual particles, the pattern will replicate by assembling the individual particles into copies of itself. We also show that, given sufficient time, a soup of separate individual particles will eventually spontaneously form self-replicating patterns. We discuss the implications of JohnnyVon for research in nanotechnology, theoretical biology, and artificial life.
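The simulated physics sketched in this abstract (Brownian motion, viscosity, spring-like bond forces) can be caricatured in a few lines. The constants and the `Particle` class below are invented for illustration, not JohnnyVon's actual parameters, and the finite-state bonding logic is omitted:

```python
import random

random.seed(0)
DT, DRAG, SPRING, REST, KICK = 0.01, 0.9, 5.0, 1.0, 0.05  # assumed values

class Particle:
    def __init__(self, x, y):
        self.x, self.y = x, y
        self.vx = self.vy = 0.0
        self.partner = None              # bonded particle, if any

def step(particles):
    for p in particles:
        fx = fy = 0.0
        if p.partner is not None:
            dx, dy = p.partner.x - p.x, p.partner.y - p.y
            dist = (dx * dx + dy * dy) ** 0.5 or 1e-9
            f = SPRING * (dist - REST)   # attract beyond, repel inside REST
            fx += f * dx / dist
            fy += f * dy / dist
        fx += random.gauss(0, KICK)      # Brownian kick
        fy += random.gauss(0, KICK)
        p.vx = DRAG * p.vx + DT * fx     # viscosity damps velocity
        p.vy = DRAG * p.vy + DT * fy
        p.x += DT * p.vx
        p.y += DT * p.vy

a, b = Particle(0.0, 0.0), Particle(2.0, 0.0)
a.partner, b.partner = b, a
for _ in range(20000):
    step([a, b])
gap = ((a.x - b.x) ** 2 + (a.y - b.y) ** 2) ** 0.5
print(round(gap, 2))                     # settles near the rest length 1.0
```

The overdamped regime (strong drag, weak kicks) is what lets bonded pairs hold a stable spacing despite the noise.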
Technical Report
JohnnyVon is an implementation of self-replicating automata in continuous two-dimensional space. Two types of particles drift about in a virtual liquid. The particles are automata with discrete internal states but continuous external relationships. Their internal states are governed by finite state machines but their external relationships are governed by a simulated physics that includes brownian motion, viscosity, and spring-like attractive and repulsive forces. The particles can be assembled into patterns that can encode arbitrary strings of bits. We demonstrate that, if an arbitrary 'seed' pattern is put in a 'soup' of separate individual particles, the pattern will replicate by assembling the individual particles into copies of itself. We also show that, given sufficient time, a soup of separate individual particles will eventually spontaneously form self-replicating patterns. We discuss the implications of JohnnyVon for research in nanotechnology, theoretical biology, and artificial life.
Article
Autocatalysis in a nonenzymatic, template-directed condensation has been demonstrated in a system consisting of three oligonucleotides. A simple form of self-replication occurs, albeit only to a small extent: the template T organizes the building blocks A and B in such a way that condensation can occur, leading to a second template molecule. Such a nonenzymatic process has long been sought, because it is postulated as a sine qua non for prebiotic evolution in theories on the origin of life.
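The autocatalysis described above (template T aligning A and B so that their condensation yields a second T) can be sketched as a toy discrete-time mass-action model. The rate constant and step count are invented for illustration, not the paper's measured kinetics:

```python
# Toy mass action for A + B --T--> 2T (autocatalytic template ligation).
def simulate(t, a, b, k=0.05, steps=400):
    history = [t]
    for _ in range(steps):
        rate = k * t * a * b        # rate grows with template concentration
        rate = min(rate, a, b)      # cannot consume more than is present
        a, b, t = a - rate, b - rate, t + rate
        history.append(t)
    return history

h = simulate(t=0.01, a=1.0, b=1.0)
# growth is roughly exponential while A and B are plentiful, then saturates
print(round(h[0], 3), round(h[-1], 3))
```

The self-limiting curve (exponential start, saturation as building blocks run out) is the signature behaviour of such template systems.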
Article
Past cellular automata models of self-replication have generally done only one thing: replicate themselves. However, it has recently been demonstrated that such self-replicating structures can be programmed to also carry out a task during the replication process. Past models of this sort have been limited in that the "program" involved is copied unchanged from parent to child, so that each generation of replicants is executing exactly the same program on exactly the same data. Here we take a different approach in which each replicant receives a distinct partial solution that is modified during replication. Under artificial selection, replicants with promising solutions proliferate while those with failed solutions are lost. We show that this approach can be applied successfully to solve an NP-complete problem, the satisfiability problem. Bounds are given on the cellular space size and time needed to solve a given problem, and simulations demonstrate that this approach works effectively. These and other recent results raise the possibility of evolving self-replicating structures that have a simulated metabolism or that carry out useful tasks.
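The scheme described above, in which each replicant carries a partial solution that is mutated during copying and promising solutions proliferate, can be illustrated with an ordinary evolutionary search on a tiny satisfiability instance. This sketch is not the paper's cellular-automaton machinery; the clause set and parameters are made up:

```python
import random

random.seed(1)
# Each clause is a list of (variable index, required sign) literals.
CLAUSES = [[(0, True), (1, False), (2, True)],
           [(1, True), (3, True), (0, False)],
           [(2, False), (3, False), (1, True)],
           [(0, True), (2, True), (3, True)]]
N = 4  # number of boolean variables

def satisfied(assign):
    """Number of clauses satisfied by a truth assignment."""
    return sum(any(assign[i] == sign for i, sign in c) for c in CLAUSES)

# each "replicant" carries a candidate assignment; copies are mutated
pop = [[random.random() < 0.5 for _ in range(N)] for _ in range(30)]
for generation in range(200):
    pop.sort(key=satisfied, reverse=True)
    best = pop[0]
    if satisfied(best) == len(CLAUSES):
        break
    children = []
    for parent in pop[:15]:                  # the fitter half replicates
        child = parent[:]
        child[random.randrange(N)] ^= True   # one flipped bit per copy
        children.append(child)
    pop = pop[:15] + children                # failed solutions are lost

print(satisfied(best), "of", len(CLAUSES), "clauses satisfied")
```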
Article
Based on the discovery of enzymatic activities in certain RNAs (in E. coli during RNA maturation, and in Tetrahymena with the exon of a self-splicing rRNA), the author postulates a self-replicating system originally composed solely of RNA molecules.
Article
Molecular self-assembly plays a crucial role as a structural and an organizational principle in supramolecular architecture. The key feature of this process is the generation of higher order molecular structures, and is solely determined by the dynamics of the individual molecular objects, characterized by an overall minimum free energy situation. Equally important as the constructional aspect of the formation process, these macromolecular assemblies carry novel functionalities which can be solely observed at the level of the supramolecular aggregates and not at any of the organizational levels below. This paper discusses the formation and successive self-reproduction of membranous compartments in a polar environment in 2D using a lattice gas based simulation technique, the Lattice Molecular Automaton. This method describes realistic physico-chemical interactions as well as chemical reactivity between molecular units via discrete force fields propagated on the lattice. We investigate the formation dynamics of micelles, i.e., organized amphiphilic polymers in polar environment, as well as the kinetics of a concomitant micelle self-reproduction based on the formation of catalytic interfaces closely following in vitro experimental results: Micelle self-reproduction is a complex phenomenon based on concerted dynamics of the individual polymers within the many particle aggregate. All observables, i.e., micelle formation and autocatalytic micelle self-reproduction, are solely based on the properties of the individual chemical objects (amphiphilic polymers in polar environment), and are therefore emergent phenomena generated by the implicitly defined system dynamics. We introduce the formal concept of the emergence of novel functions in dynamical hierarchies and finally discuss these issues within the context of self-reproducing dynamical hierarchies.
Article
Self-reproduction in cellular automata is discussed with reference to Langton's criteria as to what constitutes genuine self-reproduction. It is found that it is possible to construct self-reproducing structures that are substantially less complex than that presented by Langton.
Article
Reviews the performance of computer programs for writing poetry and prose, including MARK V. SHANEY, MELL, POETRY GENERATOR, THUNDER THOUGHT, and ORPHEUS. Discusses the writing principles of the programs. Provides additional information on computer magnification techniques.
Article
Our approach towards “core-and-shell self-reproduction” is reviewed here. With this term, we indicate a process by which the shell-reproduction of a spherically bounded system (micelles or vesicles) proceeds simultaneously with the replication of nucleic acid which is hosted inside the micelles or vesicles. The realization of such a process is seen as a step towards the construction of a synthetic cell model. Two chemical systems are examined here, one based on the polynucleotide phosphorylase-catalyzed synthesis of poly(A) starting from ADP; and the other based on the enzyme Qβ replicase, which is able to catalyze the synthesis of a RNA template. In both cases, production of RNA macromolecules proceeds simultaneously with self-reproduction of the oleic acid/oleate vesicles. One still open question is the determination of the redistribution of the guest macromolecular components during the reproduction of the shell. Within these limits, it is argued that this work offers a system which goes beyond the two approaches to self-replication presented until now in the literature, namely the template self-reproduction of linear sequences of oligonucleotides and the autopoietic shell reproduction of micelles and vesicles.
Article
Past cellular automata models of self-replication have always been initialized with an original copy of the structure that will replicate, and have been based on a transition function that only works for a single, specific structure. This article demonstrates for the first time that it is possible to create cellular automata models in which a self-replicating structure emerges from an initial state having a random density and distribution of individual components. These emergent self-replicating structures employ a fairly general rule set that can support the replication of structures of different sizes and their growth from smaller to larger ones. This rule set also allows “random” interactions of self-replicating structures with each other and with other structures within the cellular automata space. Systematic simulations show that emergence and growth of replicants occurs often and is essentially independent of the cellular space size, initial random pattern of components, and initial density of components, over a broad range of these parameters. The number of replicants and the total number of components they incorporate generally approach quasi-stable values with time.
Article
A computer model is described that explores some of the possible behavior of biological life during the early stages of evolution. The simulation starts with a primordial soup composed of randomly generated sequences of computer operations selected from a basis set of 16 opcodes. With a probability of about 10^-4, these sequences spontaneously generate large and inefficient self-replicating "organisms". Driven by mutations, these protobiotic ancestors more efficiently generate offspring by initially eliminating unnecessary code. Later they increase their complexity by adding additional subroutines as they compete for the system's two limited resources, computer memory and CPU time. The ensuing biology includes replicating hosts, parasites and colonies.
Article
Living cells must maintain their membranes by active metabolism. The membrane is not static but a dynamic structure that has evolved along with its internal reactions. When we reflect on the emergence and evolution of primitive cells, we should not forget the mutual dependency between membranes and metabolic cycles inside the cell. In this paper, we present a simple abstract model of the self-maintaining cell. A metabolic cycle will produce a self-assembling membrane that will enclose the metabolic cycle. We show that a self-maintaining cell has the potential to reproduce itself spontaneously. Further, we have demonstrated two different ways of cellular reproduction depending on the mobility of chemicals. In the first case, a cell releases autocatalytic chemicals that create new cells outside the mother cell. In the second case, a cell grows larger and divides itself into daughter cells by creating a new internal dividing membrane.
Article
Self-reproduction in cellular automata is discussed with reference to the models of von Neumann and Codd. The conclusion is drawn that although the capacity for universal construction is a sufficient condition for self-reproduction, it is not a necessary condition. Slightly more “liberal” criteria for what constitutes genuine self-reproduction are introduced, and a simple self-reproducing structure is exhibited which satisfies these new criteria. This structure achieves its simplicity by storing its description in a dynamic “loop”, rather than on a static “tape”.
Article
Self-reproducing, cellular automata-based systems developed to date broadly fall under two categories; the first consists of machines which are capable of performing elaborate tasks, yet are too complex to simulate, while the second consists of extremely simple machines which can be entirely implemented, yet lack any additional functionality aside from self-reproduction. In this paper we present a self-reproducing system which is completely realizable, while capable of executing any desired program, thereby exhibiting universal computation. Our starting point is a simple self-reproducing loop structure onto which we “attach” an executable program (Turing machine) along with its data. The three parts of our system (loop, program, data) are all reproduced, after which the program is run on the given data. The system reported in this paper has been simulated in its entirety; thus, we attain a viable, self-reproducing machine with programmable capabilities.
Article
Theory of Self-Reproducing Automata
Article
This article describes in detail an implementation of John von Neumann's self-reproducing machine. Self-reproduction is achieved as a special case of construction by a universal constructor. The theoretical proof of the existence of such machines was given by John von Neumann in the early 1950s [6], but was first implemented in 1994, by the author in collaboration with R. Nobili. Our implementation relies on an extension of the state-transition rule of von Neumann's original cellular automaton. This extension was introduced to simplify the design of the constructor. The main operations in our constructor can be mapped into operations of von Neumann's machine.
Article
A population of RNA molecules that catalyze the template-directed ligation of RNA substrates was made to evolve in a continuous manner in the test tube. A simple serial transfer procedure was used to achieve approximately 300 successive rounds of catalysis and selective amplification in 52 hours. During this time, the population size was maintained against an overall dilution of 3 × 10^298. Both the catalytic rate and amplification rate of the RNAs improved substantially as a consequence of mutations that accumulated during the evolution process. Continuous in vitro evolution makes it possible to maintain laboratory "cultures" of catalytic molecules that can be perpetuated indefinitely.
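The dilution figure above invites a quick sanity check: if roughly 300 serial transfers achieved an overall dilution of 3 × 10^298, then each transfer diluted the culture about tenfold, so the ribozyme population had to amplify roughly 10x per round just to persist:

```python
import math

log_total = math.log10(3) + 298       # log10 of the overall dilution 3 × 10^298
per_round = 10 ** (log_total / 300)   # ~300 serial transfers
print(round(per_round, 1))            # → 9.9, i.e. about a 10-fold dilution per round
```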
Article
Contemporary classics on the major approaches to emergence found in contemporary philosophy and science, with chapters by such prominent scholars as John Searle, Stephen Weinberg, William Wimsatt, Thomas Schelling, Jaegwon Kim, Daniel Dennett, Herbert Simon, Stephen Wolfram, Jerry Fodor, Philip Anderson, David Chalmers, and others. Emergence, largely ignored just thirty years ago, has become one of the liveliest areas of research in both philosophy and science. Fueled by advances in complexity theory, artificial life, physics, psychology, sociology, and biology and by the parallel development of new conceptual tools in philosophy, the idea of emergence offers a way to understand a wide variety of complex phenomena in ways that are intriguingly different from more traditional approaches. This reader collects for the first time in one easily accessible place classic writings on emergence from contemporary philosophy and science. The chapters, by such prominent scholars as John Searle, Stephen Weinberg, William Wimsatt, Thomas Schelling, Jaegwon Kim, Robert Laughlin, Daniel Dennett, Herbert Simon, Stephen Wolfram, Jerry Fodor, Philip Anderson, and David Chalmers, cover the major approaches to emergence. Each of the three sections ("Philosophical Perspectives," "Scientific Perspectives," and "Background and Polemics") begins with an introduction putting the chapters into context and posing key questions for further exploration. A bibliography lists more specialized material, and an associated website (http://mitpress.mit.edu/emergence) links to downloadable software and to other sites and publications about emergence. ContributorsP. W. Anderson, Andrew Assad, Nils A. Baas, Mark A. Bedau, Mathieu S. Capcarrère, David Chalmers, James P. Crutchfield, Daniel C. Dennett, J. Doyne Farmer, Jerry Fodor, Carl Hempel, Paul Humphreys, Jaegwon Kim, Robert B. Laughlin, Bernd Mayer, Brian P. McLaughlin, Ernest Nagel, Martin Nillson, Paul Oppenheim, Norman H. 
Packard, David Pines, Steen Rasmussen, Edmund M. A. Ronald, Thomas Schelling, John Searle, Robert S. Shaw, Herbert Simon, Moshe Sipper, Stephen Weinberg, William Wimsatt, and Stephen Wolfram Bradford Books imprint
Article
We constructed a simple evolutionary system, "evoloop," on a deterministic nine-state five-neighbor cellular automata (CA) space by improving the structurally dissolvable self-reproducing loop we had previously contrived [14] after Langton's self-reproducing loop [7]. The principal role of this improvement is to enhance the adaptability (a degree of the variety of situations in which structures in the CA space can operate regularly) of the self-reproductive mechanism of loops. The experiment with evoloop met with the intriguing result that, though no mechanism was explicitly provided to promote evolution, the loops varied through direct interaction of their phenotypes, smaller individuals were naturally selected thanks to their quicker self-reproductive ability, and the whole population gradually evolved toward the smallest ones. This result gives a unique example of evolution of self-replicators where genotypical variation is caused by precedent phenotypical variation. Such interrelation of genotype and phenotype would be one of the important factors driving the evolutionary process of primitive life forms that might have actually occurred in ancient times.
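The selection dynamic described above (smaller loops replicate faster and take over a bounded space) can be caricatured without any cellular automaton at all. The parameters below are invented; only the qualitative mechanism matches the abstract:

```python
import random

random.seed(0)
CAPACITY = 200
pop = [10] * 20                  # start with twenty size-10 "loops"
for t in range(1, 2001):
    offspring = []
    for s in pop:
        if t % s == 0:           # a size-s loop replicates every s steps
            child = s
            r = random.random()
            if r < 0.05 and s > 2:
                child = s - 1    # rare shrinking mutation
            elif r < 0.10:
                child = s + 1    # rare growing mutation
            offspring.append(child)
    pop += offspring
    if len(pop) > CAPACITY:      # bounded arena: random culling
        pop = random.sample(pop, CAPACITY)

mean_size = sum(pop) / len(pop)
print(round(mean_size, 2))       # drifts well below the initial size of 10
```

Even with unbiased mutation, the quicker replication of small loops is enough to pull the population toward the minimum viable size, mirroring evoloop's result.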
Article
Textbooks often assert that life began with specialized complex molecules, such as RNA, that are capable of making their own copies. This scenario has serious difficulties, but an alternative has remained elusive. Recent research and computer simulations have suggested that the first steps toward life may not have involved biopolymers. Rather, non-covalent protocellular assemblies, generated by catalyzed recruitment of diverse amphiphilic and hydrophobic compounds, could have constituted the first systems capable of information storage, inheritance and selection. A complex chain of evolutionary events, yet to be deciphered, could then have led to the common ancestors of today's free-living cells, and to the appearance of DNA, RNA and protein enzymes.
Article
Amoeba is a computer model designed to facilitate the study of the origin and evolution of digital life. Specifically, an initially disordered system, consisting of random sequences of machine instructions, self-organizes into an ordered system containing self-replicating programs. The current version of Amoeba broadens the original system's capability by using a basis set of 32 machine instructions that is computationally universal. In addition, Amoeba uses a set of 64 address labels, each of which is randomly assigned to a machine instruction each time a sequence is randomly created. This eliminates the constraint that occurs when the complements of predefined codons are used for addressing. A more open-ended system results because programs can now form subroutines that are arranged in an arbitrary manner.
Article
This article reviews the growing body of scientific work in artificial chemistry. First, common motivations and fundamental concepts are introduced. Second, current research activities are discussed along three application dimensions: modeling, information processing, and optimization. Finally, common phenomena among the different systems are summarized. It is argued here that artificial chemistries are "the right stuff" for the study of prebiotic and biochemical evolution, and they provide a productive framework for questions regarding the origin and evolution of organizations in general. Furthermore, artificial chemistries have a broad application range of practical problems, as shown in this review.
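Artificial chemistries of the kind surveyed here are often formalized as a triple (S, R, A): a set of molecules S, a set of reaction rules R, and an algorithm A that applies them. A minimal well-stirred sketch with invented molecules and rules:

```python
import random

random.seed(2)
S = list("ab" * 50)                   # the molecule soup: 50 a's, 50 b's
R = {("a", "b"): "c",                 # reaction rules: a + b -> c
     ("c", "c"): "a"}                 #                 c + c -> a

def step(soup):
    """The algorithm A: collide two random molecules; react if a rule applies."""
    i, j = random.sample(range(len(soup)), 2)
    product = R.get((soup[i], soup[j])) or R.get((soup[j], soup[i]))
    if product is not None:
        for k in sorted((i, j), reverse=True):   # remove the reactant pair
            soup.pop(k)
        soup.append(product)                     # add the single product

for _ in range(5000):
    if len(S) < 2:
        break
    step(S)
print(len(S), sorted(set(S)))
```

Everything interesting in such systems (replicators, organizations, evolution) emerges from how richer rule sets R interact under the collision algorithm A.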
Article
All life that is known to exist on Earth today and all life for which there is evidence in the geological record seems to be of the same form--one based on DNA genomes and protein enzymes. Yet there are strong reasons to conclude that DNA- and protein-based life was preceded by a simpler life form based primarily on RNA. This earlier era is referred to as the 'RNA world', during which the genetic information resided in the sequence of RNA molecules and the phenotype derived from the catalytic properties of RNA.
Article
We present a new self-reproducing cellular automaton capable of construction and computation beyond self-reproduction. Our automaton makes use of some of the concepts developed by Langton for his self-reproducing automaton, but provides the added advantage of being able to perform independent constructional and computational tasks alongside self-reproduction. Our automaton is capable, like Langton's automaton and with comparable complexity, of simple self-replication, but it also provides (at the cost, naturally, of increased complexity) the option of attaching to the automaton an executable program which will be duplicated and executed in each of the copies of the automaton. After describing in some detail the self-reproduction mechanism of our automaton, we provide a non-trivial example of its constructional capabilities. 1 Introduction The history of self-reproducing cellular automata basically begins with John von Neumann's research in the field of complex self-reproducing machi...
Modular designer chemistries for artificial life
  • K. L. Downing
Downing, K. L. (2001). Modular designer chemistries for artificial life. In L. Spector, E. D. Goodman, A. Wu, W. B. Langdon, H.-M. Voigt, M. Gen, S. Sen, M. Dorigo, S. Pezeshk, M. H. Garzon, & E. Burke (Eds.), Proceedings of the Genetic and Evolutionary Computation Conference (GECCO-2001) (pp. 845– 852). San Francisco, CA: Morgan Kaufmann.
John von Neumann and the evolutionary growth of complexity: Looking backwards, looking forwards...
  • B. McMullin
McMullin, B. (2000). John von Neumann and the evolutionary growth of complexity: Looking backwards, looking forwards... In M. A. Bedau, J. S. McCaskill, N. H. Packard, & S. Rasmussen (Eds.), Artificial Life VII: Proceedings of the seventh international conference (pp. 467–476). Cambridge, MA: MIT Press.
Model of self-replicating cell capable of self-maintenance
  • N. Ono
  • T. Ikegami
Ono, N., & Ikegami, T. (1999). Model of self-replicating cell capable of self-maintenance. In D. Floreano, J.-D. Nicoud, & F. Mondada (Eds.), Proceedings of the Fifth European Conference on Artificial Life (ECAL99) (pp. 399–406). Lausanne, Switzerland: Springer.
Creativity in evolution: Individuals, interactions and environment
  • T. Taylor
Taylor, T. (1999). Creativity in evolution: Individuals, interactions and environment. In P. Bentley & D. Corne (Eds.), Proceedings of the AISB'99 Symposium on Creative Evolutionary Systems, The Society for the Study of Artificial Intelligence and Simulation of Behaviour. Edinburgh: Morgan Kaufmann.