Article

The coreworld: Emergence and evolution of cooperative structures in a computational chemistry


Abstract

We have developed an artificial chemistry in the computer core, in which one is able to evolve assembler-automaton code without any predefined evolutionary path. The core simulator in the present version is one-dimensional, is updated in parallel, allows instructions to communicate only locally, and is continuously subjected to noise. The system also has a notion of local computational resources. We see different evolutionary paths depending on the specified parameters and on the level of complexity, measured as the distance from the initially randomized core. For several initial conditions the system is able to develop extremely viable cooperative structures (organisms?) which totally dominate the core. We have been able to identify seven successive evolutionary epochs, each characterized by different functional properties. Our study demonstrates the successive emergence of complex functional properties in a computational environment.
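The update loop described in the abstract can be sketched in miniature. The following toy model is an assumption-laden simplification, not the paper's actual system: a circular core of integer "instructions", a couple of execution pointers that copy instructions to neighbouring cells, and copy noise standing in for the mutation mechanism (the real simulator executes a much richer Redcode-like instruction set).

```python
import random

CORE_SIZE = 64       # cells in the circular core (toy scale; an assumption)
N_OPCODES = 8        # size of the toy "instruction set"
MUTATION_RATE = 0.05 # probability a copy is hit by noise

def step(core, pointers, rng):
    """One parallel update: every pointer reads its current cell and
    copies it to the next cell (local communication only); the copy is
    randomized with probability MUTATION_RATE (continuous noise)."""
    new_core = core[:]                    # read old state, write new state
    for i, p in enumerate(pointers):
        instr = core[p]
        if rng.random() < MUTATION_RATE:  # imperfect copy
            instr = rng.randrange(N_OPCODES)
        new_core[(p + 1) % CORE_SIZE] = instr
        pointers[i] = (p + 1) % CORE_SIZE
    return new_core, pointers

rng = random.Random(0)
core = [rng.randrange(N_OPCODES) for _ in range(CORE_SIZE)]  # randomized core
pointers = [0, CORE_SIZE // 2]
for _ in range(100):
    core, pointers = step(core, pointers, rng)
```

Even at this scale, the two ingredients the abstract emphasizes are present: locality (each pointer only writes to its neighbouring cell) and noise (the imperfect copy), which together drive the drift away from the initially randomized core.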


... A mathematical description of various hysteretic models can be found in [35,36]. Recently, the Preisach hysteretic model has been adopted to describe the response behavior of smart materials, such as shape-memory alloys [37,38]. The model is significantly versatile in representing diverse hysteretic patterns, and even capable of capturing minor loops present in many physical phenomena. ...
... Overall, the normalized error of Eq. (38) can be estimated for a given system of SDEs of Eq. (4) for assessing the relative accuracy of the approximate response transition PDF both for various x_f values and for various time instants t_f. ...
... The N-dimensional circulant system of the form ẋ_i = φ(x_i, x_{i+1}, …, x_{i+N−1}), where φ(a_1, a_2, …, a_N) = sin(a_2), is typically called the labyrinth model [35][36][37], and has been used extensively in diverse applications [38][39][40] for representing auto-catalytic systems. In the following example, a three-dimensional version of the labyrinth model given by Sprott [36] is utilized to assess the accuracy of the approximate response PDF of Eq. (42). ...
... This new relational structure is responsible for new law-like regularities to the behavior of the system at that higher level of organization" (Klee, 1984). Campbell (Campbell, 1974) and Popper (Popper, 1987) use the word "downward causation" to describe the notion of macrodetermination, i.e. the effect of the whole on the parts. For organismic biology, this phenomenon of macrodetermination, whereby the whole subjects the parts to certain ordering constraints, is essential for explaining the living, living systems "being dominated by the interactions of numerous variables, all of which can at once be both cause and effect" (Popper, 1987). ...
... Campbell (Campbell, 1974) and Popper (Popper, 1987) use the word "downward causation" to describe the notion of macrodetermination, i.e. the effect of the whole on the parts. For organismic biology, this phenomenon of macrodetermination whereby the whole subjects the parts to certain ordering constraints is essential for explaining the living, living systems "being dominated by the interactions of numerous variables, all of which can at once be both cause and effect" (Popper, 1987). According to Weiss, "it is solely the ordered interactions of the molecules -their behavior -that makes them participants in the process of life", which in turn influences the molecules (Weiss, 1970). ...
... When facing all these definitions of emergence, the issue is not that of finding which one is the right one, but finding the domain of application of each definition, some overlapping being likely to occur between the domains. Popper (Popper, 1987) Several other classifications of emergence are proposed by Cariani (Cariani, 1989): ...
Article
Full-text available
Emergence seems to be a central concept in Artificial Life, Cognitive Science, and many other related domains, but its meaning is not really agreed upon. In this paper, we critically review some major conceptions of emergence and give some examples of phenomena that are usually considered emergent. Résumé: The notion of emergence lies at the centre of Cognitive Science, Artificial Life, and many related domains. Yet it is a concept on which opinions do not really agree. We propose here to survey the field, describing the major currents of thought that have formed around emergence, and drawing up a short but representative list of phenomena that are generally considered emergent.
... Most particularly, there is a well-developed theory of their structural properties found in the framework of computational mechanics. In contrast with individuals in previous, related pre-biotic models, such as machine language programs (Rasmussen et al., 1990;Rasmussen et al., 1992;Ray, 1991;Adami and Brown, 1994), tags (Farmer et al., 1986;Bagley et al., 1989), λ-expressions (Fontana, 1991), and cellular automata (Crutchfield and Mitchell, 1995), ε-machines have a well-defined (and calculable) notion of structural complexity. For the cases of machine language and λ-calculus, in contrast, it is known that algorithms do not even exist to calculate such properties since these representations are computation universal (Brookshear, 1989). ...
... The individuals are simply objects whose internal structure determines how they interact. The benefit of this when modeling prebiotic evolution is that there is no assumed distinction between gene and protein (Schrödinger, 1967;von Neumann, 1966) or between data and program (Rasmussen et al., 1990;Rasmussen et al., 1992;Ray, 1991;Adami and Brown, 1994). ...
Preprint
Current analyses of genomes from numerous species show that the diversity of organisms' functional and behavioral characters is not proportional to the number of genes that encode the organism. We investigate the hypothesis that the diversity of organismal character is due to hierarchical organization. We do this with the recently introduced model of the finitary process soup, which allows for a detailed mathematical and quantitative analysis of the population dynamics of structural complexity. Here we show that global complexity in the finitary process soup is due to the emergence of successively higher levels of organization, that the hierarchical structure appears spontaneously, and that the process of structural innovation is facilitated by the discovery and maintenance of relatively noncomplex, but general, individuals in a population.
... Although several evolutionary systems have utilized code generation for bytecode virtual machines closely inspired by machine code (Rasmussen et al., 1990;Ray, 1997), to the author's knowledge, no immersive evolutionary artwork has yet utilized the generation of actual machine code to reconcile open-endedness with real-time performance. ...
... Memory accesses can be guarded by providing run-time bounds checking where compile-time range analysis is not possible. Alternatively, memory addressing consistency can be enforced by more radical strategies, such as the address modulo of Core Wars red code (Rasmussen et al., 1990), or template matching of Tierra (Ray, 1997). ...
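The two guarding strategies contrasted in this excerpt can be illustrated with a small sketch. Both functions are hypothetical helpers written for this note, not code from Core War or Tierra: one folds every address back into the core with a modulo, the other enforces an explicit run-time bounds check.

```python
CORE_SIZE = 8000  # a conventional Core War core size (assumption)

def guard_modulo(addr):
    """Core War style: take every address modulo the core size, so any
    operand, however mutated, still names some valid cell."""
    return addr % CORE_SIZE

def guard_bounds(addr, lo, hi):
    """Run-time bounds check, for when compile-time range analysis is
    not possible: reject accesses outside [lo, hi)."""
    if not (lo <= addr < hi):
        raise IndexError(f"address {addr} outside [{lo}, {hi})")
    return addr
```

The trade-off is visible in miniature: with modulo guarding, `guard_modulo(8005)` silently resolves to cell 5 and no access ever faults, whereas bounds checking surfaces the error but forces the system to handle the failure path.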
Thesis
In the interactive computer arts, any advance that significantly amplifies or extends the limits and capacities of software can enable genuinely novel aesthetic experiences. Within compute-intensive media arts, flexibility is often sacrificed for needs of efficiency, through the total separation of machine code optimization and run-time execution. Compromises based on modular run-time combinations of prior-optimized ‘black box’ components confine results to a pre-defined palette with less computational efficiency overall: limiting the open-endedness of development environments and the generative scope of artworks. This dissertation demonstrates how the trade-off between flexibility and efficiency can be relaxed using reflective meta-programming and dynamic compilation: extending a program with new efficient routines while it runs. It promises benefits of more open-ended real-time systems, more complex algorithms, richer media, and ultimately unprecedented aesthetic experiences. The dissertation charts the significant differences that this approach implies for interactive computational arts, builds a conceptual framework of techniques and requirements to respond to its challenges, and documents supporting implementations in two specific scenarios. The first concentrates on open-ended creativity support within always-on authoring environments for studio work and live coding performance, while the second concerns the open-endedness of generative art through interactive, immersive artificial-life worlds.
... In addition to naturally evolving systems, researchers are exploring an increasing number of synthetic or artificial evolving systems [30][31][32][33][34][35][36] that range from minor modifications of natural systems, such as proteins with non-natural amino acids [37][38][39][40][41], to completely artificial systems such as digital organisms and computer viruses [31,33,34]. We know little about the genotype-phenotype maps of such artificial systems. ...
... Any such comparison should take into account that the genotype-phenotype map of artificial systems has not evolved but, in contrast to that of natural systems, is designed. Here we address these issues with the Avida platform for digital evolution [30]. ...
Article
Full-text available
To understand how evolving systems bring forth novel and useful phenotypes, it is essential to understand the relationship between genotypic and phenotypic change. Artificial evolving systems can help us understand whether the genotype-phenotype maps of natural evolving systems are highly unusual, and it may help create evolvable artificial systems. Here we characterize the genotype-phenotype map of digital organisms in Avida, a platform for digital evolution. We consider digital organisms from a vast space of 10¹⁴¹ genotypes (instruction sequences), which can form 512 different phenotypes. These phenotypes are distinguished by different Boolean logic functions they can compute, as well as by the complexity of these functions. We observe several properties with parallels in natural systems, such as connected genotype networks and asymmetric phenotypic transitions. The likely common cause is robustness to genotypic change. We describe an intriguing tension between phenotypic complexity and evolvability that may have implications for biological evolution. On the one hand, genotypic change is more likely to yield novel phenotypes in more complex organisms. On the other hand, the total number of novel phenotypes reachable through genotypic change is highest for organisms with simple phenotypes. Artificial evolving systems can help us study aspects of biological evolvability that are not accessible in vastly more complex natural systems. They can also help identify properties, such as robustness, that are required for both human-designed artificial systems and synthetic biological systems to be evolvable.
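The figure of 512 phenotypes arises because Avida's standard environment rewards nine one- and two-input Boolean logic tasks, and a phenotype is the subset of tasks an organism performs: 2^9 = 512 subsets. The following sketch shows that counting; the `phenotype` helper and the bit ordering are illustrative assumptions, not Avida's internal representation.

```python
# The nine one- and two-input logic tasks of Avida's standard environment,
# expressed as bitwise operations on integers.
TASKS = {
    "NOT":    lambda a, b: ~a,
    "NAND":   lambda a, b: ~(a & b),
    "AND":    lambda a, b: a & b,
    "ORNOT":  lambda a, b: a | ~b,
    "OR":     lambda a, b: a | b,
    "ANDNOT": lambda a, b: a & ~b,
    "NOR":    lambda a, b: ~(a | b),
    "XOR":    lambda a, b: a ^ b,
    "EQU":    lambda a, b: ~(a ^ b),
}

def phenotype(performed):
    """Encode a phenotype as a 9-bit mask over the task list, giving
    2**9 = 512 possible phenotypes (illustrative encoding only)."""
    return sum(1 << i for i, task in enumerate(TASKS) if task in performed)
```

Under this encoding the non-viable phenotype (no tasks) is mask 0 and the maximally complex one (all nine tasks, including the hardest, EQU) is mask 511, matching the phenotype count in the abstract.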
... In other words, we can expect the particle to move in counterintuitive ways in a maze-like structure while shedding energy in the process. As for the systems' appearance in other disciplines, the system is a simplification of auto-catalytic models which are prevalent in evolution [8], chemical reactions [7], and ecology [15]. ...
Preprint
Full-text available
In this paper we carry out an analytical and numerical analysis of the Thomas system. Physically, this system describes a particle driven by a system of oscillators and damped by a dissipation term b > 0. Mathematically, the system is very interesting because it contains rich dynamics generated by only one bifurcation parameter b. Depending on the value of b, the system passes through a stable regime, limit cycles, an infinite number of bifurcations, a series of fixed points growing to infinity, and chaos containing multiple attractors. Another interesting behaviour arises in the limit as b goes to zero, i.e. when there is no dissipation term: the system then contains an infinite number of fixed points and behaves like a Brownian motion.
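The Thomas system discussed above is commonly written as the cyclically symmetric equations ẋ = sin y − b x, ẏ = sin z − b y, ż = sin x − b z. A minimal fixed-step RK4 integration sketch follows; the parameter value b = 0.19 and the step size are assumptions chosen to sit in the chaotic regime commonly reported in the literature.

```python
import math

def thomas_rhs(state, b):
    """Cyclically symmetric Thomas system: dx/dt = sin(y) - b*x, etc."""
    x, y, z = state
    return (math.sin(y) - b * x,
            math.sin(z) - b * y,
            math.sin(x) - b * z)

def rk4_step(state, b, dt):
    """One classical fourth-order Runge-Kutta step."""
    def shift(u, k, s):
        return tuple(ui + s * ki for ui, ki in zip(u, k))
    k1 = thomas_rhs(state, b)
    k2 = thomas_rhs(shift(state, k1, dt / 2), b)
    k3 = thomas_rhs(shift(state, k2, dt / 2), b)
    k4 = thomas_rhs(shift(state, k3, dt), b)
    return tuple(u + dt / 6 * (a + 2 * c + 2 * d + e)
                 for u, a, c, d, e in zip(state, k1, k2, k3, k4))

b = 0.19                  # assumed dissipation; small b gives chaos
state = (0.1, 0.0, 0.0)
for _ in range(5000):
    state = rk4_step(state, b, 0.05)
```

Because the sine terms are bounded and b > 0 dissipates energy, trajectories stay confined roughly within |x| ≲ 1/b, which is why the dissipative regime has a bounded attractor while the b = 0 limit can wander diffusively.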
... Barricelli's ideas were forgotten until the days of Artificial Life in the 1980s [295][296][297]. Two crucial developments attracted renewed attention to the evolution of virtual agents: (a) the formalisation and propagation of computer viruses and (b) the creation of computer ecologies. ...
Preprint
Full-text available
It has been argued that the historical nature of evolution makes it a highly path-dependent process. Under this view, the outcome of evolutionary dynamics could have resulted in organisms with different forms and functions. At the same time, there is ample evidence that convergence and constraints strongly limit the domain of the potential design principles that evolution can achieve. Are these limitations relevant in shaping the fabric of the possible? Here, we argue that fundamental constraints are associated with the logic of living matter. We illustrate this idea by considering the thermodynamic properties of living systems, the linear nature of molecular information, the cellular nature of the building blocks of life, multicellularity and development, the threshold nature of computations in cognitive systems, and the discrete nature of the architecture of ecosystems. In all these examples, we present available evidence and suggest potential avenues towards a well-defined theoretical formulation.
... This is one of the five most difficult systems (out of the 131 known chaotic systems listed in [22]) to learn from data using sparse kernel flows, when judging based on the symmetric mean absolute percentage error criterion [62]. This is a three-dimensional system representative of a large class of auto-catalytic models that occur frequently in chemical reactions [49], ecology [13], and evolution [30]. It is described by the following equations ...
Preprint
Full-text available
With the advent of supercomputers, multi-processor environments and parallel-in-time (PinT) algorithms offer ways to solve initial value problems for ordinary and partial differential equations (ODEs and PDEs) over long time intervals, a task often unfeasible with sequential solvers within realistic time frames. A recent approach, GParareal, combines Gaussian Processes with traditional PinT methodology (Parareal) to achieve faster parallel speed-ups. The method is known to outperform Parareal for low-dimensional ODEs and a limited number of computer cores. Here, we present Nearest Neighbors GParareal (nnGParareal), a novel data-enriched PinT integration algorithm. nnGParareal builds upon GParareal by improving its scalability properties for higher-dimensional systems and increased processor count. Through data reduction, the model complexity is reduced from cubic to log-linear in the sample size, yielding a fast and automated procedure to integrate initial value problems over long time intervals. First, we provide both an upper bound for the error and theoretical details on the speed-up benefits. Then, we empirically illustrate the superior performance of nnGParareal, compared to GParareal and Parareal, on nine different systems with unique features (e.g., stiff, chaotic, high-dimensional, or challenging-to-learn systems).
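GParareal and nnGParareal build on the plain Parareal iteration, which alternates a cheap serial coarse propagator with fine solves that are independent across sub-intervals (hence parallel in time). A minimal sketch on a scalar decay problem follows; the propagators here are illustrative stand-ins, not the paper's Gaussian-process-corrected scheme.

```python
import math

def parareal(coarse, fine, u0, t0, t1, n, k_iters):
    """Plain Parareal: a serial coarse sweep corrected by fine solves
    that are independent across the n sub-intervals (parallelizable)."""
    ts = [t0 + i * (t1 - t0) / n for i in range(n + 1)]
    u = [u0]
    for i in range(n):                        # initial serial coarse sweep
        u.append(coarse(u[i], ts[i], ts[i + 1]))
    for _ in range(k_iters):
        fine_vals = [fine(u[i], ts[i], ts[i + 1]) for i in range(n)]
        coarse_old = [coarse(u[i], ts[i], ts[i + 1]) for i in range(n)]
        new_u = [u0]
        for i in range(n):                    # serial correction sweep
            c_new = coarse(new_u[i], ts[i], ts[i + 1])
            new_u.append(c_new + fine_vals[i] - coarse_old[i])
        u = new_u
    return u

# Toy scalar problem u' = -u: coarse = one explicit Euler step,
# fine = the exact flow (standing in for an accurate sequential solver).
coarse = lambda u, ta, tb: u - (tb - ta) * u
fine = lambda u, ta, tb: u * math.exp(-(tb - ta))
u = parareal(coarse, fine, 1.0, 0.0, 2.0, n=10, k_iters=5)
```

A standard property of the iteration is visible here: after k corrections, the solution on the first k sub-intervals matches the fine solver exactly, and the remaining intervals converge rapidly when the coarse propagator is reasonable.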
... This project was primarily inspired by three recent efforts: Hutton's artificial chemistry Squirm3 (2002), Smith, Turney and Ewaschuk's JohnnyVon (2003), and my prior work on intercellular signaling (Portegys 2002). Prior to these, Dittrich et al. (2001) compared a wide range of artificial chemistry approaches, including assembler automata (Rasmussen et al. 1990, Ray 1992, Adami and Brown 1994), Ono and Ikegami's autocatalytic membrane formation (1999), and lattice molecular systems (McMullin and Varela 1997), in which the atoms comprising a molecule map discretely to cellular space. ...
Chapter
Full-text available
Proceedings from the ninth International Conference on Artificial Life; papers by scientists of many disciplines focusing on the principles of organization and applications of complex, life-like systems. Artificial Life is an interdisciplinary effort to investigate the fundamental properties of living systems through the simulation and synthesis of life-like processes. The young field brings a powerful set of tools to the study of how high-level behavior can arise in systems governed by simple rules of interaction. Some of the fundamental questions include: What are the principles of evolution, learning, and growth that can be understood well enough to simulate as an information process? Can robots be built faster and more cheaply by mimicking biology than by the product design process used for automobiles and airplanes? How can we unify theories from dynamical systems, game theory, evolution, computing, geophysics, and cognition? The field has contributed fundamentally to our understanding of life itself through computer models, and has led to novel solutions to complex real-world problems across high technology and human society. This elite biennial meeting has grown from a small workshop in Santa Fe to a major international conference. This ninth volume of the proceedings of the international A-life conference reflects the growing quality and impact of this interdisciplinary scientific community. Bradford Books imprint
... Parallel to the maturation of developmental biology as a major biological sub-discipline in the 1990s, accumulating evidence began to clarify how certain developmental properties, at the organismal level, influence a population's ability to respond adaptively to novel environmental challenges. Some of these new observations seemed counterintuitive, such as the finding that imposing certain constraints on variation promoted evolvability in virtual organisms (e.g., Rasmussen et al., 1990). Perhaps more unexpected was the finding that greater robustness (i.e., intrinsic resistance to change) actually promotes a greater build-up of variation and enhances evolvability indirectly (outlined in Kirschner & Gerhart, 1998). ...
Book
Full-text available
Change is the fundamental idea of evolution. Explaining the extraordinary biological change we see written in the history of genomes and fossil beds is the primary occupation of the evolutionary biologist. Yet it is a surprising fact that for the majority of evolutionary research, we have rarely studied how evolution typically unfolds in nature, in changing ecological environments, over space and time. While ecology played a major role in the eventual acceptance of the population genetic viewpoint of evolution in the synthetic era (circa 1918-1956), it held a lesser role in the development of evolutionary theory until the 1980s, when we began to systematically study the evolutionary dynamics of natural populations in space and time. As a result, early evolutionary theory was initially constructed in an abstract vacuum that was unrepresentative of evolution in nature. The subtle synthesis between ecology with evolutionary biology (eco-evo synthesis) over the past 40 years has progressed our knowledge of natural selection dynamics as they are found in nature, thus revealing how natural selection varies in strength, direction, form, and, more surprisingly, level of biological organization. Natural selection can no longer be reduced to lower levels of biological organization (i.e., individuals, selfish genes) over shorter timescales but should be expanded to include adaptation at higher levels and over longer timescales. Long-term and/or emergent evolutionary phenomena, such as multilevel selection or evolvability, have thus become tenable concepts within an evolutionary biology that embraces ecology and spatiotemporal change. Evolutionary biology is currently suspended at an intermediate stage of scientific progress that calls for the organization of all the recent knowledge revealed by the eco-evo synthesis into a coherent and unified theoretical framework. 
This is where philosophers of biology can be of particular use, acting as a bridge between the subdisciplines of biology and inventing new theoretical strategies to organize and accommodate the recent knowledge. Philosophers have recommended transitioning away from outdated philosophies that were originally derived from physics within the philosophical zeitgeist of logical positivism (i.e., monism, reductionism, and monocausation) and toward a distinct philosophy of biology that can capture the natural complexity of multifaceted biological systems within diverse ecosystems—one that embraces the emerging philosophies of pluralism, emergence, and multicausality. Therefore, I see recent advances in ecology, evolutionary biology, and the philosophy of biology as laying the groundwork for another major biological synthesis, what I refer to as the Second Synthesis because, in many respects, it is analogous to the aims and outcomes of the first major biological synthesis (but is notably distinct from the inorganic and contrived progressive movement known as the extended evolutionary synthesis). With the general development of a distinctive philosophy of science, biology has rightfully emerged as an autonomous science. Thus, while the first synthesis legitimized biology, the Second Synthesis autonomized biology and afforded biology its own philosophy, allowing biology to finally realize its full scientific potential.
... Theoretical models based on feedback circuits are critical for understanding cell differentiation and regulatory networks [2] [3]. The system is also applicable in chemistry as a representative of autocatalytic reactions [4], in ecology [5], and in evolution [6]. Under suitable conditions, spatio-temporal patterns are observed for many such oscillators with a nonlinear coupling scheme in [7], while with a linear coupling and nonidentical oscillators in [8]. ...
Preprint
Full-text available
In this letter, we report a numerical study on the collective dynamics of two mutually coupled Thomas oscillators with linear/nonlinear coupling. Thomas system is a biologically motivated system with feedback circuits and extraordinary dynamical features that can lead to the design of novel materials. Our model calculations can explain the diffusion of interacting particles in a fluid for the specific choice of system parameters. In a fluid, frequent momentum transfer between particles keeps them moving with correlated time behaviour, and we treat it as a synchronization process. The linear diffusive coupling is equivalent to weak momentum transfer, leading to conventional dynamics and synchronization. The sinusoidal nonlinear coupling, or harmonic momentum transfer, produces exceptional dynamical features. The coupled system passes through an interval of transient chaos before it settles into a chaotic or limit cycle attractor. When the attractor is chaotic or an unstable transient attractor, the nature of synchronization is complete (directed motion). In contrast, it is either lag, anti-lag, or space lag for a limit cycle. In such situations, the diffusion is due to particles pedalling and eddy/swirling motion on top of translatory motion via transient chaos. Also, the trajectories of the two particles in the state space resemble the chiral phenomenon.
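The linear diffusive coupling described above can be sketched directly: each oscillator feels a force eps·(other − self) on every coordinate. The coupling strength, damping, and step size below are assumptions chosen so that the pair synchronizes; the letter's sinusoidal (nonlinear) coupling variant is not shown.

```python
import math

def thomas_rhs(s, b):
    """Cyclically symmetric Thomas oscillator."""
    x, y, z = s
    return (math.sin(y) - b * x, math.sin(z) - b * y, math.sin(x) - b * z)

def coupled_step(s1, s2, b, eps, dt):
    """Explicit Euler step for two Thomas oscillators with linear
    diffusive coupling eps*(other - self) on every coordinate."""
    f1, f2 = thomas_rhs(s1, b), thomas_rhs(s2, b)
    n1 = tuple(a + dt * (fa + eps * (o - a)) for a, fa, o in zip(s1, f1, s2))
    n2 = tuple(a + dt * (fa + eps * (o - a)) for a, fa, o in zip(s2, f2, s1))
    return n1, n2

s1, s2 = (0.1, 0.0, 0.0), (0.0, 0.2, 0.1)   # distinct initial conditions
for _ in range(20000):                       # integrate to t = 200
    s1, s2 = coupled_step(s1, s2, b=0.2, eps=0.5, dt=0.01)
sync_error = max(abs(a - o) for a, o in zip(s1, s2))  # ~0 once synchronized
```

Because the Jacobian of the Thomas vector field is bounded (its entries are cosines), a coupling strength eps large enough relative to that bound makes the difference between the two trajectories contract, which is the complete-synchronization regime the abstract describes for strong momentum transfer.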
... Theoretical models based on feedback circuits are useful for understanding the phenomena of many real systems, like cell differentiation [19] and regulatory networks [20]. The system represents many autocatalytic chemical reactions [21]. In general, it is suitable for the mathematical modelling of biological systems [22] [23]. ...
Preprint
Full-text available
In this letter, we demonstrate the cyclically symmetric Thomas oscillators as swarmalators and describe their possible collective dynamics. We achieve this by sewing Kuramoto-type phase dynamics to particle dynamics represented by the Thomas model. More precisely, this is equivalent to a non-linear particle aggregation model with cyclic symmetry of coordinates and position-dependent phase dynamics. The non-linear equations describe spatiotemporal patterns of crystalline order and chaotic randomness at two extreme values of the system parameter. This pattern is the outcome of non-linear self-organization, which leads to a new class of turbulent flow: active turbulence. We claim that this model can capture the dynamics of many naturally occurring microorganisms and micro-swimmers. The model described in this letter can be a prototypical model for understanding active systems and may shed light on the possibility of making novel materials (active matter) with exciting biomedical and industrial applications. The key to this is understanding and control over the complex dynamics of active systems, which are out-of-equilibrium systems, and is potentially helpful in making functional materials and nano- and micromachines.
... The b = 0 case, the conservative limit, is an example of threedimensional fractional Brownian motion in a purely deterministic system and it is the only example of this sort where fractional Brownian motion is connected to non-linear feedback circuits [3,6]. The properties, as mentioned earlier, of the system make it suitable for applications in chemistry as representative autocatalytic reactions [7], ecology [8], and in evolution [9]. Spatio-temporal patterns are observed for many such oscillators with a non-linear coupling scheme in [10] while with a linear coupling and non-identical oscillators in [6]. ...
Article
Full-text available
In this letter, we provide a detailed numerical examination of the dynamics of a charged Thomas oscillator in an external magnetic field. We do so by adopting and then modifying the cyclically symmetric Thomas oscillator to study the dynamics of a charged particle in an external magnetic field. These dynamical behaviours for weak and strong field strength parameters fall under two categories; conservative and dissipative. The system shows a complex quasi-periodic attractor whose topology depends on initial conditions for high field strengths in the conservative regime. There is a transition from adiabatic motion to chaos on decreasing the field strength parameter. In the dissipative regime, the system is chaotic for weak field strength and weak damping but shows a limit cycle for high field strengths. Such behaviour is due to an additional negative feedback loop that comes into action at high field strengths and forces the system dynamics to be stable in periodic oscillations. For weak damping and weak field strength, the system dynamics mimic Brownian motion via chaotic walks. We claim that modified Thomas oscillator is a prototypical model to understand the dynamics of an active particle.
... More complex systems were introduced in the 80's to model Darwinian evolution using a new type of artificial life in which organisms described as computer programs could self-replicate, adapt and mutate by natural selection, mostly competing for control of the memory of the computer (e.g. CoreWar [10] and Avida [11,12]). The introduction of these digital organisms to address fundamental biological questions was supported by two main statements. ...
Article
Full-text available
Recent years have witnessed the detection of an increasing number of complex organic molecules in interstellar space, some of them being of prebiotic interest. Disentangling the origin of interstellar prebiotic chemistry and its connection to biochemistry and ultimately, to biology is an enormously challenging scientific goal where the application of complexity theory and network science has not been fully exploited. Encouraged by this idea, we present a theoretical and computational framework to model the evolution of simple networked structures toward complexity. In our environment, complex networks represent simplified chemical compounds and interact optimizing the dynamical importance of their nodes. We describe the emergence of a transition from simple networks toward complexity when the parameter representing the environment reaches a critical value. Notably, although our system does not attempt to model the rules of real chemistry nor is dependent on external input data, the results describe the emergence of complexity in the evolution of chemical diversity in the interstellar medium. Furthermore, they reveal an as yet unknown relationship between the abundances of molecules in dark clouds and the potential number of chemical reactions that yield them as products, supporting the ability of the conceptual framework presented here to shed light on real scenarios. Our work reinforces the notion that some of the properties that condition the extremely complex journey from the chemistry in space to prebiotic chemistry and finally, to life could show relatively simple and universal patterns.
... Inspired by Core War, Rasmussen and colleagues created Core World, in which they introduced the possibility for random mutations when a program copied itself (Rasmussen et al., 1989, 1990). The command used by replicator programs to copy themselves was imperfect, sometimes writing a random instruction instead of copying the intended instruction. ...
Article
Full-text available
Symbiosis, the living together of unlike organisms as symbionts, is ubiquitous in the natural world. Symbioses occur within and across all scales of life, from microbial to macro-faunal systems. Further, the interactions between symbionts are multimodal in both strength and type, can span from parasitic to mutualistic within one partnership, and persist over generations. Studying the ecological and evolutionary dynamics of symbiosis in natural or laboratory systems poses a wide range of challenges, including the long time scales at which symbioses evolve de novo, the limited capacity to experimentally control symbiotic interactions, the weak resolution at which we can quantify interactions, and the idiosyncrasies of current model systems. These issues are especially challenging when seeking to understand the ecological effects and evolutionary pressures on and of a symbiosis, such as how a symbiosis may shift between parasitic and mutualistic modes and how that shift impacts the dynamics of the partner population. In digital evolution, populations of computational organisms compete, mutate, and evolve in a virtual environment. Digital evolution features perfect data tracking and allows for experimental manipulations that are impractical or impossible in natural systems. Furthermore, modern computational power allows experimenters to observe thousands of generations of evolution in minutes (as opposed to several months or years), which greatly expands the range of possible studies. As such, digital evolution is poised to become a keystone technique in our methodological repertoire for studying the ecological and evolutionary dynamics of symbioses. Here, we review how digital evolution has been used to study symbiosis, and we propose a series of open questions that digital evolution is well-positioned to answer.
... Mutation operators must either ensure that mutated labels are syntactically valid, or else cope with an abundance of broken code. These choices result in either a search space that is overly constrained or one that is rugged and difficult to navigate [56]. ...
Article
Full-text available
We introduce and experimentally demonstrate the utility of tag-based genetic regulation, a new genetic programming (GP) technique that allows programs to dynamically adjust which code modules to express. Tags are evolvable labels that provide a flexible mechanism for referencing code modules. Tag-based genetic regulation extends existing tag-based naming schemes to allow programs to “promote” and “repress” code modules in order to alter expression patterns. This extension allows evolution to structure a program as a gene regulatory network where modules are regulated based on instruction executions. We demonstrate the functionality of tag-based regulation on a range of program synthesis problems. We find that tag-based regulation improves problem-solving performance on context-dependent problems; that is, problems where programs must adjust how they respond to current inputs based on prior inputs. Indeed, the system could not evolve solutions to some context-dependent problems until regulation was added. Our implementation of tag-based genetic regulation is not universally beneficial, however. We identify scenarios where the correct response to a particular input never changes, rendering tag-based regulation an unneeded functionality that can sometimes impede adaptive evolution. Tag-based genetic regulation broadens our repertoire of techniques for evolving more dynamic genetic programs and can easily be incorporated into existing tag-enabled GP systems.
... Such properties were often counterintuitive. For example, researchers quickly noticed that imposing certain constraints on the kind of variation generated during development was actually beneficial to a virtual organism's evolvability (e.g., Rasmussen et al. 1990). Similarly, evolvability could not be taken for granted. ...
... Mutation operators must either ensure that mutated labels are syntactically valid, or else cope with an abundance of broken code. These choices result in either a search space that is overly constrained or one that is rugged and difficult to navigate [52]. ...
Preprint
Full-text available
We introduce and experimentally demonstrate tag-based genetic regulation, a new genetic programming (GP) technique that allows evolving programs to conditionally express code modules. Tags are evolvable names that provide a flexible mechanism for labeling and referring to code modules. Tag-based genetic regulation extends existing tag-based naming schemes to allow programs to "promote" and "repress" code modules. This extension allows evolution to structure a program as an arbitrary gene regulatory network where genes are program modules and program instructions mediate regulation. We demonstrate the functionality of tag-based regulation on several diagnostic tasks as well as a more challenging program synthesis problem. We find that tag-based regulation improves problem-solving performance on context-dependent problems where programs must adjust responses to particular inputs over time (e.g., based on local context). We also observe that our implementation of tag-based genetic regulation can impede adaptive evolution when expected outputs are not context-dependent (i.e., the correct response to a particular input remains static over time). Tag-based genetic regulation broadens our repertoire of techniques for evolving more dynamic genetic programs and can easily be incorporated into existing tag-enabled GP systems.
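The core idea of tag-based module lookup with promotion and repression can be sketched as follows. This is a minimal illustration assuming bitstring tags, a matching-bits similarity, and a simple additive regulation offset; the function names and mechanics are simplifying assumptions, not the paper's exact implementation.

```python
def tag_similarity(a, b):
    """Similarity of two fixed-width bitstring tags: number of matching bits."""
    return sum(1 for x, y in zip(a, b) if x == y)

def resolve_module(call_tag, modules, regulation):
    """Pick the module whose tag best matches `call_tag`.

    `modules` maps tag -> module; `regulation` maps tag -> a promotion (+)
    or repression (-) offset added to the raw similarity, so evolved
    regulation can re-route which module a given call expresses.
    """
    best, best_score = None, float("-inf")
    for tag, module in modules.items():
        score = tag_similarity(call_tag, tag) + regulation.get(tag, 0)
        if score > best_score:
            best, best_score = module, score
    return best
```

The key property is that the same call tag can resolve to different modules depending on the regulatory state, which is how expression patterns become context-dependent.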
... Prior to these, Dittrich et al. (2001) compared a wide range of artificial chemistry approaches, including assembler automata (Rasmussen et al. 1990, Ray 1992, Adami and Brown 1994), Ono and Ikegami's autocatalytic membrane formation (1999), and lattice molecular systems (McMullin and Varela 1997), in which the atoms comprising a molecule map discretely to cellular space. ...
Conference Paper
Full-text available
This paper instantiates an architecture for an artificial chemistry featuring continuous physics and discrete chemical reactions. A system of bonded complementary molecular strands replicates in the presence of a catalyst. The catalyst causes the strands to disengage; each strand subsequently replicates its missing complement by bonding to free atoms.
... The N-dimensional circulant system ȧ_i = φ(a_i, a_{i+1}, …, a_{i+N−1}), where φ(a_1, a_2, …, a_N) = sin(a_2) (that is, each state is driven by the sine of its cyclic successor), is typically called the labyrinth model [35][36][37], and has been used extensively in diverse applications [38][39][40] for representing auto-catalytic systems. In the following example, a three-dimensional version of the labyrinth model given by Sprott [36] is utilized to assess the accuracy of the approximate response PDF of Eq. (42). ...
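Under the common reading of the circulant form, ȧ_i = sin(a_{(i+1) mod N}), the three-dimensional labyrinth system can be integrated with a simple explicit Euler scheme. This is a sketch for illustration: the step size, duration, and the exact circulant form are assumptions taken from the snippet above, not the paper's numerical setup.

```python
import numpy as np

def labyrinth_step(a, dt):
    """One explicit Euler step of the circulant system da_i/dt = sin(a_{i+1})."""
    return a + dt * np.sin(np.roll(a, -1))  # roll(-1) pairs a_i with a_{(i+1) mod N}

def simulate(a0, dt=0.01, steps=2000):
    """Integrate the labyrinth model, returning the full trajectory."""
    a = np.asarray(a0, dtype=float)
    traj = np.empty((steps + 1, a.size))
    traj[0] = a
    for k in range(steps):
        a = labyrinth_step(a, dt)
        traj[k + 1] = a
    return traj
```

Because the drift is bounded (|sin| ≤ 1), trajectories wander through the "labyrinth" of unstable fixed points without diverging, which is what makes this system a convenient benchmark for approximate response PDFs.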
Article
Full-text available
An approximate solution technique is developed for a class of coupled multi-dimensional stochastic differential equations with nonlinear drift and constant diffusion coefficients. Relying on a Wiener path integral formulation and employing the Cauchy-Schwarz inequality , an approximate closed-form expression for the joint response process transition probability density function is determined. Next, the accuracy of the approximation is further enhanced by proposing a more versatile closed-form expression with additional "degrees of freedom"; that is, parameters to be determined. To this aim, an error minimization problem related to the corresponding Fokker-Planck equation is formulated and solved. Several diverse numerical examples are considered for demonstrating the reliability of the herein developed solution technique, which requires minimal computational cost for determining the joint response transition probability density function and exhibits satisfactory accuracy as compared with pertinent Monte Carlo simulation data.
... Many of the important features of Avida derive indirectly from the system called Coreworld (Rasmussen et al., 1990). The Coreworld system itself was inspired by Core War (Dewdney, 1984), which was released as a computer game aimed mainly at programmers. ...
Thesis
The seminal architecture of machine self-reproduction originally formulated by John von Neumann underpins the mechanism of self-reproduction equipped with genotype and phenotype. In this thesis, initially, a hand-designed prototype von Neumann style self-reproducer as an ancestor is described within the context of the artificial life system Avida. The behaviour of the prototype self-reproducer is studied in search of evolvable genotype-phenotype mapping that may potentially give rise to evolvable complexity. A finding of immediate degeneration of the prototype into a self-copying mode of reproduction requires further systematic analysis of mutational pathways. Through demarcating a feasible and plausible characterisation and classification of strains, the notion of viability is revisited, which ends up being defined as quantitative potential for exponential population growth. Based on this, a framework of analysis of mutants' evolutionary potential is proposed, and, subsequently, the implementation of an enhanced version of the standard Avida analysis tool for viability analysis as well as the application of it to the prototype self-reproducer strain are demonstrated. Initial results from a one-step single-point-mutation space of the prototype, and further, from a multi-step mutation space, are presented. In the particular case of the analysis of the prototype, the majority of mutants unsurprisingly turn out to be simply infertile, without viability; whereas mutants that prove to be viable are a minority. Nevertheless, by and large, it is pointed out that distinguishing reproduction modes algorithmically is still an open question, much less finer-grained distinction of von Neumann style self-reproducers. Including this issue, specific limitations of the enhanced analysis are discussed for future investigation in this direction.
... Such properties were often counterintuitive. For example, researchers quickly noticed that imposing certain constraints on the kind of variation generated during development was actually beneficial to a virtual organism's evolvability (e.g., Rasmussen et al. 1990). Similarly, evolvability could not be taken for granted. ...
Chapter
Evolvability, the ability of a biological system to respond to selection, has recently become a key concept in evolutionary developmental biology and an integral part of the vocabulary of a budding extended evolutionary synthesis. While some of the theoretical principles behind the evolvability of complex organisms have been established, there are also several aspects of it that remain controversial. How does evolvability itself evolve? Is evolvability constrained by mutation? Can current definitions account for evolutionary innovations? Here, I will describe some of the research programs dedicated to the study of evolvability of complex organisms. I will then establish its relationship with modularity and robustness and conclude with questions about the nature of evolvability that remain unresolved. My aim is to show that research in evolvability has become integrative in nature and that this change has been aided by an increasing incorporation of the genotype-to-phenotype map into the variation-based evolutionary theory.
... We note that although bytecode generation is well explored in artificial life [22,23], run-time machine code generation has rarely been explored for interactive generative art. We have elsewhere indicated the potential for exploratory data visualization within the constraints of immersive performance [33]. ...
Conference Paper
Full-text available
We document techniques and insights gained through the creation of interactive visualizations of biologically-inspired complex systems that have been exhibited as mixed-reality art installations since 2007. A binding theme is the importance of endogenous accounts: that all perceivable forms have dynamic ontological capacities within the world; that the simulated world is able to autonomously originate; that as a result interaction can lead to exploratory discovery; and that visitors become part of the ecosystem, both through immersive display and through interactions that induce presence. Details of how each of these components have been applied in the visualization, sonification, and interaction design are given with specific examples of prototypes and exhibited installations.
... Building upon the framework of Core Wars, Steen Rasmussen designed a different program called "Coreworld" (Rasmussen, 1990). Random noise was added to the system, incorporated within the MOV command, which copies instructions from one location in memory to another. ...
Thesis
Full-text available
John von Neumann first presented his theory of machine self reproduction in the late 1940’s, in which he described a machine capable of performing the logical steps necessary to accommodate self reproduction, and provided an explanation in principle for how arbitrarily complex machines can construct other (offspring) machines of equal or even greater complexity. In this thesis, a machine having the von Neumann architecture for self reproduction is designed to operate within the computational world of Tierra. This design implements a (mutable) genotype-phenotype mapping during reproduction, and acts as an exploratory model to observe the phenomena which may arise with such a system. A substitution mapping was chosen to carry out the genotype-phenotype mapping, and two specific implementations of a substitution mapping were investigated, via the use of a look-up table and a translation table. During implementation of the look-up table, preliminary experiments showed a degeneration to self copiers where a lineage of von Neumann style self reproducers degenerated into self copiers. Further experiments showed that a particular phenomenon emerges, where pathological constructors quickly develop, which can ultimately lead to total ecosystem collapse. If redundancy is introduced to the genotype-phenotype mapping, certain inheritable perturbations (mutations) prove to be non-reversible via a change to the genotype, which leads to a bias in the evolution of the genotype-phenotype mapping, consistently resulting in the loss of any target symbols from the mapping which are not vital for reproduction. It demonstrated how instances of Lamarckian inheritance may occur, which allowed these genetically "non-reversible" perturbations to be reversed, but only when accompanied by a very specific perturbation to the phenotype. The underlying dynamics of the chosen coding system was studied in order to better understand why these phenomena occur.
When implementing a translation table, the space of possible mutations to the genotype-phenotype mapping was investigated and the same phenomena observed, where non-vital symbols were lost from the mapping, and an instance of Lamarckian inheritance is necessary in order to introduce symbols to the mapping.
... Also noteworthy is Alexander Wait's project Quantum Coreworld, first reported in 2004 [132]. The basic system was inspired by Rasmussen et al.'s early ALife system Coreworld [92], with the addition of "physics" inspired by quantum mechanics. The program was written in the C language and ran continually on a web server. ...
Article
Full-text available
We present a survey of the first 21 years of web-based artificial life (WebAL) research and applications, broadly construed to include the many different ways in which artificial life and web technologies might intersect. Our survey covers the period from 1994, when the first WebAL work appeared, up to the present day, together with a brief discussion of relevant precursors. We examine recent projects, from 2010-2015, in greater detail in order to highlight the current state of the art. We follow the survey with a discussion of common themes and methodologies that can be observed in recent work and identify a number of likely directions for future work in this exciting area. THIS PAPER IS FREELY AVAILABLE OPEN ACCESS AT http://www.mitpressjournals.org/doi/abs/10.1162/ARTL_a_00211
... So-called "coreworlds" or "automata chemistries" denote a loosely defined set of related, but distinct, evolutionary computational systems. Examples include Coreworld (Rasmussen et al. 1990), Tierra, Avida (Adami and Brown 1994) and Stringmol (Hickinbotham et al. 2010, 2016). These are conceptually, or superficially, similar to GP systems, in that the Darwinian individuals are computational processes (executing programs) hosted in some shared, or at least interconnected, memory system. ...
Article
Full-text available
The open-endedness of a system is often defined as a continual production of novelty. Here we pin down this concept more fully by defining several types of novelty that a system may exhibit, classified as variation, innovation, and emergence. We then provide a meta-model for including levels of structure in a system’s model. From there, we define an architecture suitable for building simulations of open-ended novelty-generating systems and discuss how previously proposed systems fit into this framework. We discuss the design principles applicable to those systems and close with some challenges for the community.
... We note that although bytecode generation is well explored in artificial life [22,23], run-time machine code generation has rarely been explored for interactive generative art. We have elsewhere indicated the potential for exploratory data visualization within the constraints of immersive performance [33]. ...
Article
Full-text available
Since 2007, Graham Wakefield and Haru Ji have looked to nature for inspiration as they have created a series of "artificial natures," or interactive visualizations of biologically inspired complex systems that can evoke nature-like aesthetic experiences within mixed-reality art installations. This article describes how they have applied visualization, sonification, and interaction design in their work with artificial ecosystems and organisms using specific examples from their exhibited installations.
... Objective-free evolution as well as self-replication have been studied in Artificial Life since Rasmussen's [15] and Ray's [16] work. Such research primarily investigates evolutionary dynamics in the absence of tasks, but as a result of implicit or environmental criteria that impact the ability to spread genomes through the population. ...
Conference Paper
Full-text available
The MONEE framework endows collective adaptive robotic systems with the ability to combine environment- and task-driven selection pressures: it enables distributed online algorithms for learning behaviours that ensure both survival and accomplishment of user-defined tasks. This paper explores the trade-off between these two requirements that evolution must establish when the task is detrimental to survival. To this end, we investigate experiments with populations of 100 simulated robots in a foraging task scenario where successfully collecting resources negatively impacts an individual's remaining lifetime. We find that the population remains effective at the task of collecting pucks even when the negative impact of collecting a puck is as bad as halving the remaining lifetime. A quantitative analysis of the selection pressures reveals that the task-based selection exerts a higher pressure than the environment.
... In the original Core War game, the diversity of organisms could not increase, and hence no evolution was possible. Rasmussen then designed a system similar to Core War in which the command that copied instructions was flawed and would sometimes write a random instruction instead of the one intended [33]. This flawed copy command introduced mutations into the system, and thus the potential for evolution. ...
... Avida is a computational evolution system that has been in development by Adami et al. since 1994 [1,4]. Inspired by its predecessors including Coreworld [6] and Tierra [7], Avida is an abstraction of a typical distributed or cluster computer. Each node comprises a virtual CPU running on memory, and the CPU has components such as several registers and stacks and various heads to work with. ...
Conference Paper
The theory of machine self-reproduction formalised by John von Neumann mirrors real living organisms' self-reproduction, equipped with genotype and phenotype. However, within such a simulated world as Avida, this particular style of self-reproduction has not been previously studied. In an attempt to characterise the von Neumann style self-reproducer in a computational system, we have implemented a novel seed program that self-reproduces using von Neumann's architecture. We expected that distinctly different evolutionary dynamics of organisms in the system would be observed, specifically including the possibility of mutationally altered genotype-phenotype mapping. However, what we have observed is degenerative displacement by self-copiers, which are conventional self-reproducers in the system. The mutational ease of this degeneration was not anticipated, although we knew the selective advantage that such self-copiers intrinsically would have in the system.
... It is a model of single active machines, not of mutually interacting machines mutually defining their properties. This design decision, along with making a less 'brittle' programming language, was made with the aim of overcoming problems in earlier 'Core Wars' implementations (e.g., [25]), where mutations mostly just destroyed the system. We believe that the biological inspiration strongly supports mutual modification, however, and that the routes to overcoming the Core Wars issues are a more sophisticated energy model, and a 'softer' language, particularly in respect to binding properties [4]. ...
Article
Full-text available
We model some of the crucial properties of biological novelty generation, and abstract these out into minimal requirements for an ALife system that exhibits constant novelty generation (open ended evolution) combined with robustness. The requirements are an embodied genome that supports run-time metaprogramming ('self modifying code'), generation of multiple behaviours expressible as interfaces, and specialisation via (implicit or explicit) removal of interfaces. The main application of self modifying code to date has been top down, in the branch of Artificial Intelligence concerned with learning to learn. However, here we take the bottom up Artificial Life philosophy seriously, and apply the concept to low level behaviours, in order to develop emergent novelty.
... These individuals are increasingly apt at surviving and procreating in their environment. In particular, adaptation to the environment through open-ended evolution without any task at all has long been studied in artificial life [13] and yielded a variety of methods and algorithms such as TIERRA [14] and followers ( [15][16][17][18][19][20], etc.). Such research investigates evolution as an adaptive process per se, driven by selection pressure that emerges from the interaction between individuals and their environment, just as it does in nature. ...
Article
Full-text available
Embodied evolutionary robotics is a sub-field of evolutionary robotics that employs evolutionary algorithms on the robotic hardware itself, during the operational period, i.e., in an on-line fashion. This enables robotic systems that continuously adapt, and are therefore capable of (re-)adjusting themselves to previously unknown or dynamically changing conditions autonomously, without human oversight. This paper addresses one of the major challenges that such systems face, viz. that the robots must satisfy two sets of requirements. Firstly, they must continue to operate reliably in their environment (viability), and secondly they must competently perform user-specified tasks (usefulness). The solution we propose exploits the fact that evolutionary methods have two basic selection mechanisms: survivor selection and parent selection. This allows evolution to tackle the two sets of requirements separately: survivor selection is driven by the environment and parent selection is based on task-performance. This idea is elaborated in the Multi-Objective aNd open-Ended Evolution (monee) framework, which we experimentally validate. Experiments with robotic swarms of 100 simulated e-pucks show that monee does indeed promote task-driven behaviour without compromising environmental adaptation. We also investigate an extension of the parent selection process with a 'market mechanism' that can ensure equitable distribution of effort over multiple tasks, a particularly pressing issue if the environment promotes specialisation in single tasks.
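The two-channel selection idea, with survivor selection driven by the environment and parent selection driven by task performance, can be sketched as a toy generational loop. This is an abstraction under assumed scalar 'viability' and 'performance' fields, not the monee implementation.

```python
import random

def evolve_step(population, rng, survival_threshold=0.2, offspring=20):
    """One generation with decoupled selection channels.

    Each individual is a dict with 'viability' (environment-driven) and
    'performance' (task-driven). Survivor selection removes individuals
    that fail the environment; parent selection then weights reproduction
    by task performance among the survivors only.
    """
    # Survivor selection: the environment decides who stays alive.
    survivors = [ind for ind in population if ind["viability"] >= survival_threshold]
    if not survivors:
        return []
    # Parent selection: task performance decides who reproduces.
    weights = [ind["performance"] for ind in survivors]
    parents = rng.choices(survivors, weights=weights, k=offspring)
    # Offspring inherit parental traits with small Gaussian mutation.
    children = [{"viability": min(1.0, max(0.0, p["viability"] + rng.gauss(0, 0.05))),
                 "performance": min(1.0, max(0.0, p["performance"] + rng.gauss(0, 0.05)))}
                for p in parents]
    return survivors + children
```

Keeping the two pressures in separate mechanisms is the point: an individual that survives the environment but performs poorly contributes few offspring, and a strong performer that fails the environment contributes none.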
... There are simulators that use this approach. Some of them are based on computer code (Rasmussen et al., 1990; Ray, 1991; Adami and Brown, 1994). Computer code forms programs in the core. ...
Conference Paper
Full-text available
We present a new artificial chemistry simulator based on simple physical and chemical rules. The simulator relies on a simplification of bonding and internal energy concepts found in chemistry to model simple, large scale, chemical reactions without delay between computation and visualization. Energy introduction and removal can be controlled in the simulations in order to modulate reaction rates. The simulations demonstrate that with this simplified model of artificial chemistry coupled with the concept of energy, it is possible to see the emergence of specific types of compounds, similar to real molecules.
Article
Full-text available
It has been argued that the historical nature of evolution makes it a highly path-dependent process. Under this view, the outcome of evolutionary dynamics could have resulted in organisms with different forms and functions. At the same time, there is ample evidence that convergence and constraints strongly limit the domain of the potential design principles that evolution can achieve. Are these limitations relevant in shaping the fabric of the possible? Here, we argue that fundamental constraints are associated with the logic of living matter. We illustrate this idea by considering the thermodynamic properties of living systems, the linear nature of molecular information, the cellular nature of the building blocks of life, multicellularity and development, the threshold nature of computations in cognitive systems and the discrete nature of the architecture of ecosystems. In all these examples, we present available evidence and suggest potential avenues towards a well-defined theoretical formulation.
Article
This article is an afterword to the book Rise of the Self-Replicators: Early Visions of Machines, AI and Robots That Can Reproduce and Evolve, coauthored by Tim Taylor and Alan Dorin (2020). The book covered the early history of thought about self-reproducing and evolving machines, from initial speculations in the 17th century up to the early 1960s (from which point onward the more recent history is already well covered elsewhere). This article supplements the material discussed in the book by presenting several relevant sources that have come to the author’s attention since the book was published. The most significant additions to the history are from the German-born, 19th-century inventor and utopian John Adolphus Etzler in the 1830s–1840s, the Hungarian author and satirist Frigyes Karinthy in 1916, and the U.S. mathematician and computer scientist Fred Stahl in 1960. ***** For further information and a link to a free author-formatted version of the paper, see https://www.tim-taylor.com/paper-details/taylor2024afterword.html *****
Conference Paper
Full-text available
This paper describes how a user modeling knowledge base for personalized TV servers can be generated starting from an analysis of lifestyles surveys. The aim of this research is the construction of well-designed stereotypes for generating adaptive electronic program guides (EPGs) which filter the information about TV events depending on the user’s interests.
Article
Full-text available
We present work in 3D printing electric motors from basic materials as the key to building a self-replicating machine to colonise the Moon. First, we explore the nature of the biological realm to ascertain its essence, particularly in relation to the origin of life when the inanimate became animate. We take an expansive view of this to ascertain parallels between the biological and the manufactured worlds. Life must have emerged from the available raw material on Earth and, similarly, a self-replicating machine must exploit and leverage the available resources on the Moon. We then examine these lessons to explore the construction of a self-replicating machine using a universal constructor. It is through the universal constructor that the actuator emerges as critical. We propose that 3D printing constitutes an analogue of the biological ribosome and that 3D printing may constitute a universal construction mechanism. Following a description of our progress in 3D printing motors, we suggest that this engineering effort can inform biology, that motors are a key facet of living organisms and illustrate the importance of motors in biology viewed from the perspective of engineering (in the Feynman spirit of “what I cannot create, I cannot understand”).
Article
Amoeba, a computer platform inspired by the Tierra system, is designed to study the generation of self-replicating sequences of machine operations (opcodes) from a prebiotic world initially populated by randomly selected opcodes. Point mutations drive opcode sequences to become more fit as they compete for memory and CPU time. Significant features of the Amoeba system include the lack of artificial encapsulation (there is no write protection) and a computationally universal opcode basis set. Amoeba now includes two additional features: Pattern-based addressing and injecting entropy into the system. It was previously thought such changes would make it highly unlikely that an ancestral replicator could emerge from a fortuitous combination of randomly selected opcodes. Instead, Amoeba shows a far richer emergence, exhibiting a self-organization phase followed by the emergence of self-replicators. First, the opcode basis set becomes biased. Second, short opcode building blocks are propagated throughout memory space. Finally, prebiotic building blocks can combine to form self-replicators. Self-organization is quantified by measuring the evolution of opcode frequencies, the size distribution of sequences, and the mutual information of opcode pairs.
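The mutual-information measure mentioned in the Amoeba abstract can be sketched for adjacent opcode pairs. This is an illustrative reimplementation in Python, not the Amoeba codebase; treating "opcode pairs" as adjacent positions in a linear sequence is an assumption.

```python
import math
from collections import Counter

def pair_mutual_information(opcodes):
    """Mutual information (in bits) between adjacent opcode pairs.

    High MI indicates self-organized structure (biased building blocks
    propagating through memory) rather than an independent random soup.
    """
    pairs = list(zip(opcodes, opcodes[1:]))
    n = len(pairs)
    joint = Counter(pairs)                    # p(x, y) over adjacent pairs
    first = Counter(x for x, _ in pairs)      # marginal p(x)
    second = Counter(y for _, y in pairs)     # marginal p(y)
    mi = 0.0
    for (x, y), c in joint.items():
        p_xy = c / n
        mi += p_xy * math.log2(p_xy / ((first[x] / n) * (second[y] / n)))
    return mi
```

For an unbiased random soup the measure tends toward zero, while a memory dominated by replicated building blocks pushes it upward, which is one way to quantify the self-organization phase the abstract describes.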
Article
Typogenetics is a formal system designed to study origins of life from a "primordial soup" of DNA molecules, enzymes and other building materials. It was introduced by Hofstadter (1979) in his seminal book Gödel, Escher, Bach: An Eternal Golden Braid. Autoreplicating molecules and systems of mutually replicating and catalyzing molecules (autoreplicators and hypercycles) are modeled in the present paper in a form composed of two strands of symbols. These strands are the vehicle of twofold information: the first corresponds to information that is transferred by strands; the second specifies a process of strand replication. The replicating molecules (strands) used here are created by an approach closely related to evolutionary algorithms. While a small hypercycle of two molecules mutually supporting their reproduction can be created without extreme difficulties, it is nearly impossible to create a hypercycle involving more than 4 autoreplicators at once. This paper demonstrates that larger hypercycles can be created by an optimization and inclusion of new molecules into a smaller hypercycle. Such a sequential construction of hypercycles can substantially reduce the combinatorial complexity in comparison with a simultaneous optimization of single components of a large hypercycle.
Chapter
Full-text available
Artificial life has now become a mature inter-discipline. In this contribution, its roots are traced, its key questions are raised, its main methodological tools are discussed, and finally its applications are reviewed. As part of the growing body of knowledge at the intersection between the life sciences and computing, artificial life will continue to thrive and benefit from further scientific and technical progress on both sides, the biological and the computational. It is expected to take center stage in natural computing.
Conference Paper
This paper is inspired by a vision of self-sufficient robot collectives that adapt autonomously to deal with their environment and to perform user-defined tasks at the same time. We introduce the monee algorithm as a method of combining open-ended (to deal with the environment) and task-driven (to satisfy user demands) adaptation of robot controllers through evolution. A number of experiments with simulated e-pucks serve as proof of concept and show that with monee, the robots adapt to cope with the environment and to perform multiple tasks. Our experiments indicate that monee distributes the tasks evenly over the robot collective without undue emphasis on easy tasks.
Chapter
Full-text available
We introduce the distinctive, self-referential, logic of self-reproduction originally formulated by John von Neumann, and we present some initial results from novel realisations of this abstract architecture, embedded within two computational worlds: Tierra and Avida. In both cases, the von Neumann architecture proves to be evolutionarily fragile, for unanticipated, but relatively trivial, reasons. We briefly discuss some implications, and sketch prospects for further investigation.
Chapter
Sections: Glossary · Definition of the Subject · Introduction · Basic Building Blocks of an Artificial Chemistry · Structure-to-Function Mapping · Space · Theory · Evolution · Information Processing · Future Directions · Bibliography
Article
Quantifying evolution and understanding robustness are best done with a system that is both rich enough to frustrate rigging of the answer and simple enough to permit comparison against either existing systems or absolute measures. Such a system is provided by the self-referential matrix-genome model of replication and translation, based on the concept of operators, which is introduced here. Ideas are also taken from the evolving micro-controller research. This new model replaces micro-controllers with simple matrix operations. These matrices, seen as abstract proteins, operate on abstract genomes, peptides, or other proteins. Studying the evolutionary properties shows that the protein-only hypothesis (proteins as active elements) yields poor evolvability, whereas the RNA-before-protein hypothesis (genomes controlling) exhibits intricate evolutionary dynamics similar to those of the micro-controller model. A simple possible explanation for this surprising difference in behavior is presented. Beyond existing evolutionary models, dynamical and organizational changes or transitions occurring late in long-term experiments are also demonstrated.
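The core idea of matrices as abstract proteins acting on genome-like vectors can be sketched in a few lines (a hypothetical illustration only; the `translate` helper, the matrix dimensions, and the random genome are my assumptions, not the paper's actual model):

```python
import numpy as np

rng = np.random.default_rng(0)

def translate(genome, n=4):
    """Fold the first n*n entries of a 1-D 'genome' array into an
    n x n operator matrix -- the abstract 'protein'."""
    return genome[: n * n].reshape(n, n)

genome = rng.standard_normal(16)   # an abstract genome
protein = translate(genome)        # genome -> abstract protein (operator)
peptide = protein @ genome[:4]     # the protein acting on a genome segment
```

The point of the sketch is only the structural one made in the abstract: the same string of numbers plays both roles, passive data (genome) and active operator (protein), so the system is self-referential.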
Conference Paper
Evolution can be employed for two goals. Firstly, to provide a force for adaptation to the environment as it does in nature and in many artificial life implementations - this allows the evolving population to survive. Secondly, evolution can provide a force for optimisation as is mostly seen in evolutionary robotics research - this causes the robots to do something useful. We propose the MONEE algorithmic framework as an approach to combine these two facets of evolution: to combine environment-driven and task-driven evolution. To achieve this, MONEE employs environment-driven and task-based parent selection schemes in parallel. We test this approach in a simulated experimental setting where the robots are tasked to collect two different kinds of puck. MONEE allows the robots to adapt their behaviour to successfully tackle these tasks while ensuring an equitable task distribution at no cost in task performance through a market-based mechanism. In environments that discourage robots performing multiple tasks and in environments where one task is easier than the other, MONEE's market mechanism prevents the population completely focussing on one task.
Chapter
Full-text available
We discuss ways of defining complexity in physics, and in particular for symbol sequences typically arising in autonomous dynamical systems. We stress that complexity should be distinct from randomness. This leads us to consider the difficulty of making optimal forecasts as one (but not the only) suitable measure. This difficulty is discussed in detail for two different examples: left-right symbol sequences of quadratic maps and 0–1 sequences from 1-dimensional cellular automata iterated just one single time. In spite of the seeming triviality of the latter model, we encounter there an extremely rich structure.
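The distinction between complexity and randomness drawn above can be made concrete with block entropies of symbol sequences (a minimal sketch; the sequences and block length are illustrative choices):

```python
from collections import Counter
from math import log2
import random

def block_entropy(seq, n):
    """Shannon entropy (in bits) of the length-n blocks of a sequence."""
    blocks = [tuple(seq[i:i + n]) for i in range(len(seq) - n + 1)]
    total = len(blocks)
    counts = Counter(blocks)
    return -sum(c / total * log2(c / total) for c in counts.values())

random.seed(1)
random_seq = [random.randint(0, 1) for _ in range(10000)]  # fully random
periodic_seq = [0, 1] * 5000                               # fully ordered
```

For the random sequence the block entropy grows roughly linearly with block length (about n bits for fair coin flips), while for the periodic one it saturates near 1 bit: entropy measures randomness, and neither extreme is "complex" in the sense the abstract argues for, which is why forecasting difficulty is proposed as a separate measure.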
Article
Full-text available
Quantities are defined operationally which qualify as measures of complexity of patterns arising in physical situations. Their main features, distinguishing them from previously used quantities, are the following: (1) they are measure-theoretic concepts, more closely related to Shannon entropy than to computational complexity; and (2) they are observables related to ensembles of patterns, not to individual patterns. Indeed, they are essentially the Shannon information needed to specify not individual patterns, but either measure-theoretic or algebraic properties of ensembles of patterns arising in a priori translationally invariant situations. Numerical estimates of these complexities are given for several examples of patterns created by maps and by cellular automata.
Article
We construct a simplified model for the chemistry of molecules such as polypeptides or single stranded nucleic acids, whose reactions can be restricted to catalyzed cleavage and condensation. We use this model to study the spontaneous emergence of autocatalytic sets from an initial set of simple building blocks, for example short strands of amino acids or nucleotides. When the initial set exceeds a critical diversity, autocatalytic reactions generate large molecular species in abundance. Our results suggest that the critical diversity is not very large. Autocatalytic sets formed in this way can be regarded as primitive connected metabolisms, in which particular species are selected if their chemical properties are advantageous for the metabolism. Such autocatalytic sets may have played a crucial role in the origin of life, providing a bridge from simple molecular species to complex proteins and nucleic acids. Many of our results are experimentally testable.
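The critical-diversity effect described above can be illustrated with a toy calculation: if every one of n species independently catalyses any given reaction with a small fixed probability p, the fraction of reactions that find at least one catalyst rises sharply with diversity (a hedged sketch of the argument only; the sizes and probability are illustrative assumptions, not the paper's model):

```python
import random

def catalysed_fraction(n_species, n_reactions, p, seed=0):
    """Monte Carlo estimate of the fraction of reactions catalysed by at
    least one species, when each species catalyses any given reaction
    independently with probability p (expected value: 1 - (1-p)**n_species)."""
    rng = random.Random(seed)
    hit = 0
    for _ in range(n_reactions):
        if any(rng.random() < p for _ in range(n_species)):
            hit += 1
    return hit / n_reactions
```

With p = 0.005, ten species leave most reactions uncatalysed, while a thousand species catalyse nearly all of them: past a critical diversity, the reaction network closes on itself and large species can be produced in abundance.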
Article
This article investigates the possibility that the emergence of reflexively autocatalytic sets of peptides and polypeptides may be an essentially inevitable collective property of any sufficiently complex set of polypeptides. The central idea is based on the connectivity properties of random directed graphs. In the set of amino acid monomer and polymer species up to some maximum length, M, the number of possible polypeptides is large, but, for specifiable "legitimate" end condensation, cleavage and transpeptidation exchange reactions, the number of potential reactions by which the possible polypeptides can interconvert is very much larger. A directed graph in which arrows from smaller fragments to larger condensation products depict potential synthesis reactions, while arrows from the larger peptide to the smaller fragments depict the reverse cleavage reactions, comprises the reaction graph for such a system. Polypeptide protoenzymes are able to catalyze such reactions. The distribution of catalytic capacities in peptide space is a fundamental problem in its own right, and in its bearing on the existence of autocatalytic sets of proteins. Using an initial idealized hypothesis that an arbitrary polypeptide has a fixed a priori probability of catalyzing any arbitrary legitimate reaction to assign to each polypeptide those reactions, if any, which it catalyzes, the probability that the set of polypeptides up to length M contains a reflexively autocatalytic subset can be calculated and is a percolation problem on such reaction graphs. Because, as M increases, the ratio of reactions among the possible polypeptides to polypeptides rises rapidly, the existence of such autocatalytic subsets is assured for any fixed probability of catalysis. 
The main conclusions of this analysis appear independent of the idealizations of the initial model, introduce a novel kind of parallel selection for peptides catalyzing connected sequences of reactions, depend upon a new kind of minimal critical complexity whose properties are definable, and suggest that the emergence of self replicating systems may be a self organizing collective property of critically complex protein systems in prebiotic evolution. Similar principles may apply to the emergence of a primitive connected metabolism. Recombinant DNA procedures, cloning random DNA coding sequences into expression vectors, afford a direct avenue to test the distribution of catalytic capacities in peptide space, may provide a new means to select or screen for peptides with useful properties, and may ultimately lead toward the actual construction of autocatalytic peptide sets.
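The combinatorial driver of the percolation argument above is easy to verify numerically: over an alphabet of size a, each length-L polymer has L-1 internal bonds, so the number of cleavage/condensation reactions grows faster than the number of polymers (a minimal sketch of the counting argument; binary alphabet chosen for illustration):

```python
def counts(a, M):
    """Return (number of polymers up to length M over an alphabet of
    size a, number of condensation/cleavage reactions among them).
    Each length-L polymer contributes L-1 reactions, one per bond."""
    polymers = sum(a ** L for L in range(1, M + 1))
    reactions = sum(a ** L * (L - 1) for L in range(2, M + 1))
    return polymers, reactions

# The reactions-to-polymers ratio grows roughly linearly with M,
# which is what guarantees percolation of autocatalytic subsets
# for any fixed probability of catalysis.
ratios = [counts(2, M)[1] / counts(2, M)[0] for M in (4, 8, 12)]
```

For a binary alphabet the ratio climbs from about 2.3 at M = 4 to about 10 at M = 12, so as the maximum length grows, reactions vastly outnumber the molecules that must catalyse them.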