Article

Computer recreations: In the game called Core War hostile programs engage in a battle of bits

Authors:
A. K. Dewdney

... 4. Given a source code implementation of the kernel annotated with pre- and postconditions. 5. Demonstrate that the event code does not violate separation by constructing the following: ...
... game created by D. G. Jones and A. K. Dewdney [5,6,7,8] in which two or more battle programs (called "warriors") compete for control of a virtual computer. These battle programs are written in an abstract assembly language called Redcode and run by a program called MARS (Memory Array Redcode Simulator). ...
Technical Report
Full-text available
The aim of this work is to investigate computer security. We start from the semantic specification of an operating system machine, pMARS, and then give a formal analysis of the GWV separation policy. This work can serve as a basis for investigating the design and implementation of microprocessors and operating systems. pMARS is a very simple machine. Unlike most ordinary microprocessors it has no register set and no interrupt mechanism, and thread context switches happen at the level of individual instructions, so the formal analysis of the security policy for pMARS is comparatively easy. As a next step we will extend our method to a real microprocessor architecture.
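The excerpts above describe the execution model that pMARS formalizes: warriors occupy a circular memory array and a MARS interpreter advances each surviving warrior by one instruction per turn. The sketch below is a deliberately minimal, illustrative interpreter in that spirit, not pMARS itself; the tiny opcode subset, the simplified addressing (every operand is a plain relative offset or immediate), the core size, and the two hard-coded warriors are all assumptions made for brevity:

# Toy illustration of the Core War / MARS execution model described in the
# excerpts above: a circular core, two warriors, one instruction executed per
# warrior per turn. This is NOT pMARS; addressing modes and most Redcode
# opcodes are omitted, and the warriors are simplified for these semantics.
from dataclasses import dataclass
import random

CORE_SIZE = 800  # real tournaments typically use 8000 cells

@dataclass
class Cell:
    op: str = "DAT"   # DAT kills the process that executes it
    a: int = 0        # A-field (here: a relative address or an immediate)
    b: int = 0        # B-field

def load(core, program, base):
    for i, cell in enumerate(program):
        core[(base + i) % CORE_SIZE] = Cell(cell.op, cell.a, cell.b)

def step(core, pc):
    """Execute one instruction; return the next pc, or None if the process dies."""
    cell = core[pc % CORE_SIZE]
    if cell.op == "DAT":
        return None
    if cell.op == "MOV":      # copy the cell at pc+a to pc+b
        src = core[(pc + cell.a) % CORE_SIZE]
        core[(pc + cell.b) % CORE_SIZE] = Cell(src.op, src.a, src.b)
    elif cell.op == "ADD":    # add the immediate a to the B-field of the cell at pc+b
        core[(pc + cell.b) % CORE_SIZE].b += cell.a
    elif cell.op == "JMP":    # jump by the relative offset a
        return (pc + cell.a) % CORE_SIZE
    return (pc + 1) % CORE_SIZE

core = [Cell() for _ in range(CORE_SIZE)]

# "Imp": MOV 0, 1 copies itself one cell ahead and then executes the copy, forever.
imp = [Cell("MOV", 0, 1)]
# Simplified bomber: each loop adds 4 to the MOV's target offset, then drops a
# copy of the DAT cell there, so DAT "bombs" land every fourth cell ahead of it.
bomber = [Cell("ADD", 4, 1), Cell("MOV", 2, 4), Cell("JMP", -2, 0), Cell("DAT", 0, 0)]

start_imp = 0
start_bomber = random.randrange(100, CORE_SIZE - 100)
load(core, imp, start_imp)
load(core, bomber, start_bomber)

pcs = {"imp": start_imp, "bomber": start_bomber}
for turn in range(20000):
    for name in list(pcs):
        nxt = step(core, pcs[name])
        if nxt is None:                      # executed DAT: warrior is eliminated
            print(f"{name} eliminated on turn {turn}")
            del pcs[name]
        else:
            pcs[name] = nxt
    if len(pcs) < 2:
        break
print("survivors:", ", ".join(pcs) if pcs else "none")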
... Work on more naturally evolving computational systems began in 1990, when Steen Rasmussen was inspired by the computer game "Core War" [10]. In this game, programs are written in a simplified assembly language and made to compete in the simulated core memory of a computer. ...
... Now, BX = 1, CX = 2, and AX is undefined. ...
... One of the oldest and most popular venues for the development and research of programs executing in a simulated environment is CoreWar, in which programs (referred to as warriors) attempt to survive in a looping memory array. The system was introduced in 1984 by A. K. Dewdney in an article in the Scientific American [1]. Basically, two programs are placed in the array and executed until one is completely eliminated from the process queue. ...
... In some competitions warriors are allowed to access this memory and change their strategy, if necessary, to ensure better performance in future rounds. CoreWar was introduced by A. K. Dewdney in 1984, in an article published in the Scientific American [1]. Today, CoreWar exists as a programming game with ongoing online competitions on several servers, among which are www.koth.org/ ...
Conference Paper
Full-text available
CoreWar is a computer simulation where two programs written in an assembly language called redcode compete in a virtual memory array. These programs are referred to as warriors. Over more than twenty years of development a number of different battle strategies have emerged, making it possible to identify different warrior types. Systems for automatic warrior creation appeared more recently, evolvers being the dominant kind. This paper describes an attempt to analyze the output of the CCAI evolver, and explores the possibilities for performing automatic categorization by warrior type using representations based on redcode source, as opposed to instruction execution frequency. Analysis was performed using EM clustering, as well as information gain and gain ratio attribute evaluators, and revealed that mainly brute-force types of warriors were being generated. This, along with the observed correlation between clustering and the workings of the evolutionary algorithm, justifies our approach and calls for more extensive experiments based on annotated warrior benchmark collections.
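As a rough illustration of the source-based analysis described in the abstract above, the sketch below clusters a handful of warriors with EM (a Gaussian mixture model) using opcode-count vectors derived from their Redcode source. The toy warriors, the opcode list, the feature choice and the number of clusters are illustrative assumptions, not the paper's actual setup:

# Sketch: cluster warriors by a source-based representation using EM
# (a Gaussian mixture model), loosely following the approach described above.
import numpy as np
from sklearn.mixture import GaussianMixture

OPCODES = ["mov", "add", "sub", "jmp", "jmz", "djn", "spl", "dat", "cmp"]

warriors = {
    "imp":    ["mov 0, 1"],
    "dwarf":  ["add #4, 3", "mov 2, @2", "jmp -2", "dat #0, #4"],
    "bomber": ["add #5, 3", "mov 2, @2", "jmp -2", "dat #0, #0"],
    "paper":  ["spl 1", "spl 1", "mov -1, 0", "mov -2, 1"],
}

def features(source_lines):
    # One simple source-based representation: normalized opcode counts.
    counts = np.zeros(len(OPCODES))
    for line in source_lines:
        op = line.split()[0].lower()
        if op in OPCODES:
            counts[OPCODES.index(op)] += 1
    return counts / max(len(source_lines), 1)

X = np.array([features(src) for src in warriors.values()])
gmm = GaussianMixture(n_components=2, covariance_type="diag", random_state=0).fit(X)
for name, label in zip(warriors, gmm.predict(X)):
    print(name, "-> cluster", label)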
... One of the oldest and most popular venues for the development and research of programs executing in a simulated environment is CoreWar, in which programs (referred to as warriors) attempt to survive in a looping memory array. The system was introduced in 1984 by A. K. Dewdney in an article in the Scientific American [1]. Basically, two programs are placed in the array and executed until one is completely eliminated from the process queue. ...
... CoreWar was introduced by A. K. Dewdney in 1984, in an article published in the Scientific American [1]. Today, CoreWar exists as a programming game with ongoing online competitions on several servers, among which are www.koth.org/ ...
... The first computer worm immediately prompted the corresponding malware detection software [1]. Since then, entire textbooks [2], [3] and conferences [4] have been devoted to the automated detection of unwanted software, to remove it from computer systems or prevent it from arriving at all. ...
Conference Paper
Full-text available
Open-source, community-driven package repositories see thousands of malware packages each year, but do not currently run automated malware detection systems. In this work, we explore the security goals of the repository administrators and the requirements for deploying such malware scanners via a case study of the Python ecosystem and PyPI repository, including interviews with administrators and maintainers. Further, we evaluate existing malware detection techniques for deployment in this setting by creating a benchmark dataset and comparing several existing tools: the malware checks implemented in PyPI, Bandit4Mal, and OSSGadget's OSS Detect Backdoor. We find that repository administrators have exacting requirements for such malware detection tools. Specifically, they consider a false positive rate of even 0.1% to be unacceptably high, given the large number of package releases that might trigger false alerts. Measured tools have false positive rates between 15% and 97%; increasing thresholds for detection rules to reduce this rate renders the true positive rate useless. While automated tools are far from reaching these demands, we find that a socio-technical malware detection system has emerged to meet these needs: external security researchers perform repository malware scans, filter for useful results, and report the results to repository administrators. These parties face different incentives and constraints on their time and tooling. We conclude with recommendations for improving detection capabilities and strengthening the collaboration between security researchers and software repository administrators.
... There are, however, examples of programs in which function emerges from the arrangement of simple elements, for example Karl Sims's creatures [Sims, 1994a and b], animated geometric blocks connected by effectors ("muscles") whose evolution acted on the control structure (a network) coordinating their motor activity, or the Core Wars proposed by D. G. Jones and A. K. Dewdney [Jones, 1984; Dewdney, 1984 & 1985; Rennard, 2002], memory environments in which creatures made of pseudo-assembly-language instructions fight one another and evolve, with Venus by S. Rasmussen [Rasmussen, 1989 & 1990] and Tierra by T. Ray [Ray, 1992 & 1994], followed by Avida [Adami, 2000], being particularly well-studied refinements. These historical models have inspired many others that I will not inventory here. ...
Thesis
I address in this manuscript theoretical, methodological and philosophical considerations based on my current and past, experimental and theoretical research on the organisation of life, its origins and its evolution. Of particular interest is how the theoretical representations of the very nature of living matter and the origins of life have consequences when using modeling to understand and control living matter.
This reflection and research are carried out by my students, collaborators and myself in the context of the emergence and complexification of life. Biological systems from prebiotic ages (e.g. proto-metabolisms, proto-genetics) to the present day (e.g. gene regulation networks, cellular and subcellular organisations, neural networks) have evolved but continue to depend on current systems. The study of structural and dynamical properties (complexity, robustness) but also evolutionary properties of formal systems such as Boolean networks or other artificial life systems, and the search for sets of systems with common structural or dynamical properties, allows us to introduce a holistic vision of biological systems and to think of their evolution and variations in terms of trajectories in a morphogenetic landscape, i.e. a meta-network. An evolutionary or functional path from one system to another is allowed by their structural and dynamical proximities. In our work, using such formal systems as abstractions of biological systems, we study (i) how they evolve while preserving ancestral traits and behaviours, (ii) what determines the evolvability of these networks, their respective robustness to structural changes and their relation to structural and functional complexity, (iii) how trajectories can exist under viability constraints in such morphogenetic landscapes, and (iv) how systems and their associated behaviours can combine during these evolutions. The essay I propose falls within the general framework of theoretical biology and the philosophy of sciences. I try to address questions as varied as scientific methodology, humans in science, the organisation of life, the nature of models, control, purpose in life and many others. While using machine-based formalisms such as Boolean networks, I aim to legitimize the need to move away from mechanistic thinking and the usual conception of the biological world, i.e. Life as a Machine, and the associated theoretical and experimental tools we use to study it. I actually propose to consider an alternate view of the nature and functioning of living matter - from its origins to the present day - based more on what I call the rare, the weak and the amorphous.
... After the creation of the first computer worm, the first malware detection software followed almost immediately [1]. The prevalence of malware has only increased over time [2]. ...
Preprint
Full-text available
Open-source, community-driven package repositories see thousands of malware packages each year, but do not currently run automated malware detection systems. In this work, we explore the security goals of the repository administrators and the requirements for deploying such malware scanners via a case study of the Python ecosystem and PyPI repository, including interviews with administrators and maintainers. Further, we evaluate existing malware detection techniques for deployment in this setting by creating a benchmark dataset and comparing several existing tools: the malware checks implemented in PyPI, Bandit4Mal, and OSSGadget’s OSS Detect Backdoor. We find that repository administrators have exacting requirements for such malware detection tools. Specifically, they consider a false positive rate of even 0.1% to be unacceptably high, given the large number of package releases that might trigger false alerts. Measured tools have false positive rates between 15% and 97%; increasing thresholds for detection rules to reduce this rate renders the true positive rate useless. While automated tools are far from reaching these demands, we find that a socio-technical malware detection system has emerged to meet these needs: external security researchers perform repository malware scans, filter for useful results, and report the results to repository administrators. These parties face different incentives and constraints on their time and tooling. We conclude with recommendations for improving detection capabilities and strengthening the collaboration between security researchers and software repository administrators.
... Inspired by Darwin, Core War places two competing assembly programs in a virtual computer which battle for total control [12]. Both programs are loaded into a random location in memory and take turns executing one instruction at a time. ...
Preprint
Full-text available
In this work, a neural network is trained to replicate the code that trains it using only its own output as input. A paradigm for evolutionary self-replication in neural programs is introduced, where program parameters are mutated, and the ability for the program to more efficiently train itself leads to greater reproductive success. This evolutionary paradigm is demonstrated to produce more efficient learning in organisms from a setting without any explicit guidance, solely based on natural selection favoring organisms with faster reproductive maturity.
... [5] A Computer Recreations feature from 1984 devoted to computer viruses made no mention of Elk Cloner. [6] More than three decades after its release, the Apple ][ retains a devoted following. Thousands of disk images of Apple ][ programs are currently available for download. ...
Preprint
Although self-replicating programs and viruses have existed since the 1960s and 70s, Elk Cloner was the first virus to circulate among personal computers in the wild. Despite its historical significance, it received comparatively little attention when it first appeared in 1982. In this paper, we: present the first detailed examination of the operation and structure of Elk Cloner; discuss the effect of environmental characteristics on its virulence; and provide supporting evidence for several hypotheses about why its release was largely ignored in the early 1980s.
... Many of the important features of Avida derive indirectly from the system called Coreworld (Rasmussen et al., 1990). The Coreworld system itself was inspired by Core War (Dewdney, 1984), which was released as a computer game mainly oriented toward programmers. The basic idea of the game is that programmers write computer programs that would beat others in the Core War battle. ...
Thesis
The seminal architecture of machine self-reproduction originally formulated by John von Neumann underpins the mechanism of self-reproduction equipped with genotype and phenotype. In this thesis, initially, a hand-designed prototype von Neumann style self-reproducer as an ancestor is described within the context of the artificial life system Avida. The behaviour of the prototype self-reproducer is studied in search of evolvable genotype-phenotype mapping that may potentially give rise to evolvable complexity. A finding of immediate degeneration of the prototype into a self-copying mode of reproduction requires further systematic analysis of mutational pathways. Through demarcating a feasible and plausible characterisation and classification of strains, the notion of viability is revisited, which ends up being defined as quantitative potential for exponential population growth. Based on this, a framework of analysis of mutants' evolutionary potential is proposed, and, subsequently, the implementation of an enhanced version of the standard Avida analysis tool for viability analysis as well as the application of it to the prototype self-reproducer strain are demonstrated. Initial results from a one-step single-point-mutation space of the prototype, and further, from a multi-step mutation space, are presented. In the particular case of the analysis of the prototype, the majority of mutants unsurprisingly turn out to be simply infertile, without viability; whereas mutants that prove to be viable are a minority. Nevertheless, by and large, it is pointed out that distinguishing reproduction modes algorithmically is still an open question, much less the finer-grained distinction of von Neumann style self-reproducers. Including this issue, specific limitations of the enhanced analysis are discussed for future investigation in this direction.
... What we have chosen is the Core War machine. Core War is a programming game created by D. G. Jones and A. K. Dewdney [5,6,7,8] in which two or more battle programs (called "warriors") compete for control of a virtual computer. These battle programs are written in an abstract assembly language called Redcode and run by a program called MARS (Memory Array Redcode Simulator). ...
Technical Report
Full-text available
This work is about computer security. We start by working out a formal specification of a computer game called Core War, and then provide an analysis of its properties. The work is carried out entirely in a formal mathematical proof development system called PowerEpsilon. We are not going to investigate computer viruses, since we are not authorized to do research in this area. Instead, our target is to work out protection mechanisms for computer software. This work is the first step of our entire project.
... What we have chosen is the Core War machine. Core War is a programming game created by D. G. Jones and A. K. Dewdney [5,6,7,8] in which two or more battle programs (called "warriors") compete for control of a virtual computer. These battle programs are written in an abstract assembly language called Redcode and run by a program called MARS (Memory Array Redcode Simulator). ...
Technical Report
Full-text available
We are involved in a research project on an embedded real-time operating system that requires a formalization model. What we are going to do is the following: - Design a small and simple abstract operating system machine which is easy to modify and enhance, and which can reflect the features we are going to investigate. - We are not going to give a complete proof of correctness for an existing real-time operating system. What a correct operating system means is still an open issue. Instead, we will concentrate on modeling the mathematical properties we are concerned with, such as separation, schedulability, safety, and liveness.
... Both computer viruses (VI) and biological viruses (VB) only manifest activity on the "host" if it is alive (a cell, etc., in the case of VB) or switched on (a machine, in the case of VI). (1949 and 1955, in: Mur et al., 1990), the Core War programs (Dewdney, 1984), worms, etc. Indeed, computer viruses would share with them the capacity for self-reproduction and binary machine code but, in principle, only that, and it seems somewhat exaggerated to me to derive actual computer viruses from such programs (so little like computer viruses) merely on the basis of one or two emergent characteristics they have in common. INTEL 8088 or 8086 and the DOS operating system, but they cannot infect or run on other DOS machines with compatible microprocessors such as the NEC V20, NEC V30, NEC V40, 80286 (INTEL, AMD, Harris), 80386 (INTEL, AMD, Chips & Technologies), 80486 (INTEL, AMD, Cyrix, UMC). ...
... Furthermore, different software elements often struggle for the same memory resources and compete against each other rather than collaborate. A similar behavior was observed and became famous in Core War (Dewdney, 1984), where hostile programs engage in a battle of memory bits. Thus, the adoption of natural mechanisms comes along with natural forces, which we have to tame. ...
Article
Living systems reached a good balance between competition and co-operation in order to prevail – as individuals or as a group. Artificial systems either work in isolation or are manually designed to cooperate, which is of paramount importance in networking applications. Recently, research has considered bio-inspired approaches to increase the robustness of distributed algorithms. However, when mimicking natural rules such as applying natural selection, the resulting systems often compete rather than cooperate in the struggle for existence. We recently presented an execution model for networking protocols inspired by chemical reactions in which we organized networking software as self-rewriting sets of "molecules". If memory is limited, our protocol software exhibits remarkable robustness to faults and is able to run on unreliable hardware, because healthy software is able to replicate while faulty elements die out. In this report we study the competitive nature of this environment and propose a methodology to design complex self-healing software that is able to cooperate therein. We resort to the study of self-organization in nature and adapt concepts like Eigen's Hypercycle to our software. As an application case, we demonstrate how the competitive and cooperative forces can be exploited for a controlled update of software in a network.
... Instead they focus on algorithm development for traditional processors. Many algorithms or exploration techniques were developed over the past decades, whether inspired by evolution with genetic algorithms and evolutionary strategies, the human immune system with artificial immune systems, flocking and insect swarming with swarm intelligence, the brain with artificial neural networks, competition for survival with Core Wars (Dewdney 1984) and self-replication and speciation with Tierra (Ray 1990) or Avida (Ofria and Wilke 2004). Dedicated languages and frameworks such as Push and PushGP (Spector 2001) were also designed and developed to assist implementation of evolutionary algorithms. ...
Chapter
Full-text available
Natural systems provide unique examples of computation in a form very different from contemporary computer architectures. Biology also demonstrates capabilities such as adaptation, self-repair and self-organisation that are becoming increasingly desirable for our technology. To address these issues a computer model and architecture with natural characteristics is presented. Systemic computation is Turing Complete; it is designed to support biological algorithms such as neural networks, evolutionary algorithms and models of development, and shares the desirable capabilities of biology not found in conventional architectures. In this chapter we describe the first platform implementing such computation, including programming language, compiler and virtual machine. We first demonstrate that systemic computing is crash-proof and can recover from severe damage. We then illustrate various benefits of systemic computing through several implementations of bio-inspired algorithms: a self-adaptive genetic algorithm, a bio-inspired model of artificial neural networks, and finally we create an "artificial organism" - a program with metabolism that eats data, expels waste, clusters cells based on data inputs and emits danger signals for a potential artificial immune system. Research on systemic computation is still ongoing, but the research presented in this chapter shows that computers that process information according to this bio-inspired paradigm have many of the features of natural systems that we desire.
... Work on more naturally evolving computational systems began in 1990, when Steen Rasmussen was inspired by the computer game "Core War" [14]. In this game, programs are written in a simplified assembly language and made to compete in the simulated core memory of a computer. ...
Article
Avida is a software platform for experiments with self-replicating and evolving computer programs. It provides detailed control over experimental settings and protocols, a large array of measurement tools, and sophisticated methods to analyze and post-process experimental data. This chapter explains the general principles on which Avida is built, its main components and their interactions, and gives an overview of some prior research.
... About 20 years later, in 1984, D. G. Jones and A. K. Dewdney wrote the "Core War Guidelines" formalizing the modern Core War. In the same year, Dewdney's column in Scientific American [1] popularized the game, drawing huge interest from both the scientific community and hobbyists. ...
Article
Full-text available
In this paper, Core War, a very peculiar game popular in the mid-1980s, is exploited as a benchmark to improve the µGP, an evolutionary algorithm able to generate Turing-complete, realistic assembly programs. Two techniques were analyzed: co-evolution and a modified island model. Experimental results showed that the former is essential in the beginning of the evolutionary process, but may be deceptive in the end. Differently, the latter enables focusing the search on a specific region of the search space and leads to dramatic improvements. The use of both techniques to help the µGP in its real task (test program generation for microprocessors) is currently being evaluated.
... An example is Core Wars [46], for which evolutionary experiments have been run. However, the most famous evolutionary experiment on machine code organisms is indisputably the Tierra system [146]. ...
... We may derive from this study some insights on the evolvability of artificial evolutionary systems. Fig. 1.9 shows a rough analogy between the development of digital organisms made by computer programs [1,5,7,9] and the development of self-reproducing loops on CA including the evoloop. Behaviors of these artificial systems are classified here into three categories: self-reproductive, competitive, and evolvable; the last category is further divided into two: adaptive to physical environment and adaptive to other individuals. ...
... CoreWar was introduced to the scientific community by A.K. Dewdney in 1984 in an article published in Scientific American [1]. It represented an interesting, but not completely unfamiliar concept. ...
Article
Full-text available
In this paper, an optimizer for programs written in an assembly-like language called Redcode is presented. Relevance of code optimization in evolutionary program creation strategies and code categorization is discussed. CoreWar Optimizer is the first user-friendly optimization tool for CoreWar programs offering various optimization methods and a carefully picked benchmark. The methods at the user's disposal are: random, modified hill climbing algorithm, simulated annealing, predator-prey particle swarm optimization and genetic algorithms. All these methods use a speed-up trick which drives the value optimization across three fitness landscapes instead of just one.
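As a rough sketch of the "value optimization" idea described above (tuning a warrior's numeric constants against a fitness landscape), the following hill climber mutates one constant at a time and keeps improvements. The score() function is only a stand-in; the actual optimizer scores candidates by battling them against a benchmark set in a MARS simulator, and the constants, ranges and iteration count here are arbitrary:

# Sketch of hill climbing over a warrior's numeric constants, in the spirit of
# the CoreWar Optimizer described above. score() is a placeholder fitness; in
# practice it would run the candidate against a benchmark in a MARS simulator.
import random

def score(constants):
    # Placeholder fitness with a peak at (step=2365, offset=7), purely illustrative.
    step, offset = constants
    return -((step - 2365) ** 2 + 10 * (offset - 7) ** 2)

def hill_climb(initial, iterations=2000, core_size=8000):
    best, best_score = list(initial), score(initial)
    for _ in range(iterations):
        cand = list(best)
        i = random.randrange(len(cand))
        cand[i] = (cand[i] + random.randint(-50, 50)) % core_size  # mutate one constant
        s = score(cand)
        if s >= best_score:                    # accept improvements (and ties)
            best, best_score = cand, s
    return best, best_score

best, fitness = hill_climb([4000, 100])
print("best constants:", best, "fitness:", fitness)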
... Whenever the VICI Agent's Surgical repair restores the proper values, the kernel thread writes the improper values back again, negating the repair. Upon witnessing the failure of its Surgical repair, the VICI Agent moves to the Core War repair action inspired by the classic game of Core War [9]. The Core War repair finds the rootkit's text by following the improper pointer from the system call vector, and then "neuters" the rootkit by re-writing its code to jump immediately to the kernel's proper function without performing any of the rootkit's malicious functionality. ...
Conference Paper
When systems are under constant attack, there is no time to restore those infected with malware to health manually--repair of infected systems must be fully automated and must occur within milliseconds. After detecting kernel-modifying rootkit infections using Virtual Machine Introspection, the VICI Agent applies a collection of novel repair techniques to automatically restore infected kernels to a healthy state. The VICI Agent operates without manual intervention and uses a form of automated reasoning borrowed from robotics to choose its best repair technique based on its assessment of the current situation, its memory of past engagements, and the potential cost of each technique. Its repairs have proven effective in tests against a collection of common kernel-modifying rootkit techniques. Virtualized systems monitored by the VICI Agent experience a decrease in application performance of roughly 5%.
... The true origin of corewar dates back to Darwin, a game devised by Vyssotsky, Morris and Ritchie in the early 1960s at Bell Labs. However, the popularization of the game is due to Dewdney's column in Scientific American [3] in 1984. In the same year, Dewdney and Jones rigorously characterized corewar and redcode in a document titled "Corewar Guidelines". ...
Conference Paper
Full-text available
The paper describes the attempt to cultivate programs to climb the nano hill, a contest with exceptionally tight parameters of the game called corewar. An existing tool, called muGP, has been exploited. Two genetic operators were added to tackle the peculiarities of the objective. The generated programs compared favorably with others, either manually written or evolved. muGP autonomously reproduced the same structure as the current champion of the competition, and devised a sharp self-modifying program exploiting a completely new strategy.
... A number of artificial life systems have been developed that evolve machine code representations, most notably Core War [6], Tierra [13], and Avida [2]. Unlike ObjRecombGA, however, they all work with programs encoded in bytecodes designed to facilitate evolutionary search rather than the object files produced by standard compilers. ...
Conference Paper
Full-text available
This paper presents ObjRecombGA, a genetic algorithm framework for recombining related programs at the object file level. A genetic algorithm guides the selection of object files, while a robust link resolver allows working program binaries to be produced from the object files derived from two ancestor programs. Tests on compiled C programs, including a simple web browser and a well-known D video game, show that functional program variants can be created that exhibit key features of both ancestor programs. This work illustrates the feasibility of applying evolutionary techniques directly to commodity applications.
... CoreWar was introduced by A. K. Dewdney in 1984 in an article published in Scientific American [1]. It was based on a game called Darwin developed at Bell Labs in 1960, devised by Victor Vyssotsky, Robert Morris Sr. and Dennis Ritchie. ...
Conference Paper
Full-text available
CoreWar is a computer simulation devised in the 1980s where programs loaded into a virtual memory array compete for control over the virtual machine. These programs are written in a special-purpose assembly language called Redcode and referred to as warriors. A great variety of environments and battle strategies have emerged over the years, leading to the formation of different warrior types. This paper deals with the problem of automatic warrior categorization, presenting results of classification based on several approaches to warrior representation, and offering insight into ambiguities concerning the identification of strategic classes. Over 600 human-coded warriors were annotated, forming a training set for classification. Several major classifiers were used, SVMs proving to be the most reliable, reaching an accuracy of 84%. Classification of an evolved warrior set using the trained classifiers was also conducted. The obtained results proved helpful in outlining the issues with both automatic and manual Redcode program categorization.
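A minimal sketch of the supervised setup the abstract above describes: an SVM assigns warriors to strategy classes from feature vectors. The features (fractions of selected opcodes plus program length), the three class labels and the tiny training set below are fabricated placeholders purely for illustration; the actual study used more than 600 annotated human-coded warriors and richer representations:

# Sketch: SVM classification of warriors by strategy type, loosely following
# the setup above. Features and labels are illustrative placeholders only.
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Hypothetical features: [mov_fraction, spl_fraction, dat_fraction, length]
X_train = np.array([
    [0.9, 0.0, 0.1, 3],    # replicator-like
    [0.8, 0.1, 0.1, 5],    # replicator-like
    [0.3, 0.0, 0.5, 4],    # bomber-like
    [0.2, 0.1, 0.6, 6],    # bomber-like
    [0.4, 0.5, 0.1, 8],    # paper / SPL-heavy
    [0.3, 0.6, 0.1, 9],    # paper / SPL-heavy
])
y_train = ["replicator", "replicator", "bomber", "bomber", "paper", "paper"]

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(X_train, y_train)

unknown = np.array([[0.25, 0.05, 0.55, 5]])
print("predicted type:", clf.predict(unknown)[0])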
... The technical possibility of self-modifying (including self-deleting) code has been sufficiently demonstrated, for example, by the makers and players of the computer game 'CoreWars' [12]; this is a consequence of the non-difference between 'instruction' and 'data' in the already mentioned von-Neumann/Zuse model of computation. ...
Article
Full-text available
The purpose of this article (based on an earlier draft available as technical report: Gruner S, Mobile agent systems and cellular automata. LaBRI Research Reports, 2006) is to make a step towards uniting the paradigms of cellular automata and mobile agents, thus consequentially the fields of artificial life and multi agent systems, which have significant overlap but are still largely perceived as separate fields. In Chalopin et al. (Mobile agent algorithms versus message passing algorithms, pp. 187–201, 2006) the equivalent power of classical distributed algorithms and mobile agent algorithms was demonstrated for asynchronous systems with interleaving semantics under some further constraints and assumptions. Similar results are still being sought about mobile agent systems and distributed systems under other constraints and assumptions in search of a comprehensive general theory of these topics. This article investigates the relationship between mobile agent systems and a generalized form of cellular automata. With a particular notion of local equivalence, a cellular automaton can be translated into a mobile agent system and vice versa. The article shows that if the underlying network graph is finite, then the degree of pseudo-synchrony of the agent system simulating the cellular automaton can be made arbitrarily high, even with an only small number of active agents. As a possible consequence of this theoretical result, the Internet might be used in the future to implement large cellular automata of almost arbitrary topology.
... High latency can be compensated for by slowing down the pace of the game or by gathering the commands of a certain period and issuing them simultaneously, as in SMS television games [4]. Because the computational burden now lies in the server, we can even allow the players to code the operational level logic themselves, as in Core War [5] or AIsHockey [6]. ...
Conference Paper
Full-text available
We introduce three game design concepts which, at the same time, allow interaction among multiple players but do not require networking resources as large as real-time interaction does. These concepts can be used, for example, on mobile platforms where the communication resources are limited.
... The more direct descendant of the Vyssotsky system was the Core War game, developed by Dewdney and others in the early 1980s. This now relied on an interpreter and offered much more varied gameplay opportunities (Dewdney, 1984). Following the establishment of an international tournament (Dewdney, 1987), Core War has had a sustained following, and remains active to this day. ...
Article
I (briefly) review the history of work in Artificial Life on the problem of the open-ended evolutionary growth of complexity in computational worlds. This is then put into the context of evolutionary epistemology and human creativity. @InProceedings{mcmullin:DSP:2009:2200, author = {Barry McMullin}, title = {Artificial Life Meets Computational Creativity?}, booktitle = {Computational Creativity: An Interdisciplinary Approach}, year = {2009}, editor = {Margaret Boden and Mark D'Inverno and Jon McCormack}, number = {09291}, series = {Dagstuhl Seminar Proceedings}, ISSN = {1862-4405}, publisher = {Schloss Dagstuhl - Leibniz-Zentrum fuer Informatik, Germany}, address = {Dagstuhl, Germany}, URL = {http://drops.dagstuhl.de/opus/volltexte/2009/2200}, annote = {Keywords: Artificial life, complexity, computational creativity,} }
... This picture goes back to the game "Core War" by Dewdney [38], in which players use their programs to try, by manipulating the shared memory area (the core), to force opposing programs into executing an invalid instruction. The "Redcode" used for programming is assembly-like and contains ten different instructions, among them instructions for splitting programs and executing them in parallel. ...
Article
Full-text available
Genetic programming (GP) is usually based on the assumption that individuals have an evolved, well-defined structure and that their execution is deterministic. This assumption does not originate from the methodological model, natural evolution, but is a conscious or unconscious inheritance from the environment in which evolution is reproduced - the von Neumann architecture. With the architecture named after him, John von Neumann influenced far more in computer science than the field of computer architectures. His influence on the evolution of algorithms by means of genetic programming is therefore not surprising, even though the von Neumann architecture has little in common with systems evolved in nature. In recent years, a whole series of concepts and theoretical models have emerged that borrow little from von Neumann's computer architecture and whose properties more closely resemble natural systems. The ability of these systems to perform computations arises only from the interaction of their parallel, non-deterministic and decentrally organized components: the capability emerges. Comparatively little is known about the evolution of algorithms for such systems beyond the von Neumann architecture. The present work addresses this question using an algorithmic chemistry, an artificial chemistry that, viewed in simplified terms, results from a modified program-pointer behavior in the von Neumann architecture. Reactions, a variant of simple instructions, are drawn and executed in random order. They interact with one another by using the products of other reactions and making the results of their transformations, stored in so-called molecules, available to other reactions. For the experimental evaluation of this non-deterministic system, sequential parameter optimization is extended by a procedure for distributing an experiment budget. The systematic design of the experiments and their subsequent analysis make it possible to obtain generalized insights into the system behavior beyond specific parameterizations. In the case of genetic programming of an algorithmic chemistry, the insights gained lead to a redesign of the recombination operator modeled on homologous recombination operations and thus to a further improvement of system performance. It is shown that the reaction schemes necessary for goal-directed behavior of an algorithmic chemistry can be learned by means of genetic programming. For common problems from the field of genetic programming, solutions are found whose quality is comparable to that of other GP variants and machine learning methods. The evolved solutions are significantly more compact with respect to data-flow height and the number of required operations than those of the linear GP system used for comparison.
... Such a tournament-style setup can be found, for example, in the Coreworld system (Rasmussen et al. 1990; Rasmussen et al. 1992), which is based on the Core War architecture (Dewdney 1984) ...
Article
Inspired by real chemical processes, artificial chemistries (ACs) follow an intriguing computational paradigm. Abstract molecules (objects) interact in an autonomous, distributed and parallel way. Guided by reaction rules these objects act on one another, thereby proliferating or altering their information content or structure. Modified structure may imply a different function according to the imposed interaction scheme. Consequently, the role of a molecule can change in time, so AC provides a powerful framework to define constructive systems. In this thesis, a resolution-based AC named RESAC is presented, which incorporates the resolution inference rule as the only interaction rule. Being universal and predictable at the same time (thus offering potential and control), this interaction scheme simplifies the AC design and the analysis of the dynamics of the system.
In combination with the first-order predicate logic (FOPL) serving as a general and intuitive molecule structure, a programmable AC is gained. Once converted to FOPL, any problem exactly fits this restricted AC model and thereby benefits from parallel and distributed computation driven by non-deterministic emergent processes. This thesis introduces RESAC and shows its construction and parallelization. Furthermore, the system is analyzed and its application area is under examination. Besides programming ACs, a traditional application is investigated in detail: automated theorem proving. The presented system can easily be transferred to parallel computing platforms like multi-processor systems or distributed internet agents due to the inherent parallelism. A transfer to natural computing systems is mentioned and considered a challenging future task.
... We may derive from this study some insights on the evolvability of artificial evolutionary systems. Figure 14 shows a rough analogy between the development of digital organisms made by computer programs [1,5,10,12] and the development of self-reproducing loops on CA including the evoloop. Behaviors of these artificial systems are classified here into three categories: self-reproductive, competitive, and evolvable; the last category is further divided into two: adaptive to physical environment and adaptive to other individuals. ...
Article
We constructed a simple evolutionary system, "evoloop," on a deterministic nine-state five-neighbor cellular automata (CA) space by improving the structurally dissolvable self-reproducing loop we had previously contrived [14] after Langton's self-reproducing loop [7]. The principal role of this improvement is to enhance the adaptability (a degree of the variety of situations in which structures in the CA space can operate regularly) of the self-reproductive mechanism of loops. The experiment with evoloop met with the intriguing result that, though no mechanism was explicitly provided to promote evolution, the loops varied through direct interaction of their phenotypes, smaller individuals were naturally selected thanks to their quicker self-reproductive ability, and the whole population gradually evolved toward the smallest ones. This result gives a unique example of evolution of self-replicators where genotypical variation is caused by precedent phenotypical variation. Such interrelation of genotype and phenotype would be one of the important factors driving the evolutionary process of primitive life forms that might have actually occurred in ancient times.
Article
This article is an afterword to the book Rise of the Self-Replicators: Early Visions of Machines, AI and Robots That Can Reproduce and Evolve, coauthored by Tim Taylor and Alan Dorin (2020). The book covered the early history of thought about self-reproducing and evolving machines, from initial speculations in the 17th century up to the early 1960s (from which point onward the more recent history is already well covered elsewhere). This article supplements the material discussed in the book by presenting several relevant sources that have come to the author’s attention since the book was published. The most significant additions to the history are from the German-born, 19th-century inventor and utopian John Adolphus Etzler in the 1830s–1840s, the Hungarian author and satirist Frigyes Karinthy in 1916, and the U.S. mathematician and computer scientist Fred Stahl in 1960. ***** For further information and a link to a free author-formatted version of the paper, see https://www.tim-taylor.com/paper-details/taylor2024afterword.html *****
Conference Paper
Full-text available
This paper describes how a user modeling knowledge base for personalized TV servers can be generated starting from an analysis of lifestyles surveys. The aim of this research is the construction of well-designed stereotypes for generating adaptive electronic program guides (EPGs) which filter the information about TV events depending on the user’s interests.
Chapter
Statistical Reasoning; Fuzzy Sets; Fuzzy Constraint Satisfaction Problem; Summary; Exercises
Chapter
Artificial life is the study of life and life-like processes through simulation and synthesis.
Chapter
Full-text available
Artificial life has now become a mature inter-discipline. In this contribution, its roots are traced, its key questions are raised, its main methodological tools are discussed, and finally its applications are reviewed. As part of the growing body of knowledge at the intersection between the life sciences and computing, artificial life will continue to thrive and benefit from further scientific and technical progress on both sides, the biological and the computational. It is expected to take center stage in natural computing.
Chapter
Glossary; Definition of the Subject; Introduction; Basic Building Blocks of an Artificial Chemistry; Structure-to-Function Mapping; Space; Theory; Evolution; Information Processing; Future Directions; Bibliography
Article
Kuru and the transmissible virus dementias are in a group of virus-induced slow infections that we have described as subacute spongiform virus encephalopathies because of the strikingly similar histopathological lesions they induce. Scrapie, mink encephalopathy, the chronic wasting disease with spongiform encephalopathy of captive mule deer and of captive elk, and bovine spongiform encephalopathy all appear, from their histopathology, pathogenesis, and the similarities of their infectious agents, to belong to the same group (Gajdusek and Gibbs, 1975; Gajdusek et al., 1965, 1966; Hope et al., 1988; Masters et al., 1981a,b; Wilesmith et al., 1988; Williams and Young, 1980, 1982; Williams et al., 1982). The basic neurocytological lesions in all these diseases are a progressive vacuolation in the dendritic and axonal processes and cell bodies of neurons and, to a lesser extent, in astrocytes and oligodendrocytes; an extensive astroglial hypertrophy and proliferation; and spongiform change or status spongiosis of gray matter and extensive neuronal loss (Beck et al., 1975, 1982; Klatzo et al., 1959).
Article
Life as we know it is based on cells that use proteins and RNA to carry out metabolism, self-replication, and other essential tasks. The genes that code for these molecules are encoded in DNA, and through the processes of transcription and translation the cell expresses its genes. Some proteins are transcription factors that regulate the transcription rate of genes, so genes interact and form a gene regulatory network. In a random Boolean network the genes are modeled as being either ON or OFF, and the regulatory interactions are drawn from some ensemble that may be based on biological observations. Here, the average behavior of observables of dynamics (e.g., attractor count) and stability (e.g., robustness to perturbations) is studied, both in the original Kauffman model and in models based on data from yeast. Signal transduction, the propagation of information about the external and internal environment of the cell, often affects transcription factors, thereby altering gene expression levels. Signaling pathway profiling is proposed as a way to reduce the complexity of microarray data and find biologically relevant signals. The core regulatory system of embryonic stem cells is a concrete example of a network where attractor basins and stability are important for biological function, and we explore its dynamics in a continuous model. Finally, what effect transcriptional regulation has on fitness is studied in the context of metabolism in a very simple system, and the benefit of regulation is made clear.
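The random Boolean network model mentioned in the abstract above can be sketched in a few lines: N genes are ON or OFF, each gene is regulated by K randomly chosen genes through a random Boolean function, and synchronous updates eventually fall into an attractor. The values N = 8 and K = 2, the random seed, and the attractor search below are illustrative choices, not the paper's actual ensembles:

# Sketch of a Kauffman-style random Boolean network: genes are ON/OFF, each has
# K random regulators and a random update rule; we iterate a random initial
# state until a previously seen state recurs, which marks an attractor cycle.
import random

random.seed(1)
N, K = 8, 2

inputs = [random.sample(range(N), K) for _ in range(N)]                      # regulators of each gene
tables = [[random.randint(0, 1) for _ in range(2 ** K)] for _ in range(N)]  # random Boolean functions

def update(state):
    nxt = []
    for g in range(N):
        idx = sum(state[src] << i for i, src in enumerate(inputs[g]))
        nxt.append(tables[g][idx])
    return tuple(nxt)

state = tuple(random.randint(0, 1) for _ in range(N))
seen = {}
t = 0
while state not in seen:          # iterate until a state repeats
    seen[state] = t
    state = update(state)
    t += 1
print("attractor length:", t - seen[state], "reached after", seen[state], "steps")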
Article
Quantifying evolution and understanding robustness are best done with a system that is both rich enough to frustrate rigging of the answer and simple enough to permit comparison against either existing systems or absolute measures. Such a system is provided by the self-referential model of matrix-genome, replication and translation, based on the concept of operators, which is introduced here. Ideas are also taken from the evolving micro-controller research. This new model replaces micro-controllers by simple matrix operations. These matrices, seen as abstract proteins, work on abstract genomes, peptides or other proteins. Studying the evolutionary properties shows that the protein-only hypothesis (proteins as active elements) exhibits poor evolvability, while the RNA-before-protein hypothesis (genomes controlling) exhibits intricate evolutionary dynamics similar to those in the micro-controller model. A simple possible explanation for this surprising difference in behavior is presented. In addition to existing evolutionary models, dynamical and organizational changes or transitions occurring late in long-term experiments are demonstrated.
Book
Full-text available
This guide provides broad coverage of computational Artificial Life, a field encompassing the theories and discoveries underpinning the invention and study of technology-based living systems. The book focusses specifically on Artificial Life realised in computer software. Topics covered include the pre-history of Artificial Life, artificial chemistry, artificial cells, organism growth, locomotion, group behaviour, evolution and ecosystem simulation.
Article
Kuru and the transmissible virus dementias have been classified in a group of virus-induced slow infections that we have described as subacute spongiform virus encephalopathies because of the strikingly similar histopathological lesions they induce. Scrapie, mink encephalopathy, and the chronic wasting disease with spongiform encephalopathy of captive mule deer and of captive elk all appear, from their histopathology, pathogenesis, and the similarities of their infectious agents, to belong to the same group (Gajdusek and Gibbs, 1975; Gajdusek et al., 1965, 1966; Masters et al., 1981a,b; Williams and Young, 1980, 1982; Williams et al., 1982). The basic neurocytological lesions in all these diseases are a progressive vacuolation in the dendritic and axonal processes and cell bodies of neurons and, to a lesser extent, in astrocytes and oligodendrocytes; an extensive astroglial hypertrophy and proliferation; and spongiform change or status spongiosis of gray matter, and extensive neuronal loss (Beck et al., 1975, 1982; Klatzo et al., 1959).
Article
To understand biological viruses, some notions of the fundamental knowledge of the structure of DNA, the genetic code, the biosynthesis of proteins, the transcription, replication and transfer processes, ... are presented so as to give an idea as to how the genetic information is decrypted by biological mechanisms and, consequently, how viruses work. A computer "virus" can be defined as a piece of code with a self-reproducing mechanism riding on other programs which cannot exist by itself. In contrast, a worm can exist independently. A computer "virus" can be considered as another category of computer user, so the problem of protection against such a "virus" can be reduced to the problem of protection against users. The choice of the term Self-Reproducing Program (SRP) appears to be unambiguous in comparison to the word "virus". After having created the computer in 1948, John von Neumann said in 1949 that it must be possible to imagine logical or mechanical engines that would be able to be self-reproducing. We propose that "good" SRPs should be useful for the automatic maintenance of software, by infection of old versions by the most recent version in the form of such an SRP. Protection is possible by a better understanding of computer systems and their mechanisms of exchange of data and processes. Such a study is presented for DOS, which should be protected by a watchdog system, and suggests the need for a real-time analysis of the most vulnerable points. Security models including cryptography should offer preventive solutions and "vaccines", the treatment of minor troubles ... while prevention requires a better understanding of men and their ambiguities. The idea that there is a need for a better knowledge of SRPs, worms, Trojan horses ... justifies a call for the constitution of a special database concerning them.
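The defining property of the self-reproducing programs (SRPs) discussed above, reproducing one's own text, can be illustrated by a classic two-line Python quine; it is a generic textbook construction with no infection or propagation logic attached, not code taken from the cited article:

# A classic two-line Python quine: the two statements below print an exact
# copy of themselves, which is the bare self-reproduction property of an SRP.
s = 's = %r\nprint(s %% s)'
print(s % s)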
Article
We have developed an artificial chemistry in the computer core, where one is able to evolve assembler-automaton code without any predefined evolutionary path. The core simulator in the present version has one dimension, is updated in parallel, the instructions are only able to communicate locally, and the system is continuously subjected to noise. The system also has a notion of local computational resources. We see different evolutionary paths depending on the specified parameters and the level of complexity, measured as distance from initially randomized core. For several initial conditions the system is able to develop extremely viable cooperative structures (organisms?) which totally dominate the core. We have been able to identify seven successive evolutionary epochs each characterized by different functional properties. Our study demonstrates a successive emergence of complex functional properties in a computational environment.
Article
Kuru and the transmissible virus dementias are in a group of virus-induced slow infections that we have described as sub-acute spongiform virus encephalopathies (SSVEs) because of the strikingly similar histopathological lesions they induce. Scrapie, mink encephalopathy, and the chronic wasting disease with spongiform encephalopathy of captive mule deer and of captive elk all appear, from their histopathology, pathogenesis, and the similarities of their infectious agents, to belong to the same group (Gajdusek and Gibbs 1975; Gajdusek et al. 1965, 1966; Masters et al. 1981a, b; Williams and Young 1980, 1982; Williams et al. 1982). The basic neurocytological lesions in all these diseases are a progressive vacuolation in the dendritic and axonal processes and cell bodies of neurons and, to a lesser extent, in astrocytes and oligodendrocytes; an extensive astroglial hypertrophy and proliferation; and spongiform change or status spongiosis of gray matter and extensive neuronal loss (Beck et al. 1975, 1982; Klatzo et al. 1959).