Published in Advances in Artificial Life, F. Moran, A. Moreno, J. J. Merelo, O.
Chacon, eds., Springer, Berlin, 1995, pp. 23-38.
Artificial Life Needs a Real Epistemology
H. H. Pattee
Systems Science and Industrial Engineering Department
State University of New York at Binghamton
Binghamton, New York 13902-6000
Abstract. Foundational controversies in artificial life and artificial intelligence arise
from lack of decidable criteria for defining the epistemic cuts that separate knowledge of
reality from reality itself, e.g., description from construction, simulation from realization,
mind from brain. Selective evolution began with a description-construction cut, i.e., the
genetically coded synthesis of proteins. The highly evolved cognitive epistemology of
physics requires an epistemic cut between reversible dynamic laws and the irreversible
process of measuring initial conditions. This is also known as the measurement problem.
Good physics can be done without addressing this epistemic problem, but not good
biology and artificial life, because open-ended evolution requires the physical
implementation of genetic descriptions. The course of evolution depends on the speed and
reliability of this implementation, or how efficiently the real or artificial physical
dynamics can be harnessed by non-dynamic genetic symbols.
Life is peculiar, said Jeremy. As compared with what? asked the spider. 
1. What Can Artificial Life Tell Us About Reality?
When a problem persists, unresolved, for centuries in spite of enormous increases in our knowledge, it is a
good bet that the problem entails the nature of knowledge itself. The nature of life is one of these
problems. Life depends on matter, but life is not an inherent property of matter. Life is peculiar,
obviously, because it is so different from nonliving matter. It is different, not so obviously, because it
realizes an intrinsic epistemic cut between the genotype and phenotype. Our knowledge of physics,
chemistry, molecular biology, genetics, development, and evolution is enormous, but the question persists:
Do we really understand how meaning arises from matter? Is it clear why nonliving matter following
inexorable universal laws should acquire symbolic genes that construct, control, and evolve new functions
and meanings without apparent limit? In spite of all this knowledge, most of us still agree with Jeremy.
Where we find disagreement is on the answer to the spider's question. Artificial life must ask this
question: With what do we compare artificial life? The founding characterizations of artificial life comparing "life-as-it-could-be" with "life-as-we-know-it" [29, 30], or "implementation-independent" computer life with space-time-energy-dependent material life, were a creative beginning, but this highly formal view of life was immediately questioned. Such abstract characterizations do not clearly separate science fiction and computer games from physical reality. On the other hand, the idea of life
in a computer does stimulate the philosophical imagination.
An alternative view of artificial life uses computation to control robots in a real physical world.
Although in this approach the more fundamental philosophical issues are not as apparent, it has the
enormous advantage in a practical sense of using the physical world at face value. As Brooks 
understates the point: "It is very hard to simulate the actual dynamics of the real world."
My first answer to the spider's question is that we can only compare life to nonlife, that is, to the
nonliving world from which life arises and evolves. Artificial life must be compared with a real or an
artificial nonliving world. Life in an artificial world requires exploring what we mean by an alternative
physical or mathematical reality. I want to follow Dennett's suggestion that we use artificial life as a "prosthetically controlled thought experiment" that may provide some insights into such foundational questions. Metaphysical questions, like whether reality is material, formal, or mental, are empirically undecidable, but nevertheless, discussion of these concepts is an important part of scientific discovery.
Historically we have seen concepts of reality shifting ground and new horizons discovered, especially with
the advent of quantum theory, computation theory, and cosmology. The question is: Can artificial life add
some new ideas to the problem of knowledge and the epistemic cut or will it only increase the confusion?
2. Life Requires an Epistemic Cut
The first problem for life in a computer is to recognize it. How peculiar does artificial life have to be? That
is, how will we distinguish the living parts of the computation from the nonliving parts? And what are the
parts? I have argued for many years that life is peculiar, fundamentally, because it separates itself from
nonliving matter by incorporating, within itself, autonomous epistemic cuts [32, 33, 34, 38, 39].
Metaphorically, life is matter with meaning. Less metaphorically, organisms are material structures with memory by virtue of which they construct, control, and adapt to their environment. Evolution entails semantic information, and open-ended evolution requires an epistemic cut between the genotype and
phenotype, i.e., between description and construction. The logical necessity of this epistemic cut is the fundamental point of von Neumann's self-replicating automaton. It is this type of logical argument that gives some validity to the concept of formal life, implementation-independent life, or life in a computer.
It is not clear how far such logical arguments can take us. As von Neumann warned, if one studies only
formal life, " . . . one has thrown half the problem out the window, and it may be the more important
half." In spite of all our knowledge of the chemical properties of the components of the genotype and
phenotype, no one knows the answer to von Neumann's "most intriguing, exciting and important question
of why the molecules or aggregates which in nature really occur . . . are the sorts of things they are." In
fact, this question is the best reason I know for studying artificial life where we can invent different "sorts
of things" and see how they behave.
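Von Neumann's logical point - that the genetic description must be used in two distinct ways, interpreted as instructions for construction and copied uninterpreted as heritable data - can be sketched in a few lines. The following toy Python sketch is purely illustrative; the symbols, part names, and function names are invented for this example and are not taken from von Neumann's actual construction:

```python
# Toy sketch of the description-construction cut in self-replication.
# GENOTYPE is a passive, nondynamic description; PART_TABLE is a stand-in
# for an "artificial physics" that maps symbols to constructible parts.
GENOTYPE = "ABBA"
PART_TABLE = {"A": "sensor", "B": "effector"}

def construct(description):
    """Interpret the description: build the phenotype it specifies."""
    return [PART_TABLE[symbol] for symbol in description]

def copy_description(description):
    """Copy the description uninterpreted (no reference to PART_TABLE)."""
    return str(description)

def replicate(description):
    """One reproduction cycle: the same string is used both as
    executed instructions (construction) and as quoted data (heredity)."""
    phenotype = construct(description)          # description as instructions
    inherited = copy_description(description)   # description as data
    return {"phenotype": phenotype, "genotype": inherited}

offspring = replicate(GENOTYPE)
# The offspring carries the same description, so it can replicate in turn;
# a mutation in GENOTYPE changes the phenotype without changing the copier,
# which is what makes open-ended variation heritable.
```

The design point of the sketch is only the separation of roles: `construct` crosses the epistemic cut (symbols acquire physical consequences via the part table), while `copy_description` never looks at what the symbols mean.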
2.1 The Epistemic Cut Requires Implementation
What does implementing a description mean? Descriptions are nondynamic, stored structures that do
nothing until they are interpreted and implemented. In life-as-we-know-it this means translating and
constructing what is described. We know that this is a very complex process in real life involving DNA,
messenger RNA, transfer RNA, coding enzymes, ribosomes, and a metabolism to drive the entire
synthesis process. It is therefore not clear what total implementation-independence or formalization of
artificial life can tell us. It is precisely the effectiveness of implementation of genetic descriptions that
evolution by natural selection is all about. Complete formalization would indeed throw half the problem
out the window, as von Neumann says.
The central problem of artificial life, as theoretical biology, is to separate the essential aspects of this
implementation from the frozen accidents or the incidental chemistry and physics of the natural world
that might have been otherwise. Of course all these levels of detail are useful for the problems they
address, but to answer the question of why these molecules are the "sorts of things" they are requires
abstracting just the right amount.
It is not generally appreciated in artificial life studies why formal self-replication is only half the
problem. All of evolution, emergence, adaptation, and extinction depends on how quickly and efficiently
the variations in the genotype can be implemented in phenotypic functions. How does a symbolic-
sequence space map into a physical-function space? In spite of all the physical and chemical knowledge
we have, it still appears unreasonably fortuitous that only linear sequences of nucleotides are sufficient to
instruct the synthesis of all the structural proteins including their self-folding and self-assembling
properties, and all the coordinated, highly specific and powerful enzymes that control the dynamics of all
forms of life. It is significant that even at the simplest level the implementation entails a computationally
intractable problem - the polymer folding problem.
The advantage of the autonomous robotics approach to artificial life is that it avoids the most
intractable computational problems in the same way that real life does - it harnesses the real physics.
However, robotics does not face the more fundamental construction and self-assembly problems. The
question is: How much can we learn from computational models alone about such efficient implementations of genetic information? Such questions depend largely on our epistemology of
computation, that is, how we think of measurements and symbols constraining a dynamics. The same
problem exists for all physical systems, as I will discuss in Sec. 3. I will survey some current concepts of
computation after outlining what I mean by an epistemology and summarizing the standard epistemic
principles of physical theory.
In traditional philosophy epistemic cuts are viewed as problems only at the cognitive level. They are
called problems of reference or how symbols come to "stand for" or to "be about" material structures and
events [7, 22, 52]. I have always found the complementary problem much more fundamental: How do
material structures ever come to be symbolic? I think if we fully understood how molecules become
messages in cells we would have some understanding of how messages have meaning. That is why the
origin of life problem is important for philosophy.
3. What Is an Epistemology?
An epistemology is a theory or practice that establishes the conditions that make knowledge possible.
There are many epistemologies. Religious mystics, and even some physicists, believe that higher
knowledge is achieved by a state of ineffable oneness with a transcendent reality. Mystics do not make
epistemic cuts. While this may work for the individual, it does not work for populations that require
heritable information or common knowledge that must be communicable. Knowledge is potentially
useful information about something. Information is commonly represented by symbols. Symbols stand for
or are about what is represented. Knowledge may be about what we call reality, or it may be about other
knowledge. It is the implementation of "standing for" and "about" - the process of executing the epistemic
cut - that artificial life needs to explore.
Heritable, communicable, or objective knowledge requires an epistemic cut to distinguish the
knowledge from what the knowledge is about. By useful information or knowledge I mean information in
the evolutionary sense of information for construction and control, measured or selected information, or
information ultimately necessary for survival. This is contrasted with ungrounded, unmeasured,
unselected, hence, purely formal or syntactic information. My usage does not necessarily imply higher-
level cognitive concepts like understanding and explanation, neither does it exclude them. I am not
troubled by the apparent paradox that primitive concepts may be useful without being precisely
understood. I agree with C. F. von Weizsäcker, "Thus we will have to understand that it is the very
nature of basic concepts to be practically useful without, or at least before, being analytically clarified."
3.1 The Epistemology of Physical Theory
The requirement for heritable or objective knowledge is the separation of the subject from the object, the
description from the construction, the knower from the known. Hereditary information originated with life, with the separation of description and construction, and after 3.6 billion years of evolution this separation
has developed into a highly specialized and explicit form at the cognitive level. Von Neumann states
this epistemology of physical theory clearly: " . . . we must always divide the world into two parts, the one
being the observed system, the other the observer. The boundary between the two is arbitrary to a very
large extent . . . but this does not change the fact that the boundary must be put somewhere, if the method
is not to proceed vacuously . . ." In physical theory, the observer is formally related to the observed system
only by the results of measurements of the observables defined by the theory, but the formulation of the
theory, the choice of observables, the construction of measuring devices, and the measurement process
itself cannot be formalized.
No matter where we divide the world into observed and observer, the fundamental condition for
physical laws is that they are invariant to different observers or to the frames of reference or states of
observers. Laws therefore hold everywhere - they are universal and inexorable. In addition to the
invariance or symmetry principles, the laws must be separated from the initial conditions that are
determined only by measurement. The distinction between laws and initial conditions can also be
expressed in terms of information and algorithmic complexity theory. Algorithmic complexity of
information is measured by the shortest program on some Turing-like machine that can compute this
information. Laws then represent information about the world that can be enormously shortened by
algorithmic compression. Initial conditions represent information that cannot be so compressed.
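This distinction can be illustrated with off-the-shelf compression as a crude, computable stand-in for algorithmic complexity (which is uncomputable in general). A minimal Python sketch, assuming only that zlib's compression ratio roughly tracks compressibility:

```python
import random
import zlib

n = 10_000

# "Law": a long sequence generated by a very short rule (highly compressible;
# this particular rule is periodic, so zlib finds the regularity easily).
lawful = bytes((i * i) % 256 for i in range(n))

# "Initial conditions": independent random outcomes, standing in for measured
# data with no generating rule (essentially incompressible).
random.seed(0)
measured = bytes(random.randrange(256) for _ in range(n))

ratio_lawful = len(zlib.compress(lawful)) / n      # far below 1
ratio_measured = len(zlib.compress(measured)) / n  # close to 1
```

The lawful sequence compresses to a small fraction of its length, while the random sequence does not; in this (loose) sense a law is a compression of observations, and initial conditions are the residue that no compression removes.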
Mystical and heritable epistemologies are not necessarily incompatible. They simply refer to different
forms of knowledge. For example, Penrose agrees that this separation of laws is "historically of vital importance" but then expresses more mystically the "very personal view" that "when we come ultimately to comprehend the laws . . . this distinction between laws and boundary conditions will dissolve."
3.2 Incomplete Knowledge - The Necessity of Statistical Laws
The epistemology of physics would be relatively simple if this were all there were to it, but laws and initial
conditions alone are not enough to make a complete physical theory that must include measurement.
Measurement and control require a third category of knowledge called boundary conditions or constraints.
These are initial conditions that can be compressed locally but that are neither invariant nor universal like
laws. When such a constraint is viewed abstractly it is often called a rule; when it is viewed concretely it is
often called a machine or hardware.
Both experience and logic teach us that initial conditions cannot be measured, nor boundary conditions
constructed, with the deterministic precision of the formal dynamical laws. Consequently, this third
category of knowledge requires statistical laws. Statistical laws introduce one of the great unresolved
fundamental problems of epistemology. The dynamical laws of physics are all symmetric in time and
therefore reversible, while statistical laws are irreversible. Formally, these two types of laws are
incompatible. It is even difficult to relate them conceptually. From Bernoulli and Laplace to the present
day this problem persists. As Planck says, "For it is clear to everybody that there must be an unfathomable gulf between a probability, however small, and an absolute impossibility." He adds, "Thus dynamics and statistics cannot be regarded as interrelated." Von Neumann agreed with Planck but cautioned, ". . . the last word about this subject has certainly not been said and it is not going to be said for a long time." Thirty years later, Jaynes says about the interpretation of probability in quantum theory, ". . . we are venturing into a smoky area of science where nobody knows what the real truth is."
What types of boundary conditions or constraints can "self-organize" from deterministic dynamical
laws, and what types can only "emerge" from a statistical bias on a heritable population distribution (i.e.,
natural selection) is a central problem in evolution theory and an active study in artificial life. As
with all such problems, the issue depends on the existence of an epistemic cut.
3.3 Measurement Defines an Epistemic Cut
Like it or not, the epistemic cut in physical theory falls in Planck's "unfathomable gulf" between
dynamical and statistical laws. The possible trajectories of the world are described dynamically by
reversible, noiseless laws, but any explicit knowledge of a trajectory requires observations or
measurements described by irreversible, noisy statistical laws. This is the root of the measurement
problem in physical theory. The problem arises classically, where it is often discussed using thought experiments such as Maxwell's demon, and in quantum theory, where the formal treatment of the measurement process only makes matters worse. Von Neumann described the problem in this
way: An epistemic cut must separate the measuring device from what is measured. Nevertheless, the
constraints of the measuring device are also part of the world. The device must therefore be describable by
universal dynamical laws, but this is possible only at the cost of moving the epistemic cut to exclude the
measurement. We then require a new observer and new measuring devices - a vacuous regress.
When we distinguish the Turing-von Neumann concept of programmable computation from other less
well-defined concepts, we will see in Sec. 5.6 that when described physically a "step" in the computation
must be a measurement. The completion of a measurement is indicated by a record or memory that is no
longer a part of the dynamics except as an incoherent (nonintegrable) constraint.
It is important to understand that invariance and compressibility are not themselves laws, but are
necessary epistemic conditions to establish the heritability, objectivity and utility we require of laws. As P.
Curie pointed out, if the entire world in all its details were really invariant there would be nothing to
observe. No epistemic cut would be possible, and therefore life could not exist, except perhaps in a
mystical sense. It is only because we divide our knowledge into two categories, dynamical laws and initial
conditions, that invariance itself has any meaning [24, 54]. How we choose this cut intellectually is
largely a pragmatic empirical question, although there is also a strong aesthetic component of choice.
The point is that invariance and compressibility are general epistemic requirements for evolution that
preceded physical theory. They are both "about" something else, and therefore they require a cut between
what does not change and what does change, and between the compression and what is compressed. How
life, real or artificial, spontaneously discovers an invariant, compressible, and hence evolvable,
description-construction cut is the origin of life problem. However it happened, it is clear that
compressibility is necessary to define dynamical laws and life. Without compressibility life could not
adapt or evolve, because there is no way to adapt to endlessly random (incompressible) events.
4. Artificial Life Requires an Artificial Physics
How is this physical epistemology relevant for artificial life? The important point is that physical
epistemology is a highly evolved and specialized form of the primitive description-construction process.
The cognitive role of physical epistemology appears to be far removed from the constructive function of
genes, but both define a fundamental epistemic cut. Great discoveries have been made in physics without
understanding the mechanisms that actually implement the epistemic cut, because physics does not need
to study the epistemic cut itself. Measurement can simply be treated as an irreducible primitive activity.
That is why in most sciences the epistemic cut appears sharp - we tend to ignore the details of
constructing the measurement devices and record only the results. The reality is that physical theory
would remain in a primitive state without complex measuring devices, and in fact most of the financial
resources in physics are spent on their construction.
Unlike physical theory, great discoveries in the evolution of natural and artificial life are closely
related to understanding how the description-construction process can be most efficiently implemented.
The course of evolution depends on how rapidly and efficiently an adaptive genotype-phenotype
transformation can be discovered and how reliably it can be executed [11, 12].
Real and artificial life must have arisen and evolved in a nonliving milieu. In real life we call this the
real physical world. If artificial life exists in a computer, the computer milieu must define an artificial
physics. This must be done explicitly or it will occur by default to the program and hardware. What is an
artificial physics or physics-as-it-might-be? Without principled restrictions this question will not inform
philosophy or physics, and will only lead to disputes over nothing more than matters of taste in
computational architectures and science fiction. If an epistemology-as-we-know-it in physics has evolved
from life itself, we must consider this a fundamental restriction. The three essential categories of knowledge we now distinguish - laws, initial conditions, and statistics - need to be represented in computational models of artificial life.
This means that artificial laws must correspond to algorithmically compressible subsets of
computational events, and initial conditions must refer to incompressible events determinable only by
measurements by organisms. In other words, any form of artificial life must be able to detect events and
discover laws of its artificial world. Defining a measurement in a computer is itself a problem, which I discuss in Sec. 5.6. Also, autonomy requires what I call semantic closure. This means the organism's
measurement, memory, and control constraints must be constructed by the genes of the organism from
parts of the artificial physical world. Of course consistency requires that all activities of the organisms
follow the laws they may discover. Whether such organisms are really alive or only simulated is a matter
of definition. A more objective and important question is how open-ended such computational life is. No
consensus can be expected on this question unless there is consensus on what computation means.
5. What Is Computation?
There are two fundamentally different views of computation, the mathematical or formal view and the
physical or hardware view. Barrow sees these views arising from "two great streams of thought" about
physical reality. The traditional view is based on symmetry principles, or invariance with respect to
observers. The currently popular view is based on an abstract concept of computation. Roughly, the
symmetry view is based on the established physical epistemology that I outlined above with statistical
measurement playing an essential role. The computational view emphasizes a dynamical ontology, with
logical consistency and axiomatic laws playing the essential role. The one view sees computation as a
locally programmable, concrete, material process strictly limited by the laws of physics. The other view
sees computation as a universal, abstract dynamics to which even the laws of physics must conform.
5.1 Formal Computation
The ontological view of computation has some roots in the historical ideal of formal symbol manipulation
by axiomatic rules. The meaning of a formal system in logic and mathematics as conceived by Hilbert is
that all the procedures for manipulating symbols to prove theorems and compute functions are axiomatically specified. This means that all the procedures are defined by idealized, unambiguous rules that do not
depend on physical laws, space, time, matter, energy, dissipation, the observer's frame of reference, or the
many possible semantic interpretations of the symbols and rules. The founders of computation theory were
mostly logicians and mathematicians who, with the significant exceptions of Turing and von Neumann,
were not concerned with physical laws. Ironically, formal computational procedures are now called
"effective" or "mechanical" even though they have no epistemic relation to physical laws. These
procedures are justified only by intuitive thought experiments. This weakness is well-known, but is usually
ignored. As Turing noted, "All arguments which can be given [for effective procedures] are bound to
be, fundamentally, appeals to intuition and for this reason rather unsatisfactory mathematically."
This complete conceptual separation of formal symbol manipulation from its physical embodiment is a
characteristic of mathematical operations as we normally do them. All symbol strings are discrete and
finite, as are all rewriting steps. Steps may not be analyzed as analog physical devices. Proofs do not allow
statistical fluctuation and noise. The concepts of set and function imply precise symbol recognition and
complete determinism in rewriting all symbols. Formal computation is, by definition, totally implementation-independent.
This formal view of computation appears to contribute little to understanding the nature of epistemic
cuts because formal systems are self-contained. Symbols and rules have no relation to measurement,
control, and useful information. In fact, purely formal systems must be free of all influence other than
their internal syntax, otherwise they are in error. To have meaning they must be informally interpreted,
measured, grounded, or selected from the outside. "Outside" of course is established only by an epistemic
cut. It is for this reason that formal models can be programmed to simulate everything, except perhaps the
ineffable or mystical, since all the interpretation you need to define what the simulation means can be
freely provided from outside the formal activity of the computer.
5.2 Laplacean Computation
An extension of formal computation is the Laplacean ideal which, as a thought experiment, replaces the
epistemic cut with an in-principle isomorphism between the formal computational states and physical
states. Such thought experiments often lead to apparent paradoxes precisely because an isomorphism is a
formal concept that does not define how to execute the epistemic cut. Maxwell's demon and Schrödinger's
cat are famous examples. The demon forces us to clearly state how measured information is distinguished
from physical entropy, and the cat forces us to decide when a measurement occurs. These distinctions both
require defining epistemic cuts between the knower and the known. It is significant that there is still no
consensus on where the cut should be placed in both cases [31, 50].
5.3 Computation in the Wild
A further elaboration of formal computation has become popular more recently as a kind of backward
Laplacean ideal. That is, the Laplacean isomorphism is interpreted as its converse: computation does not provide a map of the universe - the universe is a map of a computation, or "IT from BIT" as Wheeler states it. This ontological view is what Dietrich calls "the computer in the wild." Historians
might try to blame this view on Pythagoras, but its modern form probably began with the shift in the view of mathematics from a pure logical structure to more of a natural science, after the failure of pure logic to
justify its foundations. The ontological view also arose from the ambiguous relation of information to
entropy in the contexts of cosmology, quantum theory, and algorithmic complexity theory [9, 57]. Toffoli
 describes computation this way: "In a sense, nature has been continually computing the 'next state' of
the universe for billions of years; all we have to do - and actually all we can do - is 'hitch a ride' on this
huge ongoing computation, and try to discover which parts of it happen to go near where we want."
This confounding of formal rules that arise from constraints, and dynamics that describe physical laws,
leads to ambiguous questions like, "Is there a physical phenomenon that computes something
noncomputable? Contrariwise, does Turing's thesis . . . constrain the physical universe we are in?"
(Chaitin). This speculative association of formal theorems with physical laws is sometimes called the strong Church-Turing thesis. It leads to the argument that if there were a natural physical process that could not be Turing-computed, then that process could be used as a new computing element that violates the thesis.
The strong Church-Turing thesis is also used to try to equate formal Turing-equivalence between two symbol systems with fitness equivalence between two physical implementations of the formal systems. The argument is that because there are many Turing-equivalent formalisms, like cellular automata and artificial neural nets, there is no significant difference in the behavior of their different physical
implementations. Of course from the biological perspective this is not the case, because it is precisely the
overall efficiency of the physical implementation that determines survival. The significant processes in
life at all levels, from enzyme catalysis to natural selection, depend on statistical biases on the rates of
change of noisy population distributions, whereas formal equivalence involves no statistical bias, rate dependence, noise, or population distribution.
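The point that formal equivalence says nothing about physical cost can be shown with a deliberately simple sketch: two programs that are formally equivalent (they compute the same function) but differ by orders of magnitude in the number of elementary steps, the kind of rate difference on which selection acts. The example and its step-counting convention are invented for illustration:

```python
def fib_recursive(n, counter):
    """Naive recursive Fibonacci; counts every function call as one step.
    The number of calls grows exponentially with n."""
    counter[0] += 1
    if n < 2:
        return n
    return fib_recursive(n - 1, counter) + fib_recursive(n - 2, counter)

def fib_iterative(n, counter):
    """Iterative Fibonacci; counts every loop iteration as one step.
    The number of iterations grows linearly with n."""
    a, b = 0, 1
    for _ in range(n):
        counter[0] += 1
        a, b = b, a + b
    return a

slow_steps, fast_steps = [0], [0]
# Formally equivalent: both compute the same function of n.
same = fib_recursive(20, slow_steps) == fib_iterative(20, fast_steps)
# Physically inequivalent: slow_steps[0] is in the tens of thousands,
# while fast_steps[0] is exactly 20.
```

If "steps" were real chemical events with real rates and error probabilities, only the second implementation would be likely to survive selection, even though a formalist sees no difference between them.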
The believers in strong artificial intelligence have further popularized the computer metaphor by
defining brains and life as just some kind of computer that we do not yet understand. This view is labeled
computationalism. It replaces the Laplacean isomorphism with an identity. Like Toffoli, Dietrich 
believes that, "every physical system in the universe, from wheeling galaxies to bumping proteins, is a
special purpose computer in the sense that every physical system in the universe is implementing some
computation or other." According to this view, the brain is a computer by definition. It is our job to figure
out what these physical systems are really computing. Thus, according to Churchland and Sejnowski,
". . . there is a lot we do not yet know about computation. Notice in particular that once we understand
more about what sort of computers nervous systems are, and how they do whatever it is they do, we shall
have an enlarged and deeper understanding of what it is to compute and represent." Hopfield extends
this vague, generalized view of computation to evolution: "Much of the history of evolution can be read as
the evolution of systems to make environmental measurements, make predictions, and generate
appropriate actions. This pattern has the essential aspects of a computational system."
This undifferentiated view of the universe, life, and brains as all computation is of no value for
exploring what we mean by the epistemic cut because it simply includes, by definition, and without
distinction, dynamic and statistical laws, description and construction, measurement and control, living
and nonliving, and matter and mind as some unknown kinds of computation, and consequently misses the
foundational issues of what goes on within the epistemic cuts in all these cases. All such arguments that
fail to recognize the necessity of an epistemic cut are inherently mystical or metaphysical and therefore
undecidable by any empirical or objective criteria [36, 37, 43].
5.4 The Programmable Physical Computer
The formal view of computation would be conceivable as long as Turing's condition that every symbol is "immediately recognizable" (i.e., perfectly precise measurement) and Gödel's condition of
perfect mechanism (i.e., perfect determinism) were possible. However, even though we have no way of
knowing if nature is ultimately deterministic or not, we do know that measurement must at some stage be
irreversible, and the results of measurement cannot be used to violate the 2nd law of thermodynamics.
Hence, useful measurements are dissipative and subject to error and violate the assumptions of Laplace,
Turing, and Gödel.
The physical view of computation is little more than the engineering view that recognizes the
hardware constraints as a necessity for implementing any symbol manipulation. This view holds that
statistical physical laws are both the foundation and limitation of computation. Programmable hardware is
inherently slow and noisy. Most of the peculiar design features of the computer are to overcome these
limits. It is a wonder of technology that these limits have been extended so far. Actually, it was Turing
[46] who first justified the use of bits as the highest signal-to-noise symbol vehicle, and of course von
Neumann [48] believed that any rigorous theory of computation must have its roots in thermodynamics.
He did not think of computers as implementation-independent: "An automaton cannot be separated from
the milieu to which it responds. By that I mean that it is meaningless to say that an automaton is good or
bad, fast or slow, reliable or unreliable, without telling in what milieu it operates." The same is true for
natural and artificial life.
5.5 Limits of Physical Computation
The requirement that computation must satisfy physical laws, especially the 2nd law of thermodynamics,
is seldom questioned, but is nevertheless largely ignored by both formalists and computationalists. On the
other hand, hardware designers are acutely aware of the practical physical limits of speed, reliability and
dissipation. Theories of reversible (dissipationless) computation have been proposed over the last few
decades [3, 28], but they are essentially thought experiments with idealized dynamical constraints. No one
knows how to build a useful programmable computer along these lines.
Bennett [4] argues that the source of irreversibility in measurement is erasure rather than the
measurement itself. This interpretation is possible if the measured results remain unused on the physical
side of the epistemic cut. In any case, the basic condition is that our use of measured information cannot
lead to a violation of the second law of thermodynamics. Therefore the addition of any new measured
information that is actually used to decrease the accessible states (i.e., entropy decrease, useful work,
natural selection, control of dynamics, etc.) must be compensated by information loss (i.e., entropy
increase, noise, dissipation, increased accessible states, etc.) in some aspect of the measuring process.
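This compensation can be made quantitative with Landauer's bound, a standard result from the thermodynamics-of-computation literature cited above (not a formula from this paper): erasing recorded information dissipates at least kT ln 2 of heat per bit. A minimal sketch:

```python
import math

K_B = 1.380649e-23   # Boltzmann constant, J/K (exact SI value)

def landauer_bound(bits, temperature_kelvin):
    """Minimum heat (joules) that must be dissipated when `bits` of
    recorded information are erased at the given temperature."""
    return bits * K_B * temperature_kelvin * math.log(2)

# Erasing one bit at room temperature (300 K) costs at least ~2.9e-21 J:
cost = landauer_bound(1, 300.0)
```

The bound is tiny compared with the dissipation of any practical device, which is why the limit matters in principle long before it matters in engineering.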
5.6 Analog Dynamics
The ontological computationalists will argue that normal programmable computation is just our
interpretation of a constrained physical dynamical system. They claim that all dynamical systems,
likewise, can be interpreted as computing because they are implementing some functions [16, 45]. This
may be the case for analog computers where the dynamics maps initial states to final states without
programs, measurements, or intermediate steps, but this is too great an abstraction for describing a
The issue is what we mean by implementation of a function, and how we define a step. If a computer is
to be an implementation of formal logic or mathematics, then it must implement discrete symbols and
perform discrete steps in the rewriting of these symbols to and from memory according to a sequence of
rules or a program. This is what formal logic and mathematics is about. This is what Turing/von
Neumann programmable computers do. It is also the case that any implementation of such symbolic
computational steps must be a law-based system with physical dynamics, so the question is: What
corresponds to a symbol and a step? Physical dynamics does not describe symbols and steps. They are not
in the primary categories of knowledge called laws and initial conditions. A step can only be defined by a
measurement process, and a symbol as a record of a measurement. Therefore, a programmable
computation can be described in physical terms only as a dynamical system that is internally constrained
to regularly perform a sequence of simple measurements that are recorded in memory. The records
subsequently act as further constraints. Since the time of measurement, by definition, has no coherence
with the time of the dynamics, the sequence of computational steps is rate-independent, even though all
physical laws are rate-dependent. As in all arguments about when measurement occurs, this also depends
on where the epistemic cut is placed.
The ontological computer-in-the-wild is a physical system that may be interpreted as a dynamical
analog device that parallels some other process. Such analog computers were common before the
development of the programmed digital computer. They cannot be classified easily because all systems are
indeed potential analogs. Furthermore, what aspects of the system are to be interpreted as computation are
not crisply defined as are symbolic, rule-based, programmed computers. It should be clear that these are
two extremes that only produce confusion by being lumped together. In rate-dependent dynamical
analogs no memory is necessary, and one epistemic cut is made at the end when the final result is
measured. In rate-independent programmed computation each step is a measurement recorded in memory.
There are innumerable possibilities for machines with all degrees of constraints in between these
extremes, but none have general utility.
6. The Epistemology of Organisms
Living systems as-we-know-them use a hybrid of both discrete symbolic and physical dynamic behavior to
implement the genotype-phenotype epistemic cut. There is good reason for this. The source and function
of genetic information in organisms is different from the source and function of information in physics. In
physics new information is obtained only by measurement and, as a pure science, used only passively, to
know that rather than to know how, in Ryle's terms. Measuring devices are designed and constructed
based on theory. In contrast, organisms obtain new genetic information only by natural selection and
make active use of information to know how, that is, to construct and control. Life is constructed, but only
by trial and error, or mutation and selection, not by theory and design. Genetic information is therefore
very expensive in terms of the many deaths and extinctions necessary to find new, more successful descriptions.
This high cost of genetic information suggests an obvious principle that there is no more genetic
information than is necessary for survival. What affects this minimum? The minimum amount of genetic
or selected information will depend largely on how effectively this information can be implemented using
the parts and the dynamics of physical world. For example, some organisms require genetic instruction for
synthesizing amino acids from smaller molecules, but if all amino acids are available as environmental
parts, there is no need for these genes. At the next level, if the information that determines the linear
sequence is sufficient constraint to determine the folding and self-assembly of proteins then no further
folding information is necessary. However, in some cases, when the self-folding is not stable, additional
genes for membrane or scaffolding proteins to further constrain the folding are necessary.
This minimum genetic information principle should not be confused with algorithmic compression of
information. Algorithmic compression is defined only in a formal context on unselected information.
Compressibility across an epistemic cut can only be interpreted informally as the efficiency of implementation
of selected information in a physical milieu. No such minimum information principle can apply to
formal or programmable computation. Formal computation requires, by definition, complete
informational control. That is, no self-folding or any other law-based dynamics can have any effect on
formal symbol manipulation. Any such effect is regarded either as irrelevant or an error.
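The point that algorithmic compression is a purely formal property of symbol strings can be illustrated with an ordinary compressor (zlib here is only a stand-in for an idealized algorithmic-information measure, which is uncomputable):

```python
import random
import zlib

# Two 1000-byte strings: one regular, one pseudo-random.
patterned = b"AB" * 500
rng = random.Random(0)
noisy = bytes(rng.randrange(256) for _ in range(1000))

# The compressor exploits regularity that is a formal property of the
# string itself; it knows nothing of how the string is physically
# implemented or what function it was selected for.
short = len(zlib.compress(patterned))   # a few dozen bytes
long_ = len(zlib.compress(noisy))       # close to, or above, 1000
```

Nothing about the physical milieu or the survival value of a description enters such a measure, which is why compressibility across an epistemic cut can only be read informally as efficiency of implementation.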
The success of evolution depends on how quickly and effectively organisms can adapt to their
environment. This in turn depends on how efficiently the sequence space of genes can transform,
gradually, the control or function space of phenotypes. Efficiency here includes the search problem, i.e.,
how to find good descriptions, and the construction problem, i.e., how to reliably assemble parts
according to the description [11, 13].
As I mentioned in Sec. 2.1, it is important to recognize that these implementation problems, if treated
formally, are combinatorially complex. The search space is enormous and the number of degrees of
freedom of an enzyme is large, so that even though polymer folding is the simplest possible natural
process that implements the genotype-phenotype transformation, a purely computational mapping is
impractical. Even the two-dimensional folding of RNA is NP-complete, and ab initio computation of
detailed protein folding appears out of reach. To make matters worse, folding requires finding only a
stationary state; the fully dynamic problem of enzyme catalysis is harder still, and a quantum dynamical model of it has not even been formulated.
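The scale of this search space can be conveyed by a Levinthal-style estimate (a deliberately naive back-of-the-envelope; the three-states-per-residue figure is an illustrative assumption, not a measured quantity):

```python
def conformations(n_residues, states_per_residue=3):
    """Naive size of a chain's conformation space: a few discrete
    backbone states per residue, combined independently."""
    return states_per_residue ** n_residues

# A modest 100-residue protein:
space = conformations(100)          # 3**100, about 5e47
# Even sampling 10**12 conformations per second, an exhaustive
# search would take about 5e35 seconds -- far longer than the
# age of the universe (about 4e17 seconds).
seconds = space / 10**12
```

Real chains of course fold in milliseconds to seconds, which is precisely the point: the physical dynamics does not search the formal space step by step.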
The only practical computational approach to these combinatorially complex problems is to use
"reverse biological engineering" and simulate the natural dynamics with artificial neural nets [25, 56],
and natural selection in the form of genetic algorithms to evolve the connection weights in the nets [53].
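As a toy illustration of this approach (a minimal sketch, not the cited systems), a genetic algorithm can evolve the weights of a small tanh network toward a target function; all names and parameters below are illustrative choices:

```python
import math
import random

random.seed(1)

def net(weights, x):
    """Tiny one-hidden-layer net: tanh hidden units, linear output.
    weights holds (w, b, v) triples, one per hidden unit."""
    out = 0.0
    for i in range(0, len(weights), 3):
        w, b, v = weights[i], weights[i + 1], weights[i + 2]
        out += v * math.tanh(w * x + b)
    return out

XS = [i / 10.0 for i in range(-20, 21)]       # sample points on [-2, 2]

def fitness(weights):
    """Negative mean squared error against the target sin(x)."""
    err = sum((net(weights, x) - math.sin(x)) ** 2 for x in XS)
    return -err / len(XS)

def evolve(pop_size=60, n_weights=12, generations=200):
    pop = [[random.uniform(-2, 2) for _ in range(n_weights)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)        # truncation selection
        survivors = pop[: pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            cut = random.randrange(n_weights)      # one-point crossover
            child = a[:cut] + b[cut:]
            j = random.randrange(n_weights)        # point mutation
            child[j] += random.gauss(0, 0.3)
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

best = evolve()
```

The "selection" here is entirely formal, of course: fitness is a programmed function, not a consequence of construction in a physical milieu, which is the distinction the surrounding argument turns on.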
There is no doubt that these techniques derived from life-as-we-know-it are of practical engineering value.
However, I would call them virtual dynamical analogs implemented by programmed computers. Adleman
[1] has used real DNA molecules in a "massively parallel" chemical search for a solution of the
Hamiltonian path problem. It is a matter of taste whether this should be called molecular computing or
chemical graph theory.
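The brute-force problem Adleman attacked chemically is easy to state in code; the following naive enumeration (over a small hypothetical graph, not Adleman's actual 7-vertex instance) stands in for the massively parallel generation of candidate paths by DNA strands:

```python
from itertools import permutations

def hamiltonian_path(vertices, edges, start, end):
    """Return a directed path from start to end visiting every vertex
    exactly once, or None. Tries all (n-2)! interior orderings."""
    edge_set = set(edges)
    interior = [v for v in vertices if v not in (start, end)]
    for order in permutations(interior):
        path = [start, *order, end]
        if all((a, b) in edge_set for a, b in zip(path, path[1:])):
            return path
    return None

# A small hypothetical directed graph:
V = [0, 1, 2, 3]
E = [(0, 1), (1, 2), (2, 3), (0, 2), (1, 3)]
hamiltonian_path(V, E, 0, 3)   # → [0, 1, 2, 3]
```

The factorial blowup in the loop is exactly what the chemistry replaces with simultaneous molecular self-assembly, which is why the classification of such a device as computation or chemistry is a matter of taste.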
If artificial life is to inform philosophy, physics, and biology it must address the implementation of
epistemic cuts. Von Neumann recognized the logical necessity of the description-construction cut for
open-ended evolvability, but he also knew that a completely axiomatic, formal, or implementation-
independent model of life is inadequate, because the course of evolution depends on the speed, efficiency,
and reliability of implementing descriptions as constraints in a dynamical milieu.
Many nonlinear dynamical models of populations of interacting units, like cellular automata,
Kauffman-type networks, and games of life have been interpreted as genetic populations with or without a
genotype-phenotype mapping. These populations compete, cooperate, and coevolve in an artificial
environment. However, where there is a genotype-phenotype mapping this is usually a fixed program in
which the evolution of efficiency of implementation does not arise. Of course implementation-independent
self-organization may play essential roles in the origin of life and in limiting the possibilities for natural
selection. The significance of these roles needs to be determined.
Implementation of a description means constructing organisms with the parts and the laws of an
artificial physical world. Some epistemic principles must restrict physics-as-it-could-be if it is to be any
more than computer games. In order to evolve, organisms must discover by selection or measurement
some "compressible" genetic descriptions of this artificial physical world. Compressibility is a formal
concept that is not strictly applicable across epistemic cuts. Compressibility across an epistemic cut simply
corresponds to efficiency of implementation. There is no general way to measure how far we can compress
the genetic information that is necessary to implement a biological function, because this depends on the
physical laws and the quality of function necessary for survival. For the same reason we cannot in general
specify how much information is necessary to construct a measuring device. The amount of information
for construction of the measuring device is incommensurable with the survival value of the information
obtained by the measurement. This is generally the case for all biological structure-function relations.
To evolve, organisms must efficiently implement these descriptions as constructions. A fundamental
limitation for computer life is that evolution can only reflect the complexity of the artificial physical world
in which organisms live. An epistemic cut affords the potential for efficient implementation and open-
ended evolution, but in a simple world, efficient implementations will be limited and life will also remain simple.
Hybrid symbolic-dynamic systems, like life-as-we-know-it and computer-controlled robots, actually
address the problem of efficient implementation of control instructions in the real world, but robots are
still a long way from implementing efficient memory-controlled constructions of real life that self-
assemble at all levels, from polymer folding to multicellular development. Real world dynamics will
always have some advantages for efficient implementations, because there are necessary but gratuitous
inefficiencies of programmed computer simulations that are missing in reality, as well as significant
unpredictable efficiencies of reality missing in the simulations.
References
1. Adleman, L. M., 1994, Molecular computation of solutions to combinatorial problems, Science 266,
2. Barrow, J. D., 1991, Theories of Everything, Oxford University Press, p. 203.
3. Bennett, C. H., 1982, The thermodynamics of computation - a review, Int. J. Theor. Phys. 21, 905-
4. Bennett, C. H., 1987, Demons, engines, and the second law, Sci. Am. 257, 108-116.
5. Born, M., 1964, Symbol and reality, Universitas 7, 337-353. Reprinted in Born, M., Physics in my
Generation, Springer-Verlag, NY, pp. 132-146.
6. Brooks, R., 1992, Artificial life and real robots. In Toward a Practice of Autonomous Systems, F.
Varela and P. Bourgine, eds, MIT Press, Cambridge, MA, pp.3-10.
7. Cassirer, E., 1957, The Philosophy of Symbolic Forms, Vol 3: The Phenomena of Knowledge, Yale
Univ. Press, New Haven, CT.
8. Chaitin, G., 1982, Gödel's theorem and information, Int. J. of Theor. Phys. 21, 941-953.
9. Chaitin, G., 1987, Algorithmic Information Theory, Cambridge University Press.
10. Churchland P. S. and Sejnowski, T. J., 1992, The Computational Brain, MIT Press.
11. Conrad, M., 1983, Adaptability, Plenum, New York.
12. Conrad, M., 1989, The brain-machine disanalogy, BioSystems 22, 197-213.
13. Conrad, M., 1990, The geometry of evolution, BioSystems 24, 61-81.
14. Curie, P., 1908, Oeuvres, Gauthier-Villars, Paris, p. 127.
15. Curtis, C. and Greenslet, F., 1945, The Practical Cogitator, Houghton Mifflin, Boston, p. 277.
16. Dietrich, E., 1994, ed., Thinking Computers and Virtual Persons, Academic Press, NY, p. 13.
17. Dennett, D., 1994, Artificial life as philosophy, Artificial Life 1, 291-292.
18. Eddington, A., 1928, The Nature of the Physical World, Macmillan, New York, p. 260.
19. Eigen, M., 1992, Steps Toward Life, Oxford University Press.
20. Fontana, W., Wagner, G., and Buss, L. W., 1994, Beyond digital naturalism, Artificial Life 1, 211-
21. Gödel, K., 1964, Russell's mathematical logic, and What is Cantor's continuum problem? In P.
Benacerraf and H. Putnam, eds., Philosophy of Mathematics, Prentice-Hall, Englewood Cliffs, NJ,
22. Harnad, S., 1990, The symbol grounding problem, Physica D 42, 335-346.
23. Hopfield, J. J. 1994, Physics, computation, and why biology looks so different, J. Theor. Biol. 171,
24. Houtappel, R. M. F., Van Dam, H., and Wigner, E. P., 1965, The conceptual basis and use of the
geometric invariance principles, Rev. Mod. Physics 37, 595-632.
25. Hunter, L., 1993, ed., Artificial Intelligence and Molecular Biology, AAAI/MIT Press, Menlo Park, CA.
26. Jaynes, E., 1990, Probability in quantum theory. In W. H. Zurek, ed., Complexity, Entropy, and the
Physics of Information, Addison-Wesley, Redwood City, CA, p. 382.
27. Kleene, S. C., 1952, Introduction to Metamathematics, Van Nostrand, Princeton, NJ.
28. Landauer, R., 1986, Computation and physics: Wheeler's meaning circuit, Found. Phys. 16, 551-564.
29. Langton, C., 1989, Artificial life. In C. Langton, ed., Artificial Life, Addison-Wesley, Redwood City,
CA, pp. 1-47.
30. Langton, C., Taylor, C., Farmer, J., and Rasmussen, S., eds., Artificial Life II, Addison-Wesley,
Redwood City, CA.
31. Leff, H. S. and Rex, A. F., eds., 1990, Maxwell's Demon: Entropy, Information, Computing, Princeton
Univ. Press, Princeton, NJ.
32. Pattee, H. H., 1969, How does a molecule become a message? Developmental Biology Supplement 3,
33. Pattee, H. H., 1972, Laws, constraints, symbols, and languages. In C. H. Waddington, ed., Towards a
Theoretical Biology 4, Edinburgh Univ. Press, pp. 248-258.
34. Pattee, H. H., 1982, Cell psychology: An evolutionary approach to the symbol-matter problem,
Cognition and Brain Theory 5(4), 325-341.
35. Pattee, H. H., 1988, Simulations, realizations, and theories of life. In C. Langton, ed., Artificial Life,
Addison-Wesley, Redwood City, CA, pp. 63-77.
36. Pattee, H. H., 1989, The measurement problem in artificial world models, BioSystems 23, 281-290.
37. Pattee, H. H., 1990, Response to E. Dietrich's "Computationalism," Social Epistemology 4(2), 176-
38. Pattee, H. H., 1993, The limitations of formal models of measurement, control, and cognition, Applied
Math. and Computation 56, 111-130.
39. Pattee, H. H., 1995, Evolving self-reference: matter, symbols, and semantic closure, Communication
and Cognition - AI 12(1-2), 9-28.
40. Penrose, R., 1989, The Emperor's New Mind, Oxford University Press, p. 352.
41. Planck, M., 1960, A Survey of Physical Theory, Dover, New York, p. 64.
42. Polanyi, M., 1964, Personal Knowledge, Harper & Row, NY.
43. Rosen, R., 1986, Causal structures in brains and machines, Int. J. General Systems 12, 107-126.
44. Schuster, P., 1994, Extended molecular evolutionary biology: Artificial life bridging the gap between
chemistry and biology, Artificial Life 1, 39-60.
45. Toffoli, T. 1982, Physics and computation, International J. of Theoretical Physics 21, 165-175.
46. Turing, A., 1936, On computable numbers, with an application to the Entscheidungsproblem, Proc.
Lond. Math. Soc. 42, 230-265.
47. von Neumann, J., 1955, Mathematical Foundations of Quantum Mechanics, Princeton Univ. Press,
Princeton, NJ, Chapter VI.
48. von Neumann, J., 1966, The Theory of Self-reproducing Automata, A. Burks, ed., Univ. of Illinois
Press, Urbana, IL.
49. von Weizsäcker, C. F., 1973, Probability and quantum mechanics, Brit. J. Phil. Sci. 24, 321-337.
50. Wheeler, J. and Zurek, W., 1983, Quantum Theory and Measurement, Princeton Univ. Press,
51. Wheeler, J. A., 1990, Information, physics, quantum: The search for links. In W. H. Zurek, ed.,
Complexity, Entropy, and the Physics of Information, Addison-Wesley, Redwood City, CA.
52. Whitehead, A. N., 1927, Symbolism: Its meaning and Effect, Macmillan, NY.
53. Whitley, D. and Hanson, T., 1989, Optimizing neural networks using faster, more accurate genetic
search. In Proceedings of the Third International Conference on GAs, Morgan Kauffman, pp. 391-
54. Wigner, E. P., 1964, Events, laws, and invariance principles, Science 145, 995-999.
55. Wilber, K., 1985, Quantum Questions, Shambala, Boston, MA.
56. Zuker, M., 1989, On finding all suboptimal folding of large RNA sequences using thermodynamics
and auxiliary information, Nucleic Acids Res. 9, 133.
57. Zurek, W. H., ed., 1990, Complexity, Entropy, and the Physics of Information, Addison-Wesley, Redwood City, CA.