Info-computationalism and Morphological Computing of Informational Structure

Gordana Dodig Crnkovic
Mälardalen University, School of Innovation, Design and Engineering, Sweden
gordana.dodig-crnkovic@mdh.se

Abstract. Within the framework of info-computationalism, morphological computation is described as a fundamental principle of all natural computation (information processing).

Author's manuscript. Chapter in the book Integral Biomathics, published by Springer, pp. 97-104. http://link.springer.com/chapter/10.1007/978-3-642-28111-2_10
Foundations of a New Science of Computation
Present computational machinery evolved from mechanical calculators to electronic machines with vacuum tubes and then transistors, and on to integrated circuits and eventually microprocessors. Throughout this remarkable development of hardware towards ever smaller, faster and cheaper devices, the computational principles remained unchanged: an isolated machine calculating a function, executing an algorithm. Such machines were adequately represented by the Turing Machine model. However, computational machinery gradually started to change its character from isolated calculators to networked communicating devices. In the 1970s the first networks were created, with computers linked together via telecommunications. The emergence of networking changed the nature of computers and computing, as operating systems and applications started to access and use each other's resources, exchanging information.
The Turing Machine model is sequential. As long as parallel processing, such as that occurring in networks, is synchronous, it can be sequentialized, and thus the Turing Machine model can be applied. However, for networks with asynchronous processes the Turing Machine is not appropriate. As (Sloman 1996) points out, concurrent and synchronized machines are equivalent to sequential machines, but some concurrent machines are asynchronous. (Dodig Crnkovic 2011)
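The difference can be illustrated with a minimal Python sketch (a toy example of mine, not taken from the cited literature): a two-cell system whose synchronous parallel update can be reproduced exactly by one deterministic sequential pass over a frozen copy of the state, whereas asynchronous updates give results that depend on the interleaving, so no single sequential function of the state captures them without fixing a schedule.

"""Sketch: synchronous parallel updates can be sequentialized,
asynchronous ones cannot (their outcome depends on the schedule)."""
import random

def synchronous_step(state):
    # All cells read the OLD state and then write: one deterministic
    # sequential pass over a frozen snapshot reproduces the parallel step.
    old = tuple(state)
    return [old[0] + old[1], old[0] + old[1]]

def asynchronous_step(state, order):
    # Cells update one at a time, each reading the CURRENT (possibly
    # already updated) state; the result depends on the interleaving.
    s = list(state)
    for i in order:
        s[i] = s[0] + s[1]
    return s

initial = [1, 2]
print("synchronous:", synchronous_step(initial))                 # always [3, 3]
print("async order (0,1):", asynchronous_step(initial, (0, 1)))  # [3, 5]
print("async order (1,0):", asynchronous_step(initial, (1, 0)))  # [4, 3]
# A Turing-machine-style sequential simulation reproduces the synchronous
# case exactly, but in the asynchronous case there is no unique next state
# without fixing the schedule.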
One of the main arguments in favor of universal computing is the often repeated claim in Computer Science (based on the Turing machine model of computation) that computation is invariant with respect to the details of implementation (hardware). Computational complexity classes, themselves based on the Turing model of computation, are supposed to be substrate-independent, general abstractions. However, it turned out that the Turing Machine model depends essentially on the underlying assumptions of classical physics:

The Turing machine is entirely classical, and does not allow for the possibility that paper might have different symbols written on it in different universes, and that those might interfere with one another. (Deutsch 1997)

This fascinating insight into the foundations of computing leads us directly to the nascent field of Natural Computing, which is sometimes called Unconventional Computing or Physical Computing.
Natural Computation
According to the Handbook of Natural Computing (Rozenberg et al. 2011), Natural Computing is "the field of research that investigates both human-designed computing inspired by nature and computing taking place in nature." In particular, the book addresses:

Computational models inspired by natural systems, such as neural computation, evolutionary computation, cellular automata, swarm intelligence, artificial immune systems, artificial life systems, membrane computing and amorphous computing (see the sketch after this list).

Computation performed by natural materials, such as bioware in molecular computing or quantum-mechanical systems in the case of quantum computing.

Study of the computational nature of processes taking place in (living) nature, such as self-assembly, developmental processes, biochemical reactions, brain processes, bionetworks and cellular processes.
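As a concrete illustration of the first group of models, the following toy Python sketch (mine, not from the Handbook) implements an elementary cellular automaton, Rule 110, in which global patterns emerge from a purely local update rule.

"""Sketch: an elementary cellular automaton (Rule 110) as one example of
the nature-inspired computational models listed above."""

RULE = 110  # update rule encoded as 8 bits: neighborhood (l, c, r) -> new cell

def step(cells, rule=RULE):
    n = len(cells)
    new = []
    for i in range(n):
        l, c, r = cells[(i - 1) % n], cells[i], cells[(i + 1) % n]
        idx = (l << 2) | (c << 1) | r      # neighborhood as a 3-bit index
        new.append((rule >> idx) & 1)      # look up the corresponding rule bit
    return new

# Start from a single live cell and print a few generations.
cells = [0] * 31
cells[15] = 1
for _ in range(15):
    print("".join("#" if c else "." for c in cells))
    cells = step(cells)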
Especially important in the context of Natural Computing is that knowledge is generated bi-directionally, through the interaction between computer science and the natural sciences. While the natural sciences are rapidly absorbing ideas, tools and methodologies of information processing, computer science is broadening the notion of computation, recognizing information processing found in nature as (natural) computation. (Rozenberg and Kari 2008) (Stepney et al. 2005) (Stepney et al. 2006)
This new concept of computation allows for nondeterministic complex computational systems with self-* properties. Here self-* stands for self-organization, self-configuration, self-optimization, self-healing, self-protection, self-explanation, and self(context)-awareness. Dodig Crnkovic argues in (Dodig Crnkovic and Müller 2009) that natural computation (understood as processes acting on informational structures) provides the basis, within the info-computational framework, for a unified understanding of the phenomena of embodied cognition, intelligence and knowledge generation.

While computing nature is an old idea, dating back to Zuse and developed by a number of other researchers (Fredkin, Wolfram, Chaitin, Lloyd) who argue that all of the physical world computes, the question may be asked: on what substrate does this computation go on? Within the info-computational framework, the answer is: information. All computational processes in nature take place on informational structures (protoinformation).
Universe as Informational Structure
Von Baeyer (2003) suggests that information is to replace matter/energy as
the primary constitutive principle of the universe. Wolfram supports the
equivalence between the two descriptions:
Matter is merely our way of representing to ourselves things that are in
fact some pattern of information, but we can also say that matter is the
primary thing and that information is our representation of that. (Wolfram
in Zenil 2011, p. 389).
The universe is "nothing but processes in structural patterns all the way down" (Ladyman et al. 2007, p. 228). Understanding patterns as information, one may infer that information is a fundamental ontological category. What we know about the universe is what we get from the sciences, as "special sciences track real patterns" (p. 242). Thus the realism of this approach is based on the claim that "successful scientific practice warrants networks of mappings as identified above between the formal and the material" (p. 121). The ontology is scale-relative, as we generate knowledge through interactions with the world (Dodig Crnkovic 2008) on different levels of abstraction (organization).
Information may be considered the most fundamental physical structure, as in Floridi's Informational Structural Realism (Floridi 2008). It is in permanent flow, in a process of transformation, as observed in physics. We know the world as a result of interaction and exploration:

Structural objects (clusters of data as relational entities) work epistemologically like constraining affordances: they allow or invite certain constructs (they are affordances for the information system that elaborates them) and resist or impede some others (they are constraints for the same system), depending on the interaction with, and the nature of, the information system that processes them. (Floridi 2008)
Info-computational Universe
Info-computationalism (Dodig Crnkovic 2006, 2009) is a unifying approach that brings together the Informationalism (Informational Structural Realism) of Floridi (2008) and the Informational Realism of Sayre (1976) and (Ladyman et al. 2007) – the informational universe – with the Naturalist Computationalism/Pancomputationalism of Zuse, Fredkin, Wolfram, Chaitin and Lloyd – the computing universe. Info-computationalist naturalism understands the dynamical interaction of informational structures as computational processes (Dodig Crnkovic forthcoming 2011). It includes digital and analogue, continuous and discrete, as phenomena existing in the physical world on different levels of organization (Dodig Crnkovic and Müller 2009). Digital computing is a subset of a more general natural computing.
In what follows I will present the idea of morphological computation, which, like much of natural computation, differs from the execution of a procedure given in advance in a deterministic mechanical way. The difference lies in the computational mechanism, which is based on natural physical objects acting as hardware that at the same time acts as software, a program governing the behavior of the computational system. Physical laws govern the processes which cause the dynamical development of a physical system; in other words, computational processes are manifestations of physical laws. The new structure (data structure, informational structure) produced by computational processes is a new program for the next step of the time development. Interestingly, morphological computation is not one of the topics of the Handbook of Natural Computing, even though the fundamental principles of morphological computing underlie all of natural computing.
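The idea that the structure produced at each step is at the same time the program of the next step can be illustrated by a deliberately simple, hypothetical Python sketch, in which each new structure itself determines which transformation produces its successor:

"""Sketch (my illustration): a system in which the structure produced at
each step is the 'program' that governs the next step, so that
hardware/data and software/program coincide."""

def develop(structure, steps=5):
    history = [structure]
    for _ in range(steps):
        # The current structure selects and parameterizes its own update:
        # its first element acts as the "instruction", the rest as "data".
        op, *data = structure
        if op % 2 == 0:
            structure = [op + 1] + [x * 2 for x in data]   # "grow"
        else:
            structure = [op + 1] + data[::-1] + [op]       # "rearrange"
        history.append(structure)
    return history

for s in develop([0, 1, 2, 3]):
    print(s)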
Morphological Computation
Recently, morphological computing has emerged as a new idea in robotics (Pfeifer 2011), (Pfeifer and Iida 2005), (Pfeifer and Gomez 2009), (Paul 2004). It has conceptually very important, generalizable consequences with regard to info-computationalism.

From the beginning, based on the Cartesian tradition, robotics treated the body (machine) and its control separately. However, it successively became evident that embodiment itself is essential for cognition, intelligence and the generation of behavior. In a most profound sense, embodiment is vital because cognition results from the interaction of brain, body, and environment. (Pfeifer 2011)
From an evolutionary perspective it is clear that the environment is the physical source of the biological body, which through morphological computational processes gives rise to morphogenesis (governing the formation of an organism on short time scales) and, on long time scales, to the evolution of species. The nervous system and the brain evolve gradually through the interactions (computational processes) of a living agent with its environment, as a result of information self-structuring (Dodig Crnkovic 2008).
The environment provides a variety of inputs, and at the same time it imposes constraints which limit the space of possibilities, driving the computation along specific trajectories. This relationship is called structural coupling by (Maturana and Varela 1980) and is described by (Quick and Dautenhahn 1999) as "non-destructive perturbations between a system and its environment, each having an effect on the dynamical trajectory of the other, and this in turn effecting the generation of and responses to subsequent perturbations." Clark (1997, p. 163) talks about "the presence of continuous mutually modulatory influences linking brain, body and world."
In morphological computing, the modeling of an agent's behavior (such as locomotion and sensory-motor coordination) proceeds by abstracting the principles of information self-structuring and sensory-motor coordination (Matsushita et al. 2005), (Lungarella et al. 2005), (Lungarella and Sporns 2005), (Pfeifer, Lungarella and Iida 2007). Brain control is decentralized, based on sensory-motor coordination through interaction with the environment. Some examples of the use of morphological computation in robotics (Pfeifer 2011) are: the "Yokoi hand", which can grasp objects of any shape, acting through self-regulation; the "passive dynamic walker", a brainless robot that walks down a slope, for which the dynamics of the interaction with the environment is used for self-stabilization; and "insect walking" with no central control for leg coordination, but global communication through interaction with the environment.
Morphological Computing as Information Self-Structuring
In morphological computation, the generation of sensory stimulation is achieved through interaction with the environment, under the constraints imposed by the morphology and the materials. Through this interaction with the environment, correlations in the sensors (self-structuring of sensory data) are generated by a physical process. The induction of correlations leads to a reduction of complexity. Interaction occurs across multiple time scales between the body and control structure of an agent and its environment. According to (Lungarella et al. 2005), "sensory input and motor activity are continuously and dynamically coupled with the surrounding environment" and "the ability of embodied agents to actively structure their sensory input and to generate statistical regularities represents a major functional rationale for the dynamic coupling between sensory and motor systems. Statistical regularities in the multimodal sensory data relayed to the brain are critical for enabling appropriate developmental processes, perceptual categorization, adaptation, and learning" (emphasis added). (Mirza et al. 2007) present an embodied, grounded, individual sensorimotor interaction history, based on an information-theoretic metric space of sensorimotor experience, dynamically constructed as the robot acts in the environment.
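In the spirit of the information-theoretic measures used by Lungarella and colleagues (the following toy example is my own construction, not their method), such self-structuring can be quantified as the mutual information between motor commands and the sensory readings they give rise to; a coordinated sensorimotor loop produces far more statistical structure than an uncoupled one:

"""Sketch: mutual information between motor commands and sensor readings
for a coordinated agent (sensor reflects the motor action plus noise)
versus an uncoordinated one (sensor unrelated to the motor action)."""
import random
from collections import Counter
from math import log2

def mutual_information(pairs):
    # I(X;Y) = sum p(x,y) log2( p(x,y) / (p(x) p(y)) ), estimated from counts
    n = len(pairs)
    pxy = Counter(pairs)
    px = Counter(x for x, _ in pairs)
    py = Counter(y for _, y in pairs)
    return sum(c / n * log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

random.seed(0)
motors = [random.choice([0, 1]) for _ in range(5000)]

# Coordinated: the sensor reading is the motor command corrupted by 10% noise.
coupled = [(m, m if random.random() > 0.1 else 1 - m) for m in motors]
# Uncoordinated: the sensor reading is independent of the motor command.
uncoupled = [(m, random.choice([0, 1])) for m in motors]

print("MI, coordinated agent:   %.3f bits" % mutual_information(coupled))
print("MI, uncoordinated agent: %.3f bits" % mutual_information(uncoupled))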
(Lungarella and Sporns 2005) give details of the study of the coupling and interplay across multiple time scales between the brain, body, and environment. Their findings are supported by the results of (Der 2011). It is important to notice that structures emerge on all levels of control:

Embodied interactions impose statistical structure not only on "raw pixels" within primary sensory channels, but also (and perhaps more powerfully so) on neural activity patterns far removed from the sensory periphery. We predict that embodied systems operating in a highly coordinated manner generate information and additional statistical regularities at all hierarchical levels of their control architectures, including but not limited to the immediate sensory input. (Lungarella and Sporns 2005)
The above mechanism provides the basis for an evolutionary understanding of embodied cognition and knowledge generation (Dodig Crnkovic 2008). In the process of self-organization of information, the states of distant parts of the system are synchronized by stigmergy – indirect coordination between agents or actions. A trace left in the environment by an action increases the probability of the next action; subsequent actions thus reinforce and build on each other, resulting in coherent behavior.
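A deliberately simple, hypothetical Python sketch of stigmergic coordination: agents choose where to act with probability that grows with the trace already deposited there, each action reinforces its own trace, and traces slowly evaporate; under these assumptions the population settles on a shared site without any direct communication.

"""Sketch (toy model, not from the chapter): stigmergy as indirect
coordination through traces left in the environment."""
import random

random.seed(1)
trace = [1.0, 1.0, 1.0, 1.0]               # trace level at four locations

def choose(trace):
    # An action site is chosen with probability proportional to the squared
    # trace: stronger traces are disproportionately attractive, giving the
    # positive feedback characteristic of stigmergic self-organization.
    weights = [t * t for t in trace]
    r = random.uniform(0, sum(weights))
    acc = 0.0
    for i, w in enumerate(weights):
        acc += w
        if r <= acc:
            return i
    return len(trace) - 1

for _ in range(200):
    site = choose(trace)
    trace[site] += 0.5                      # the action reinforces its own trace
    trace = [0.99 * t for t in trace]       # traces slowly evaporate

print("final trace levels:", [round(t, 2) for t in trace])
# One site typically ends up carrying most of the trace: coherent
# collective behavior emerges from traces in the environment alone.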
The results on self-organization of information and the development of embodied cognition in living organisms have inspired the research program of developmental robotics. Learning is a continuous and incremental process, and development proceeds through morphological change, growth and maturation. Boundary conditions and physical limitations play an important role in the development of an agent, as they reduce the amount of information. Motor learning results in a reduction of the space of possible movements and enables the acquisition of motor skills through exploratory activity in the environment. It has been noticed that the greatest learning occurs in childhood, when the most vigorous growth occurs. (Elman 1993) showed, in training networks to process complex sentences, that neural processing limitations appear advantageous, as they contribute to gradual learning. In a newborn child the initially low-resolution vision successively increases, and coarse control gradually becomes more fine-grained (Pfeifer 2011) as learning proceeds. Only simple organisms are born in their final form, while for complex organisms development seems necessary in order to successively achieve complexity while avoiding chaos.
Info-computational Character of Morphogenetic Computing
Morphological computation makes visible the essential connections between an agent's body, its (nervous) control and its environment. Through embodied interaction with the environment, in particular through sensory-motor coordination, information structure is induced in the sensory data, thus facilitating perception, learning and categorization. The same principles of morphological computing (physical computing) and data self-organization apply to biology and to robotics. It is interesting to note that in 1952 Alan Turing wrote a paper proposing a chemical model as the basis of the development of biological patterns such as the spots and stripes on animal skin (Turing 1952). Turing's account of morphogenesis did not originally claim that the physical system producing the patterns actually performs computation. Nevertheless, from the perspective of info-computationalism we can argue that morphogenesis is a process of morphological computing. The physical process – though not "computational" in the traditional sense – presents natural (unconventional), morphological computation. The essential element in this process is the interplay between the informational structure and the computational process – information self-structuring and information integration, both synchronic and diachronic, going on at different time and space scales.
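To make this concrete, the following sketch simulates a one-dimensional reaction-diffusion system. It uses the Gray-Scott model rather than Turing's original equations, with commonly used demonstration parameter values (an assumption of this illustration, not a claim about Turing's 1952 paper); the point is only that local chemical kinetics plus diffusion suffice to turn a nearly homogeneous state into a spatial pattern.

"""Sketch: a minimal 1D reaction-diffusion system in the spirit of
Turing (1952), implemented here as the Gray-Scott model."""
import numpy as np

N, steps = 200, 10000
Du, Dv, F, k = 0.16, 0.08, 0.060, 0.062   # diffusion rates, feed and kill rates

u = np.ones(N)                 # substrate concentration
v = np.zeros(N)                # activator concentration
u[N//2-5:N//2+5] = 0.50        # a small local perturbation seeds the pattern
v[N//2-5:N//2+5] = 0.25

def laplacian(a):
    # discrete second derivative with periodic boundary conditions
    return np.roll(a, 1) + np.roll(a, -1) - 2 * a

for _ in range(steps):
    uvv = u * v * v
    u += Du * laplacian(u) - uvv + F * (1 - u)
    v += Dv * laplacian(v) + uvv - (F + k) * v

# Print a coarse picture of the final activator field: pattern, not uniformity.
print("".join("#" if x > 0.2 else "." for x in v[::4]))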
Morphology is the central idea in understanding the connection between computation (morphological/morphogenetic) and information. Materials represent morphology at a lower level of organization – the arrangements of molecular and atomic structures, i.e., how protons, neutrons and electrons are arranged at the level below.
Info-computational naturalism describes nature as informational structure – a succession of levels of organization of information. Morphological computing on that informational structure leads to new informational structures via processes of self-organization of information. Evolution itself is a process of morphological computation on a long-term scale. Within the info-computational framework it will be instructive to study the processes of self-organization of information in an agent (as well as in a population of agents) able to restructure itself through interactions with the environment as a result of morphological (morphogenetic) computation.
References

Clark A. (1997) Being There: Putting Brain, Body and World Together Again. Oxford University Press.

Der R. (2011) Self-organization of robot behavior by self-structuring dynamical information. Talk at the University of Zurich. http://ailab.ifi.uzh.ch/brown-bag-lectures/self-organization-of-robot-behavior-by-self-structuring-dynamical-information

Deutsch D. (1997) The Fabric of Reality. Penguin.

Dodig Crnkovic G. (2006) Investigations into Information Semantics and Ethics of Computing, pp. 1-133. Mälardalen University Press, Västerås, Sweden.

Dodig Crnkovic G. (2008) Knowledge Generation as Natural Computation. Journal of Systemics, Cybernetics and Informatics, Vol. 6, No. 2.

Dodig Crnkovic G. (2009) Information and Computation Nets. Investigations into Info-computational World. Information and Computation, pp. 1-96. Vdm Verlag, Saarbrucken.

Dodig Crnkovic G. (2011) Significance of Models of Computation, from Turing Model to Natural Computation. Minds and Machines 21, pp. 301-322, Springer.

Dodig Crnkovic G. (forthcoming) Dynamics of Information as Natural Computation. Information, Selected Papers from FIS 2010 Beijing Conference. http://www.mdpi.com/journal/information/special_issues/selectedpap_beijing

Dodig-Crnkovic G. and Müller V. (2009) A Dialogue Concerning Two World Systems: Info-Computational vs. Mechanistic. In: Information and Computation, World Scientific, Singapore 2011. Preprint available at: http://arxiv.org/abs/0910.5001

Floridi L. (2008) A defense of informational structural realism. Synthese 161:2, pp. 219-253, Springer.

Ladyman J., Ross D., Spurrett D., and Collier J. (2007) Everything Must Go: Metaphysics Naturalized. Clarendon Press, Oxford, pp. 1-368.

Lungarella M. and Sporns O. (2005) Information Self-Structuring: Key Principle for Learning and Development. Proceedings of the 2005 4th IEEE International Conference on Development and Learning, pp. 25-30.

Lungarella M., Pegors T., Bulwinkle D. and Sporns O. (2005) Methods for Quantifying the Informational Structure of Sensory and Motor Data. Neuroinformatics, Vol. 3, pp. 243-262.
Matsushita K., Lungarella M., Paul C., Yokoi H. (2005) Locomoting with Less Computation but More Morphology. Proc. 2005 IEEE International Conference on Robotics and Automation, pp. 2008-2013.

Maturana H. R. and Varela F. J. (1980) Autopoiesis and Cognition: The Realization of the Living. D. Reidel Publishing, Dordrecht, The Netherlands.

Mirza N. A. et al. (2007) Grounded Sensorimotor Interaction Histories in an Information Theoretic Metric Space for Robot Ontogeny. Adaptive Behavior, Vol. 15(2), pp. 167-187.

Paul C. (2004) Morphology and Computation. Proceedings of the International Conference on the Simulation of Adaptive Behaviour, Los Angeles, CA, USA, pp. 33-38.

Pfeifer R. (2011) Tutorial on embodiment. http://www.eucognition.org/index.php?page=tutorial-on-embodiment

Pfeifer R. and Iida F. (2005) Morphological computation: Connecting body, brain and environment. Japanese Scientific Monthly, Vol. 58, No. 2, pp. 48-54.

Pfeifer R. and Gomez G. (2009) Morphological computation – connecting brain, body, and environment. In: Sendhoff B., Körner E., Sporns O., Ritter H., and Doya K. (eds.) Creating Brain-Like Intelligence, Springer.

Pfeifer R., Lungarella M. and Iida F. (2007) Self-organization, embodiment, and biologically inspired robotics. Science 318, pp. 1088-1093.
Rozenberg G., Bäck T., Kok J. N. (eds.) (2011) Handbook of Natural Computing, volume II. Springer. Forthcoming.

Rozenberg G. and Kari L. (2008) The many facets of natural computing. Communications of the ACM, 51, pp. 72-83.

Sayre K. M. (1976) Cybernetics and the Philosophy of Mind. Routledge & Kegan Paul, London.

Stepney S., Braunstein S. L., Clark J. A., Tyrrell A., Adamatzky A., Smith R. E., Addis T., Johnson C., Timmis J., Welch P., Milner R., and Partridge D. (2005) Journeys in non-classical computation I: A grand challenge for computing research. Int. J. Parallel, Emergent and Distributed Systems, 20(1), pp. 5-19.

Stepney S., Braunstein S. L., Clark J. A., Tyrrell A., Adamatzky A., Smith R. E., Addis T., Johnson C., Timmis J., Welch P., Milner R., and Partridge D. (2006) Journeys in Non-Classical Computation II: Initial journeys and waypoints. Int. J. Parallel, Emergent and Distributed Systems, 21(2), pp. 97-125.

Turing A. M. (1952) The Chemical Basis of Morphogenesis. Philosophical Transactions of the Royal Society of London, Series B, Biological Sciences, Vol. 237, No. 641, pp. 37-72.

von Baeyer H. C. (2003) Information: The New Language of Science. Weidenfeld and Nicolson.

Zenil H. (2011) Randomness Through Computation: Some Answers, More Questions. World Scientific, Singapore.