Article

A Computable Universe: Understanding and Exploring Nature as Computation

Editor:
  • Hector Zenil (Oxford Immune Algorithmics)

Abstract

A Computable Universe is a collection of papers discussing computation in nature and the nature of computation, a compilation of the views of the pioneers in the contemporary area of intellectual inquiry focused on computational and informational theories of the world. This volume is the definitive source of informational/computational views of the world, and of cutting-edge models of the universe, both digital and quantum, discussed from a philosophical perspective as well as in the greatest technical detail. The book discusses the foundations of computation in relation to nature. It focuses on two main questions: What is computation? How does nature compute? The contributors are world-renowned experts who have helped shape a cutting-edge computational understanding of the universe. They discuss computation in the world from a variety of perspectives, ranging from foundational concepts to pragmatic models to ontological conceptions and their philosophical implications. The volume provides a state-of-the-art collection of technical papers and non-technical essays representing a field that takes information and computation to be key to understanding and explaining the basic structure underpinning physical reality. It also includes a new edition of Konrad Zuse's "Calculating Space" and a transcription of a panel discussion on the topic, featuring worldwide experts (including a Nobel laureate) in quantum mechanics, physics, cognition, computation and algorithmic complexity.

... In addition, during the last several decades, the concept of computation has impacted the scientific view of nature because computation was proposed as a key to explain nature. In fact, although the idea that our world might be some type of a machine has been in the collective imagination since ancient times [1], one of the most surprising questions that physicists have pondered for the last four decades is whether the universe is a computational system [1,2]. Applying the concept of computation in physics [3][4][5] has given a completely new dimension to the concept [6]. ...
... According to this definition, a computational model has greater computational power than another computational model if the set of functions that it can implement strictly contains the set of functions of the other computational model, and the two models have the same computational power if both sets are equal. It is interesting to note that the Church-Turing thesis can be divided into two claims: (1) a computational power exists that cannot be exceeded under the finitary point of view, and (2) the limit of what is effectively calculable under the finitary point of view is the computational power of a Turing machine, the Church-Turing limit. The Church-Turing thesis has been supported by additional research on other computational models that found the same limit, e.g., the Post canonical system, the semi-Thue system [22], the multitape Turing machine [23], the random access machine [24], or the P-system [25]. ...
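For orientation, the power comparison used in the excerpt above can be stated compactly (the notation here is introduced only for this summary and is not taken from the cited paper). Writing $F(M)$ for the set of functions a computational model $M$ can implement:

    $M_1$ is strictly more powerful than $M_2$  iff  $F(M_2) \subsetneq F(M_1)$;
    $M_1$ and $M_2$ have equal power            iff  $F(M_1) = F(M_2)$.

In this notation, the second claim of the Church-Turing thesis says that $F(M) \subseteq F(\mathrm{TM})$ for every finitary effective model $M$, where $\mathrm{TM}$ denotes the Turing machine model.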
... Deutsch went further than Kreisel because he was not speaking about a feature of the theories but about a feature of nature itself. Thus, Deutsch formulated the following physical principle, Deutsch's principle: "Every finitely realizable physical system can be perfectly simulated by a universal model computing machine operating by finite means" [95] (p. 99). ...
Article
Full-text available
The central goal of this manuscript is to survey the relationships between fundamental physics and computer science. We begin by providing a short historical review of how different concepts of computer science have entered the field of fundamental physics, highlighting the claim that the universe is a computer. Following the review, we explain why computational concepts have been embraced to interpret and describe physical phenomena. We then discuss seven arguments against the claim that the universe is a computational system and show that those arguments are wrong because of a misunderstanding of the extension of the concept of computation. Afterwards, we address a proposal to solve Hempel’s dilemma using the computability theory but conclude that it is incorrect. After that, we discuss the relationship between the proposals that the universe is a computational system and that our minds are a simulation. Analysing these issues leads us to proposing a new physical principle, called the principle of computability, which claims that the universe is a computational system (not restricted to digital computers) and that computational power and the computational complexity hierarchy are two fundamental physical constants. On the basis of this new principle, a scientific paradigm emerges to develop fundamental theories of physics: the computer-theoretic framework (CTF). The CTF brings to light different ideas already implicit in the work of several researchers and provides a new view on the universe based on computer theoretic concepts that expands the current view. We address different issues regarding the development of fundamental theories of physics in the new paradigm. Additionally, we discuss how the CTF brings new perspectives to different issues, such as the unreasonable effectiveness of mathematics and the foundations of cognitive science.
... Some of this work relies on Shannon information theory [12,[19][20][21], and some of it on Fisher information theory [17]. There has also been work on this topic that focuses on the processing of information, i.e., that views the universe through the lens of Turing machine (TM) theory [13,24,26,33,[43][44][45][46][47]. ...
... After these preliminaries I present some of the connections between the theory of IDs and the theory of Turing Machines. In particular I analyze some of the properties of an ID version of universal Turing machines and of an ID version of Kolmogorov complexity [13,24,26,33,[43][44][45][46][47]. I show that the ID versions of those quantities obey many of the familiar results of Turing machine theory (e.g., the invariance theorem of TM theory). ...
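For reference, the invariance theorem alluded to here, in its standard Turing-machine form (notation ours, not the chapter's inference-device version), states that for any two universal Turing machines $U$ and $V$ there is a constant $c_{UV}$, independent of $x$, such that

    $K_U(x) \le K_V(x) + c_{UV}$  for all strings $x$,

so Kolmogorov complexity is well defined up to an additive constant; the excerpt reports that an analogous statement holds for the ID versions of these quantities.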
... Constraints on what can be computed by a physical device can be derived from the laws of physics [25]. There have also been attempts to go the other way, and derive constraints on the laws of physics from computation theory, in particular from algorithmic information theory (AIT) [13,23,[43][44][45][46]. These often implicitly involve uncertainty about the state of the universe. ...
Chapter
Full-text available
There are four types of information an agent can have concerning the state of the universe: information acquired via observation, via control, via prediction, or via retrodiction, i.e., memory. Each of these four types of information appear to rely on a different kind of physical device (e.g., an observation device, a control device, etc.). However it turns out that there is some mathematical structure that is common to those four types of devices. Any device that possesses that structure is known as an “inference device” (ID). Here I review some of the properties of IDs, including their relation with Turing machines, and (more loosely) quantum mechanics. I also review the bounds on the joint information about the physical universe that can be held by any set of IDs. These bounds constrain the possible mathematical structure of any universe that contains agents with information concerning that universe in which they are embedded.
... Some of this work relies on Shannon information theory [12,[19][20][21], and some of it on Fisher information theory [17]. There has also been work on this topic that focuses on the processing of information, i.e., that views the universe through the lens of Turing machine (TM) theory [13,24,26,33,[43][44][45][46][47]. ...
... After these preliminaries I present some of the connections between the theory of IDs and the theory of Turing Machines. In particular I analyze some of the properties of an ID version of universal Turing machines and of an ID version of Kolmogorov complexity [13,24,26,33,[43][44][45][46][47]. I show that the ID versions of those quantities obey many of the familiar results of Turing machine theory (e.g., the invariance theorem of TM theory). ...
... Constraints on what can be computed by a physical device can be derived from the laws of physics [25]. There have also been attempts to go the other way, and derive constraints on the laws of physics from computation theory, in particular from algorithmic information theory (AIT) [13,23,[43][44][45][46]. These often implicitly involve uncertainty about the state of the universe. ...
Article
There are (at least) four ways that an agent can acquire information concerning the state of the universe: via observation, control, prediction, or via retrodiction, i.e., memory. Each of these four ways of acquiring information seems to rely on a different kind of physical device (resp., an observation device, a control device, etc.). However it turns out that certain mathematical structure is common to those four types of device. Any device that possesses a certain subset of that structure is known as an "inference device" (ID). Here I review some of the properties of IDs, including their relation with Turing machines, and (more loosely) quantum mechanics. I also review the bounds of the joint abilities of any set of IDs to know facts about the physical universe that contains them. These bounds constrain the possible properties of any universe that contains agents who can acquire information concerning that universe. I then extend this previous work on IDs, by adding to the definition of IDs some of the other mathematical structure that is common to the four ways of acquiring information about the universe but is not captured in the (minimal) definition of IDs. I discuss these extensions of IDs in the context of epistemic logic (especially possible worlds formalisms like Kripke structures and Aumann structures). In particular, I show that these extensions of IDs are not subject to the problem of logical omniscience that plagues many previously studied forms of epistemic logic.
... " is one that may never have a satisfactory, globally accepted, answer, because new discoveries and maturing understandings always result in novel insights and perspectives (Denning, 2010). For now, we are going to assume these more contemporary views that some form of computation goes on in Nature (Ballard, 1997;Brent & Buck, 2006;de Castro, 2006;Denning, 2007;Cohen, 2009;Schwenk et al., 2009;Crnkovic, 2010Crnkovic, , 2011aGelende, 2011;Mitchel, 2011;Penrose, 2012;Zenil, 2012aZenil, , 2012b), and this will influence in a higher or lower level the discipline that came to be known as Natural Computing. A straightforward, though, important conclusion is that computing is not a human invention; it already exists in the Universe and is responsible for its very origin (Zenil, 2012). ...
... Under the transdisciplinary Natural Computing umbrella, Natural Computing is itself a Natural Science. The knowledge within Computer Science does not directly explain natural phenomena, but gives birth to the first steps towards the understanding of computing in nature (Brent & Bruck, 2006; Lloyd, 2006; de Castro, 2007; Denning, 2007; Gelenbe, 2011; Mitchell, 2011; Deutsch, 2012; Zenil, 2012a, 2012b; Crnkovic, 2010, 2011a, 2011b). ...
... Nobel Laureate and Caltech President David Baltimore commented: "Biology is today an information science" (Denning, 2001). The classic definition of computing is becoming obsolete: computer science is increasing its scope to the study of natural and artificial information processes (Denning, 2007; Zenil, 2012a, 2012b). However, if computing is concerned with information processing, either natural or artificial, it is necessary to first answer the following question: What is information? ...
Chapter
Full-text available
An important premise of Natural Computing is that some form of computation goes on in Nature, and that computing capability has to be understood, modeled, abstracted, and used for different objectives and in different contexts. Therefore, it is necessary to propose a new language capable of describing and allowing the comprehension of natural systems as a union of computing phenomena, bringing an information processing perspective to Nature. To develop this new language and convert Natural Computing into a new science it is imperative to overcome three specific Grand Challenges in Natural Computing Research: Transforming Natural Computing into a Transdisciplinary Discipline, Unveiling and Harnessing Information Processing in Natural Systems, Engineering Natural Computing Systems.
... The first one with a focus on processes is the idea of the computing universe (naturalist computationalism/pancomputationalism) in which a cognizing agent sees the dynamics of physical states in nature as information processing (natural computation) [57][58][59][60][61][62]. ...
... It "computes" or spontaneously unfolds its physical systems, which for an observer are informational structures. As Chaitin says, the universe is computing its next state by simply executing its own physics over its existing states, [61,62]. In Hewitt's model of computation, you do not need external input for computation to execute in a system. ...
Article
Full-text available
Three special issues of Entropy journal have been dedicated to the topics of “Information-Processing and Embodied, Embedded, Enactive Cognition”. They addressed morphological computing, cognitive agency, and the evolution of cognition. The contributions show the diversity of views present in the research community on the topic of computation and its relation to cognition. This paper is an attempt to elucidate current debates on computation that are central to cognitive science. It is written in the form of a dialog between two authors representing two opposed positions regarding the issue of what computation is and could be, and how it can be related to cognition. Given the different backgrounds of the two researchers, which span physics, philosophy of computing and information, cognitive science, and philosophy, we found the discussions in the form of Socratic dialogue appropriate for this multidisciplinary/cross-disciplinary conceptual analysis. We proceed as follows. First, the proponent (GDC) introduces the info-computational framework as a naturalistic model of embodied, embedded, and enacted cognition. Next, objections are raised by the critic (MM) from the point of view of the new mechanistic approach to explanation. Subsequently, the proponent and the critic provide their replies. The conclusion is that there is a fundamental role for computation, understood as information processing, in the understanding of embodied cognition.
... In his more recent work from 2012, the foreword to A Computable Universe: Understanding Computation & Exploring Nature As Computation [19], which is Penrose's latest text on computationalism, he explicitly acknowledges having changed his position on the question of computationalism (the belief that the mind can be modeled computationally) multiple times. In the foreword, Penrose discusses different versions of computationalism and various possible interpretations. ...
... • In theoretical physics, [pan-]computationalism loosely refers to a variety of modeling approaches where formal concepts from symbolic computing or the Shannon concept of information are invoked to describe and explain the physical world (introduction: Zenil (2013); examples: von Weizsäcker (1985); Lloyd (2013); Wolfram (2020)). A notably popular special format is the cellular automaton model, which captures self-organized pattern formation in physical substrates (Zuse, 1982; Wolfram, 2002; Fredkin, 2013). ...
Preprint
Full-text available
Approaching limitations of digital computing technologies have spurred research in neuromorphic and other unconventional approaches to computing. Here we argue that if we want to systematically engineer computing systems that are based on unconventional physical effects, we need guidance from a formal theory that is different from the symbolic-algorithmic theory of today's computer science textbooks. We propose a general strategy for developing such a theory, and within that general view, a specific approach that we call "fluent computing". In contrast to Turing, who modeled computing processes from a top-down perspective as symbolic reasoning, we adopt the scientific paradigm of physics and model physical computing systems bottom-up by formalizing what can ultimately be measured in any physical substrate. This leads to an understanding of computing as the structuring of processes, while classical models of computing systems describe the processing of structures.
... The mathematics challenge: Mathematical results show that human thinking cannot be computational in the standard sense, as shown by Cooper [6-9,54,76,112,113]. In addition, a move from inductive and deductive logic to the perspective of active inference brings forward abductive reasoning [114], which means making the best guess about the external states of affairs [93,115]. ...
Article
Full-text available
Cognition, historically considered a uniquely human capacity, has recently been found to be an ability of all living organisms, from single cells up. This study approaches cognition from an info-computational stance, in which structures in nature are seen as information, and processes (information dynamics) are seen as computation, from the perspective of a cognizing agent. Cognition is understood as a network of concurrent morphological/morphogenetic computations unfolding as a result of self-assembly, self-organization, and autopoiesis of physical, chemical, and biological agents. The present-day human-centric view of cognition still prevailing in major encyclopedias has a variety of open problems. This article considers recent research about morphological computation, morphogenesis, agency, basal cognition, extended evolutionary synthesis, the free energy principle, cognition as Bayesian learning, active inference, and related topics, offering new theoretical and practical perspectives on problems inherent to the old computationalist cognitive models, which were based on abstract symbol processing and were unaware of actual physical constraints and affordances of the embodiment of cognizing agents. A better understanding of cognition is centrally important for future artificial intelligence, robotics, medicine, and related fields.
... Moreover, Wolfram (Chapter 12 of [34]) has proposed the Principle of Computational Equivalence, which states that systems in nature that are not obviously simple, e.g., the weather, have the maximum possible computational power, implying that many or even most natural systems operate effectively equivalently to a UTM. Other work on undecidability in physics [35][36][37][38] and on the computational capacity of the physical world [39][40][41] may tend to support the possibility of high-level computation in the natural world. Despite these points, the Principle has not been proven to hold, and it is not clear that it does actually hold very commonly in nature. ...
Preprint
Full-text available
Developing new ways to estimate probabilities can be valuable for science, statistics, and engineering. By considering the information content of different output patterns, recent work invoking algorithmic information theory has shown that a priori probability predictions based on pattern complexities can be made in a broad class of input-output maps. These algorithmic probability predictions do not depend on a detailed knowledge of how output patterns were produced, or historical statistical data. Although quantitatively fairly accurate, a main weakness of these predictions is that they are given as an upper bound on the probability of a pattern, but many low complexity, low probability patterns occur, for which the upper bound has little predictive value. Here we study this low complexity, low probability phenomenon by looking at example maps, namely a finite state transducer, natural time series data, RNA molecule structures, and polynomial curves. Some mechanisms causing low complexity, low probability behaviour are identified, and we argue this behaviour should be assumed as a default in the real world algorithmic probability studies. Additionally, we examine some applications of algorithmic probability and discuss some implications of low complexity, low probability patterns for several research areas including simplicity in physics and biology, a priori probability predictions, Solomonoff induction and Occam's razor, machine learning, and password guessing.
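For orientation, the upper bound referred to in this abstract typically takes a coding-theorem-like form (the notation below is assumed for illustration and is not quoted from the preprint):

    $P(x) \le 2^{-a\tilde{K}(x) - b}$,

where $P(x)$ is the probability that the map produces output pattern $x$, $\tilde{K}(x)$ is a computable approximation to the Kolmogorov complexity of $x$, and $a$, $b$ are constants that depend on the map but not on $x$. The low-complexity, low-probability phenomenon studied here concerns outputs whose actual probability falls far below this bound.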
... Within the info-computational framework, or computing nature [18], computation at a given level of organization of information is the realization/actualization of the laws governing the interactions between its constituent parts. At the basic level, computation is the manifestation of causality in a physical substrate [83]. At each subsequent level of organization, the set of rules governing the system passes into a new emergent regime. ...
Article
Full-text available
The emerging contemporary natural philosophy provides a common ground for the integrative view of the natural, the artificial, and the human-social knowledge and practices. Learning process is central for acquiring, maintaining, and managing knowledge, both theoretical and practical. This paper explores the relationships between the present advances in understanding of learning in the sciences of the artificial (deep learning, robotics), natural sciences (neuroscience, cognitive science, biology), and philosophy (philosophy of computing, philosophy of mind, natural philosophy). The question is, what at this stage of the development the inspiration from nature, specifically its computational models such as info-computation through morphological computing, can contribute to machine learning and artificial intelligence, and how much on the other hand models and experiments in machine learning and robotics can motivate, justify, and inform research in computational cognitive science, neurosciences, and computing nature. We propose that one contribution can be understanding of the mechanisms of ‘learning to learn’, as a step towards deep learning with symbolic layer of computation/information processing in a framework linking connectionism with symbolism. As all natural systems possessing intelligence are cognitive systems, we describe the evolutionary arguments for the necessity of learning to learn for a system to reach human-level intelligence through evolution and development. The paper thus presents a contribution to the epistemology of the contemporary philosophy of nature.
... Furthermore, the idea is that natural phenomena themselves are regarded as information processing, i.e., that they are occurring as a result of some type of computation [14]. Based on the perspective that real-world phenomena perform some computation, we use those natural phenomena for a desired computation. ...
Preprint
Full-text available
Owing to recent advances in artificial intelligence and internet of things (IoT) technologies, collected big data facilitates high computational performance, while its computational resources and energy cost are large. Moreover, data are often collected but not used. To solve these problems, we propose a framework for a computational model that follows a natural computational system, such as the human brain, and does not rely heavily on electronic computers. In particular, we propose a methodology based on the concept of `computation harvesting', which uses IoT data collected from rich sensors and leaves most of the computational processes to real-world phenomena as collected data. This aspect assumes that large-scale computations can be fast and resilient. Herein, we perform prediction tasks using real-world road traffic data to show the feasibility of computation harvesting. First, we show that the substantial computation in traffic flow is resilient against sensor failure and real-time traffic changes due to several combinations of harvesting from spatiotemporal dynamics to synthesize specific patterns. Next, we show the practicality of this method as a real-time prediction because of its low computational cost. Finally, we show that, compared to conventional methods, our method requires lower resources while providing a comparable performance.
... In every next layer of organization, the set of rules governing the system switches to a new emergent regime. It remains to be established how exactly this process goes on in nature, and how emergent properties occur [84]. Research on natural computing is expected to uncover those mechanisms. ...
Article
Full-text available
The emerging contemporary natural philosophy provides a common ground for the integrative view of the natural, the artificial, and the human-social knowledge and practices. Learning process is central for acquiring, maintaining, and managing knowledge, both theoretical and practical. This paper explores the relationships between the present advances in understanding of learning in the sciences of the artificial (deep learning, robotics), natural sciences (neuroscience, cognitive science, biology), and philosophy (philosophy of computing, philosophy of mind, natural philosophy). The question is, what at this stage of the development the inspiration from nature, specifically its computational models such as info-computation through morphological computing, can contribute to machine learning and artificial intelligence, and how much on the other hand models and experiments in machine learning and robotics can motivate, justify, and inform research in computational cognitive science, neurosciences, and computing nature. We propose that one contribution can be understanding of the mechanisms of ‘learning to learn’, as a step towards deep learning with symbolic layer of computation/information processing in a framework linking connectionism with symbolism. As all natural systems possessing intelligence are cognitive systems, we describe the evolutionary arguments for the necessity of learning to learn for a system to reach human-level intelligence through evolution and development. The paper thus presents a contribution to the epistemology of the contemporary philosophy of nature.
... In natural computing or computing nature [39], the whole of nature is seen as a network of networks of computational processes on different levels of organization. Similarly, Zenil [58] presents the idea of a computable universe, with the ambition both to understand computation and to explore nature as computation. ...
Article
Full-text available
Defining computation as information processing (information dynamics) with information as a relational property of data structures (the difference in one system that makes a difference in another system) makes it very suitable to use operator formulation, with similarities to category theory. The concept of the operator is exceedingly important in many knowledge areas as a tool of theoretical studies and practical applications. Here we introduce the operator theory of computing, opening new opportunities for the exploration of computing devices, processes, and their networks.
... They influenced other scientific thought leaders including Bekenstein [74], Blume-Kohout & Zurek [75], Deutsch [76] [77], Esposito et al. [78], Fredkin [79] [80], Freedman et al. [81] [82], 't Hooft [83] [84], Kitaev [85] [86], Lloyd [87], Mandal & Jarzynski [88], Nayak et al. [89], Ogburn & Preskill [90], Pekola [91], Sagawa & Ueda [92], Sau et al. [93], Schmidhuber [94], Toyabe et al. [95], Wang [96], von Weizsäcker [97], Wheeler [98], Wolfram [99], Zenil [100] and Zizzi [101] to explore and further develop the physics of information within a computational Universe (or Triuniverse). Wheeler succinctly captured the essence of this relationship through his memorable phrase "it from bit". ...
Article
Full-text available
An original quantum foundations concept of a deep learning computational Universe is introduced. The fundamental information of the Universe (or Triuniverse) is postulated to evolve about itself in a Red, Green and Blue (RGB) tricoloured stable self-mutuality in three information processing loops. The colour is a non-optical information label. The information processing loops form a feedback-reinforced deep learning macrocycle with trefoil knot topology. Fundamental information processing is driven by ψ -Epistemic Drive, the Natural appetite for information selected for advantageous knowledge. From its substrate of Mathematics, the knotted information processing loops determine emergent Physics and thence the evolution of superemergent Life (biological and artificial intelligence). RGB-tricoloured information is processed in sequence in an Elemental feedback loop (R), then an Operational feedback loop (G), then a Structural feedback loop (B) and back to an Elemental feedback loop (R), and so on around the trefoil in deep learning macrocycles. It is postulated that hierarchical information correspondence from Mathematics through Physics to Life is mapped and conserved within each colour. The substrate of Mathematics has RGB-tricoloured feedback loops which are respectively Algebra (R), Algorithms (G) and Geometry (B). In Mathematics, the trefoil macrocycle is Algebraic Algorithmic Geometry and its correlation system is a Tensor Neural Knot Network enabling Qutrit Entanglement. Emergent Physics has corresponding RGB-tricoloured feedback loops of Quantum Mechanics (R), Quantum Deep Learning (G) and Quantum Geometrodynamics (B). In Physics, the trefoil macrocycle is Quantum Intelligent Geometrodynamics and its correlation system is Quantum Darwinism. Super-emergent Life has corresponding RGB-tricoloured loops of Variation (R), Selection (G) and Heredity (B). In the evolution of Life, the trefoil macrocycle is Variational Selective Heredity and its correlation ecosystem is Darwin’s ecologically “Entangled Bank”.
... Generalizing from Leibniz's project of Characteristica Universalis, we can see not only humans, but also all natural and cultural phenomena, indeed, the whole of our reality, as manifestations of a variety of computational phenomena. That view is called computationalism, natural computation or computing nature (Zenil 2012) (Dodig-Crnkovic & Giovagnoli 2013). Info-computation is a constructive theoretical framework that connects information as a structure and computation as information processing, developed in (1976) (Wheeler 1990) (Floridi 2003) (Burgin 2010), in which the world/reality is a complex fabric of informational structures, and natural computationalism (Zuse 1970) (Fredkin 1992) (Wolfram 2002) (Chaitin 2007), which argues that the universe is a computational network of networks. ...
Article
Full-text available
Similar to oil that acted as a basic raw material and key driving force of industrial society, information acts as a raw material and principal mover of knowledge society in the knowledge production, propagation and application. New developments in information processing and information communication technologies allow increasingly complex and accurate descriptions, representations and models, which are often multi-parameter, multi-perspective, multi-level and multidimensional. This leads to the necessity of collaborative work between different domains with corresponding specialist competences, sciences and research traditions. We present several major transdisciplinary unification projects for information and knowledge, which proceed on the descriptive, logical and the level of generative mechanisms. Parallel process of boundary crossing and transdisciplinary activity is going on in the applied domains. Technological artifacts are becoming increasingly complex and their design is strongly user-centered, which brings in not only the function and various technological qualities but also other aspects including esthetic, user experience, ethics and sustainability with social and environmental dimensions. When integrating knowledge from a variety of fields, with contributions from different groups of stakeholders, numerous challenges are met in establishing common view and common course of action. In this context, information is our environment, and informational ecology determines both epistemology and spaces for action. We present some insights into the current state of the art of transdisciplinary theory and practice of information studies and informatics. We depict different facets of transdisciplinarity as we see it from our different research fields that include information studies, computability, human-computer interaction, multi-operating-systems environments and philosophy.
... A related article by the same author, The Mathematician's Bias and the Return to Embodied Computation, elucidates the differences between physical computation and universal symbol manipulation [40]. From all of the above it is clear that the Turing machine model of computation is an abstraction and an idealization. In general, the trend in computing can be discerned towards an extension to more and more physics-inspired models, instead of the idealized, symbol-manipulating models, which are a subset of the former. ...
Article
Full-text available
The development of models of computation induces the development of technology and natural sciences and vice versa. The current state of the art of technology and the sciences, especially networks of concurrent processes such as the Internet or biological and sociological systems, calls for new computational models. It is necessary to extend the classical Turing machine model towards physical/natural computation. Important aspects are the openness and interactivity of computational systems, as well as the concurrency of computational processes. The development proceeds in two directions: as a search for new mathematical structures beyond algorithms, as well as a search for different modes of physical computation that are not equivalent to the actions of a human executing an algorithm, but appear in physical systems in which concurrent interactive information processing takes place. The article presents the framework of info-computationalism as applied to computing nature, where nature is an informational structure and its dynamics (information processing) is understood as computation. In natural computing, new developments both in the understanding of natural systems and in their computational modelling are needed, and those two converge and enhance each other.
... The idea of computing nature [7,8] builds on the notion that the universe as a whole can be seen as a computational system that intrinsically computes its own next state. This approach is called pancomputationalism or natural computationalism and dates back to Konrad Zuse with his Calculating Space-Rechnender Raum [9]. ...
Article
Full-text available
The Architecture of Mind as a Network of Networks of Natural Computational Processes. In discussions regarding models of cognition, the very mention of “computationalism” often incites reactions against the insufficiency of the Turing machine model, its abstractness, determinism, the lack of naturalist foundations, triviality and the absence of clarity. None of those objections, however, concerns models based on natural computation or computing nature, where the model of computation is broader than symbol manipulation or conventional models of computation. Computing nature consists of physical structures that form a layered computational architecture, with computation processes ranging from quantum to chemical, biological/cognitive and social-level computation. It is argued that, on the lower levels of information processing in the brain, finite automata or Turing machines may still be adequate models, while, on the higher levels of whole-brain information processing, natural computing models are necessary. A layered computational architecture of the mind based on the intrinsic computing of physical systems avoids objections against early versions of computationalism in the form of abstract symbol manipulation. Keywords: computational cognition; info-computationalism; cognitive architecture; natural computing; cognitive information processing; levels of organisation.
... Another is the Arbiter's Problem [4], also known as Buridan's Principle [10], a fundamental property of physics stating that a discrete decision based on a continuous variable (i.e., an analog-to-digital conversion) cannot be guaranteed to complete in bounded time; this property must be accounted for in any analysis seeking formal guarantees about discrete decisions on real numbers. Interestingly, both of these physical constraints are also closely related to considerations of computability, as is natural if the viewpoint is taken that the physical universe itself may arise from an underlying computational process [15]. ...
Article
Full-text available
This work outlines an equation-based formulation of a digital control program and transducer interacting with a continuous physical process, and an approach using the Coq theorem prover for verifying the performance of the combined hybrid system. Considering thermal dynamics with linear dissipation for simplicity, we focus on a generalizable, physically consistent description of the interaction of the real-valued temperature and the digital program acting as a thermostat. Of interest in this work is the discovery and formal proof of bounds on the temperature, the degree of variation, and other performance characteristics. Our approach explicitly addresses the need to mathematically represent the decision problem inherent in an analog-to-digital converter, which for rare values can take an arbitrarily long time to produce a digital answer (the so-called Buridan's Principle); this constraint ineluctably manifests itself in the verification of thermostat performance. Furthermore, the temporal causality constraints in the thermal physics must be made explicit to obtain a consistent model for analysis. We discuss the significance of these findings toward the verification of digital control for more complex physical variables and fields.
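To make the setting concrete, here is a minimal, illustrative sketch of the kind of system the paper verifies: a digital thermostat making a discrete on/off decision from a continuous temperature governed by linear dissipation. This is a plain Python toy model, not the paper's Coq development; all names and parameter values are assumptions for illustration only.

    from dataclasses import dataclass

    @dataclass
    class Thermostat:
        low: float             # turn the heater on at or below this temperature
        high: float            # turn the heater off at or above this temperature
        heating: bool = False

        def step(self, temperature: float) -> bool:
            # This comparison is the analog-to-digital decision the paper formalizes;
            # Buridan's Principle concerns inputs arbitrarily close to a threshold,
            # where a physical comparator cannot be guaranteed to decide in bounded time.
            if temperature <= self.low:
                self.heating = True
            elif temperature >= self.high:
                self.heating = False
            return self.heating

    def simulate(thermostat, t0, ambient, dissipation, heat_rate, dt, steps):
        # Linear-dissipation thermal model: dT/dt = dissipation*(ambient - T) + heat_rate*u
        temps, t = [], t0
        for _ in range(steps):
            u = 1.0 if thermostat.step(t) else 0.0
            t += dt * (dissipation * (ambient - t) + heat_rate * u)
            temps.append(t)
        return temps

    trace = simulate(Thermostat(low=19.0, high=21.0), t0=15.0,
                     ambient=10.0, dissipation=0.1, heat_rate=2.0, dt=0.1, steps=2000)
    print(f"temperature stayed within [{min(trace):.2f}, {max(trace):.2f}]")

The hysteresis band between low and high mirrors the point made in the abstract: any formally verified bound on the temperature must account for how, and how quickly, the transducer resolves comparisons near those thresholds.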
... Since nature continuously performs information processing that computes its next state [Chaitin 2007], every physical system performs some computation. It is very important to make a distinction between Computing Nature [Zenil 2013] and the 'Triviality account' in which every physical system performs every kind of computation. Fresco seems to be sceptical about the computing nature approach, as his focus in this book is to present the state of the art and to clear existing muddles around computation and cognition, and not so much to introduce new developments in the field: 'It remains an open question though whether embodied computation is indeed the type of computation that takes place in nature. ...
Article
Full-text available
Fresco, Nir, Physical Computation and Cognitive Science. Berlin Heidelberg: Springer, 2014, Studies in Applied Philosophy, Epistemology and Rational Ethics, Vol. 12, XXII, 229, 83,29 €. According to the author, the objective of this book is to establish a clearer understanding of computation in cognitive science, and to argue for the fundamental role of concrete (physical) computation for cognition. He succeeds in both. At the same time he is searching for the adequate scope of computation, repudiating the attempts of Putnam, Searle and others, who argued against (classical) computationalism in cognitive science, thereby trivializing computation. The book identifies ambiguities in present-day approaches to computation and presents and compares different concepts of computation and their applicability to cognitive systems. The main claim is that for computation to be effective in a cognitive system, computation must be physical (concrete). That requirement is motivated by the development of cognitive theories in the direction of embodiment and embeddedness. The corollary is that the Turing model of computation does not suffice to cover all kinds of cognitive computational processes, as it is a model of a logical procedure describing computation of a mathematical function, while cognitive processes in an organism cover a much broader range of information processing. Fresco presents computation as a concept that philosophers of computing and computer scientists as well as cognitive scientists understand in multiple ways. He lists seven different conceptions of computation predominant in the literature. The argument shows – what should be obvious in any case – that one accepted formalization of a concept (the Turing machine) neither precludes reflection on its meaning, nor prevents other quite different formalizations. At present it is common to approach cognition through computation in a particular formalization based on the Turing model of computation. However, computing is much broader than its logical aspects, and its physical implementation (dependent also on the types of objects manipulated and on time-dependent processes of execution) is an aspect very central to the understanding of cognition. In the same way as the model of the informational universe (always relative to an agent) is not trivial, because of the layered architecture of the informational universe organized in a hierarchical structure of levels of abstraction (Floridi 2009), the dynamics of that informational universe (which is also a computational universe) is not trivial either. But then, as Fresco rightly emphasizes, it is necessary to generalize the Turing model of computation to “concrete” (physical) computation. Fresco explores specifically digital physical computation (and he does not insist on a distinction between digital and discrete) – so he deliberately limits his domain. Floridi convincingly argued against digital ontology on principled grounds (Floridi 2009). Nevertheless, when it comes to the practical physical implementation of computation, current digital computers are successfully used for the calculation of continua such as those found in fluid dynamics. But that is on the modeling side, and the question is only how fine-grained a model is sufficient to represent a continuous system. The distinction continuous/discrete is not only a property of the physical world; it is a property of the relation between the cognizing agent and the world (Dodig-Crnkovic and Müller 2009, p. 164).
As the basis of an IP (information processing) account of computation, Fresco has chosen instructional information, “prescriptive in that its processing function is aimed at making something happen” (p. 140). The book presents key requirements for a physical system to perform nontrivial digital computation in terms of information processing. The system must have the capacity to:
1. Send information.
2. Receive information.
3. Store and retrieve information.
4. Process information.
5. Actualize control information. (Implementing this requirement is what separates trivial from nontrivial computation.)
In the above list of requirements, the strong influence of conventional computers is visible. As a summary, on p. 205, Fig. 8.1, there is a diagram showing the relations among the six different accounts of computation analyzed, from the most specific to the most general:
1. The PSS (physical symbol system) account: UTMs (Universal Turing Machines) and universal stored-program digital computers.
2. The FSM (formal symbol manipulation) account: program-controlled digital computing systems – special-purpose TMs, special-purpose digital computers.
3. The algorithm execution account: digital systems acting in accordance with an algorithm – FSA (Finite State Automata), hypercomputers.
4. The Mechanistic and IIP (Instructional Information Processing) accounts: logic gates, logic circuits, discrete connectionist networks.
5. The most general, “Triviality” account: every physical object computes – the Searle triviality thesis and the Putnam triviality theorem imply that every sufficiently complex system performs every digital computation.
In the list above, between items 4 and 5, the account of computing nature is missing, that is, the claim that the whole of nature computes, in general as a network of computational networks on different levels of organization (Dodig-Crnkovic 2014). It continuously performs information processing that computes its next state (Chaitin 2007), where every physical system performs some computation. It is very important to make a distinction between Computing Nature (Stepney et al. 2006; Stepney 2008) (Dodig-Crnkovic and Müller 2009) (Rozenberg et al. 2012) (Zenil 2012) and the “Triviality account”, in which every physical system performs every kind of computation. Fresco seems to be skeptical about the computing nature approach, as his focus in this book is to present the state of the art and to clear existing muddles around computation and cognition, and not so much to introduce new developments in the field. “It remains an open question though whether embodied computation is indeed the type of computation that takes place in nature. But at least traditionally, it has not been invoked as the basis of the Computational Theory of Mind (CTM).” (p. 4). See also (Fresco and Staines 2014). Even though the CTM does not assume natural computation as a basis of computational approaches to cognition, in the computing nature approach embodied computation comes naturally from the basic assumptions. If cognition is explained computationally, that computation must be embodied. The fact that traditional CTM did not realize the importance of embodiment points out CTM’s historical limitations. At the time classical computational theory of mind was developed, the belief prevailed that it would be possible to grow and sustain a conscious “brain-in-a-vat”. However, understanding of cognition has increased dramatically since the days of classical CTM, and any respectable contemporary theory of cognition must address embodiment.
Fresco in this book makes an important and correct argument that the explanatory frameworks of computationalism, connectionism and dynamicism, contrary to frequent claims, are not mutually exclusive but rather complementary approaches, suggesting the way for their integration. Some open questions that remain outside of the scope of the book Physical Computation and Cognitive Science are still of interest and should be mentioned. One fundamental perspective that is missing when it comes to cognition is the biological one. Cognition is a fundamentally biological phenomenon, and in order to be able to construct cognitive computational artifacts it is important to understand how natural cognition functions, develops, and evolves (Maturana and Varela 1980). It is hard to address cognitive phenomena without a biological perspective. The computing nature approach includes those aspects and makes them an integral part of its discourse. As a consequence of the aims and the framework chosen in the book, computers are taken to be the machines we have today, which also brings some assumptions and constraints that are not necessary. Among others, there is the assumption about the necessary infallibility of computation that is implicitly taken for granted, for example in the discussion of miscomputation (p. 41). Turing’s own view of intelligent computing machines with learning capability is different, as he claims: “There are indications however that it is possible to make the machine display intelligence at the risk of its making occasional serious mistakes.” (Turing 1947), as quoted in (Siegelmann 2013). The allowance for cognitive computation making mistakes and even fatal errors might change the arguments and conclusions offered in the book. The next discussion that I find lacking is the role of an explicit account of an agent for whom/which a process is computation. In the computing nature approach, with the Hewitt model of computation (Hewitt 2012) in the background, an agency-based view of cognition becomes visible and obvious. Instead of having one single definition of computation for all levels of organization, we can define computation in the sense of Hewitt’s model by interactions between computational agents (actors) that exchange information. The prospect of further development of computational accounts of cognition is nicely outlined in the concluding chapter of the book: “Research attention should be directed toward gaining a better understanding of the types of information processed by natural cognitive agents, how they are processed and interact and how such processing throws light on human cognitive architectures. Such research should examine how cognitive agents produce, acquire, extract, analyze and use information in learning, planning and decision making, for example. It can inform contemporary cognitive science by identifying the mechanisms in the human cognitive architecture that are necessary for these information-processing operations.” (p. 225). To sum up, the main virtues of this timely and enlightening book are: systematicity and unusual clarity in eliciting key requirements for a physical system to perform concrete digital computation, and providing a comparison between different existing approaches to cognition. The book shows clearly that computing in general is broader than abstract models of computation, and cognitive science should be based on it.
Gordana Dodig-Crnkovic, Chalmers University of Technology and University of Gothenburg
References
Chaitin, G. 2007. Epistemology as Information Theory: From Leibniz to Ω, in G. Dodig-Crnkovic, ed. Computation, Information, Cognition – The Nexus and The Liminal. Newcastle UK: Cambridge Scholars: 2–17.
Dodig-Crnkovic, G. 2007. Epistemology Naturalized: The Info-Computationalist Approach. APA Newsletter on Philosophy and Computers, 06/2: 9–13.
Dodig-Crnkovic, G. 2014. Modeling Life as Cognitive Info-Computation, in Computability in Europe 2014, eds. A. Beckmann, E. Csuhaj-Varjú, and K. Meer, LNCS. Berlin Heidelberg: Springer: 153–163.
Dodig-Crnkovic, G. and Mueller, V. 2009. A Dialogue Concerning Two World Systems: Info-Computational vs. Mechanistic, in Information and Computation, eds. G. Dodig-Crnkovic and M. Burgin. Singapore: World Scientific Pub Co Inc.
Floridi, L. 2009. Against digital ontology. Synthese, 168/1: 151–178.
Fresco, N. and Staines, P. 2014. A revised attack on computational ontology. Minds and Machines, 24/1: 101–122.
Hewitt, C. 2012. What is computation? Actor Model versus Turing's Model, in A Computable Universe: Understanding Computation and Exploring Nature As Computation, ed. H. Zenil. World Scientific Publishing Company/Imperial College Press: 159–177.
Maturana, H. and Varela, F. 1980. Autopoiesis and Cognition: The Realization of the Living. Dordrecht, Holland: D. Reidel Pub. Co.
Rozenberg, G., Bäck, T. and Kok, J.N., eds. 2012. Handbook of Natural Computing. Berlin Heidelberg: Springer.
Siegelmann, H.T. 2013. Turing on Super-Turing and Adaptivity. Progress in Biophysics and Molecular Biology, 113/1: 117–126.
Stepney, S. et al. 2006. Journeys in Non-Classical Computation II: Initial Journeys and Waypoints. Int. J. Parallel Emerg. Distr. Syst., 21: 97–125.
Stepney, S. 2008. The neglected pillar of material computation. Physica D: Nonlinear Phenomena, 237/9: 1157–1164.
Zenil, H., ed. 2012. A Computable Universe: Understanding Computation and Exploring Nature As Computation. Singapore: World Scientific Publishing Company/Imperial College Press.
... For other disciplines such as medicine and biology, the subject is also very important as it affects the life progress and aging of humans and all living beings [8,9]. In life sciences, ecology and environmental sciences, the problem of system change pathways is still of high priority in solving problems such as river and shore sedimentation and erosion, as well as investigating ecological and environmental growth and balance [10][11][12][13]. The study of system change pathways is mainly related to the investigation of all factors affecting such changes. ...
Article
Full-text available
This paper is directed toward presenting a novel approach based on “consolidity charts” for the analysis of natural and man-made systems during their change pathway or course of life. The physical significance of the consolidity chart (region) is that it marks the boundary of all system interactive behavior resulting from all exhaustive internal and external influences. For instance, at a specific event state, the corresponding consolidity region describes all the plausible points of normalized input–output (fuzzy or non-fuzzy) interactions. These charts are developed as each event step for zone scaling of system parameters changes due to affected events or varying environments “on and above” their normal operation or set points and following the “time driven-event driven-parameters change” paradigm. Examples of the consolidity trajectory movement in the regions or patterns centers in the proposed charts of various consolidity classes are developed showing situations of change pathways from the unconsolidated form to the consolidated ones and vice versa. It is shown that the regions comparisons are based on type of consolidity region geometric shapes properties. Moreover, it is illustrated that the centerlines connecting consolidity regions during the change pathway could follow some certain type of trajectories designated as “consolidity pathway trajectory” that could assume various forms including zigzagging patterns depending on the consecutive affected influences. Implementation procedures are elaborated for the consolidity chart analysis of four real life case studies during their conventional and unconventional change pathways, describing: (i) the drug concentration production problem, (ii) the prey–predator population problem, (iii) the spread of infectious disease problem and (iv) the HIV/AIDS Epidemic problem. These solved case studies have lucidly demonstrated the applicability and effectiveness of the suggested consolidity chart approach that could open the door for a comprehensive analysis of system change pathway of many other real life applications. Examples of the fields of these applications are engineering, materials sciences, biology, medicine, geology, life sciences, ecology, environmental sciences and other important disciplines.
... The main reason is that reversible computation is not possible for the operation of erasure. Deleting information has a thermodynamic cost, and the authors argue that consciousness or sophisticated information processing will certainly need to erase information (see also Zenil 2012). ...
Article
Full-text available
The death of our universe is as certain as our individual death. Some cosmologists have elaborated models which would make the cosmos immortal. In this paper, I examine them as cosmological extrapolations of immortality narratives that civilizations have developed to face death anxiety. I first show why cosmological death should be a worry, then I briefly examine scenarios involving the notion of soul or resurrection on a cosmological scale. I discuss to what extent an intelligent civilization could stay alive by engaging in stellar, galactic and universal rejuvenation. Finally, I argue that leaving a cosmological legacy via universe making is an inspiring and promising narrative to achieve cosmological immortality.
Article
Full-text available
Information transmission via communication between agents is ubiquitous on Earth, and is a vital facet of living systems. In this paper, we aim to quantify this rate of information transmission associated with Earth’s biosphere and technosphere (i.e., a measure of global information flow) by means of a heuristic order-of-magnitude model. By adopting ostensibly conservative values for the salient parameters, we estimate that the global information transmission rate for the biosphere might be ∼10²⁴ bits/s, and that it may perhaps exceed the corresponding rate for the current technosphere by ∼9 orders of magnitude. However, under the equivocal assumption of sustained exponential growth, we find that information transmission in the technosphere can potentially surpass that of the biosphere ∼90 years in the future, reflecting its increasing dominance.
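As a rough illustration of the abstract's order-of-magnitude reasoning (this sketch is not from the paper itself), the quoted ~90-year crossover can be reproduced by holding the biosphere rate fixed at ~10²⁴ bits/s while the technosphere, starting ~9 orders of magnitude lower, grows exponentially; the 26% annual growth figure below is an illustrative assumption chosen only to match the abstract's numbers.

import math

# Back-of-envelope sketch (not the paper's model): crossover time for an
# exponentially growing technosphere against a static biosphere.
biosphere_rate = 1e24      # bits/s, figure quoted in the abstract
technosphere_rate = 1e15   # bits/s, ~9 orders of magnitude lower (assumption)
annual_growth = 1.26       # assumed growth factor per year (illustrative only)

years = math.log(biosphere_rate / technosphere_rate) / math.log(annual_growth)
print(f"crossover in about {years:.0f} years")  # roughly 90 years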
Article
The text is a commentary on the accompanying translation of O antropolizie [On Anthropolysis] by Benjamin Bratton, the Californian design theorist and author of a new model of political geography. Since Bratton has not previously been translated into Polish, I briefly introduce his person and work, focusing above all on the book The Stack: On Software and Sovereignty (2016). Critically discussing the concept of the Stack, which its author intends to revolutionise thinking about power and sovereignty, I situate Bratton within recent intellectual history, present the theoretical assumptions underlying his thought, and identify his inspirations. I then discuss the text On Anthropolysis, showing how the notion of anthropolysis resolves some of the difficulties arising from a reading of The Stack.
Article
Full-text available
Pancomputationalism is quite a wide-ranging concept, but most of its variants, either implicitly or explicitly, rely on Turing's conceptualizations of a computer and computing, which are obvious anthropomorphisms. This paper questions the concept of pancomputationalism based on Turing computing and asks what concept of computation can be used to avoid the constraints of anthropomorphisation.
Chapter
In the long journey of the human mind attempting to decode the workings of reality, one trusted companion has to be abandoned: the materialistic and reductionistic scientific worldview. What new notion should fill the void? Slowly a novel worldview is emerging, supported by different theoretical traditions. Most intriguingly, at the nexus of these formal approaches a new ontology of reality is becoming most apparent. Two novel mantras are spreading through humanity’s collective mind: “Information is physical” and “Information represents the ultimate nature of reality.” These surprisingly simple assertions have many deep consequences. Information theory is the wellspring of our contemporary digital world. Computation is, in essence, information processing. Then, information can be harnessed for mechanical work. Moreover, some of the pioneers of modern theoretical physics have, for a long time, suspected that information plays a fundamental role in nature. One striking consequence of this paradigm is that reality is inherently finite. Infinities can only be found in the abstract thought systems of the human mind. Essentially, there is a limit to how many bits of information can be stored in any region of space. The amalgamation of information theory, black hole thermodynamics, and string theory is hinting at a radical ontology: The universe is a hologram. In other words, our three-dimensional reality is an illusion created by the information content encoded on a two-dimensional area. Indeed, space and time appear to be emergent properties arising from pure quantum entanglement. Then, in very recent developments, string theory and theoretical computer science are conspiring to spearhead this novel probe into the heart of reality. The recalcitrant theory of quantum mechanics is reborn in a more approachable quantum-computational framework. In this novel information-theoretic context, the universe easily can be interpreted as a vast simulation. Level of mathematical formality: intermediate.
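For reference, the limit on stored information alluded to above is usually stated as the holographic (Bekenstein–Hawking) bound; the following formulation is a standard one from the literature, added here as a supplementary note rather than taken from the chapter:

\[
  N_{\mathrm{bits}} \;\le\; \frac{A}{4\,\ell_P^{2}\,\ln 2},
  \qquad
  \ell_P = \sqrt{\frac{\hbar G}{c^{3}}} \approx 1.6\times 10^{-35}\,\mathrm{m},
\]

that is, the maximal number of bits storable in a region scales with the area A of its bounding surface rather than with its volume, which is what motivates the holographic reading described above.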
Chapter
Full-text available
Cognitive science is considered to be the study of mind (consciousness and thought) and intelligence in humans. Under such definition variety of unsolved/unsolvable problems appear. This article argues for a broad understanding of cognition based on empirical results from i.a. natural sciences, self-organization, artificial intelligence and artificial life, network science and neuroscience, that apart from the high level mental activities in humans, includes sub-symbolic and sub-conscious processes, such as emotions, recognizes cognition in other living beings as well as extended and distributed/social cognition. The new idea of cognition as complex multiscale phenomenon evolved in living organisms based on bodily structures that process information, linking cognitivists and EEEE (embodied, embedded, enactive, extended) cognition approaches with the idea of morphological computation (info-computational self-organisation) in cognizing agents, emerging in evolution through interactions of a (living/cognizing) agent with the environment.
Article
Full-text available
I will analyse Floridi’s rejection of digital ontologies and his positive proposal for an informational structural realism (ISR). I intend to show that ISR is still fundamentally a digital ontology, albeit with metaphysical commitments different from those that Floridi rejects. I will argue that even though Floridi deploys the method of levels of abstraction adapted from computer science, and has established a Kantian transcendentalist conception of both information and structure, ISR still reduces to a discretised binary, and therefore digital, ontology. The digital ontologies that Floridi rejects are John Wheeler’s “It from Bit” conception and computational (including pancomputational) metaphysics (although there are others). They are rejected predominantly on the basis that they rely upon a false dichotomy between digital discrete and continuous metaphysics (with which I agree). ISR involves a Kantian transcendentalist conception of de re relations that is intended to avoid this false dichotomy. However, I will argue that the binary, discrete, digital component of digital ontology is retained in ISR, and therefore ISR is still a digital ontology, since its conception of information reduces to binary discrete de re relations. As such, ISR comes down on one side of the rejected ontic dichotomy of digital metaphysics, and so an informational metaphysics that is not a digital ontology remains a promissory note in the philosophy of information.
Chapter
Gigerenzer and coauthors have described a remarkably fast and direct way of generating new theories that they term the tools-to-theories heuristic. Call it the TTT heuristic or simply TTT. TTT links established methods to new theories in an intimate way that challenges the traditional distinction of context of discovery and context of justification. It makes heavy use of rhetorical tropes such as metaphor. This chapter places the TTT heuristic in additional historical, philosophical, and scientific contexts, especially informational biology and digital physics, and further explores its strengths and weaknesses in relation to human limitations and scientific realism.
Chapter
In the context of research efforts on causal sets as discrete models of physical spacetime, and on their derivation from simple, deterministic, sequential models of computation, we consider boolean nets, a transition system that generalises cellular automata, and investigate the family of causal sets that derive from their computations, in search of interesting emergent properties. The choice of boolean nets is motivated by the fact that they naturally support composition via a LOTOS-inspired parametric parallel operator, with possibly interesting effects on the emergent structure of the derived causal sets.
Article
Full-text available
Germany, 1935: the engineer Konrad Zuse (1910–1995), in the living room of his Berlin house, devotes himself to the design and construction of a binary, programmable machine, the Z1, capable of processing data quickly and efficiently. While building his machines, he also started to devise a conceptual and notational system for writing ‘programs’ to execute applications much more complex than basic arithmetic calculations. He delved deep into the study of formal logic in order to work out his “computation plan”, the Plankalkül. Although, owing to the hardships of post-World War II Germany, the Plan Calculus exercised little immediate impact, it displays all the traits currently recognized as standard features of modern programming languages. The aim of the present study is to highlight the general purpose and technical specifics of this language, its historical and scientific background, and the philosophical inspiration that led Konrad Zuse to employ predicate logic in the formalization of the “computation projects” for his machines.
Article
Full-text available
This article presents the evolution of philosophical and methodological considerations concerning empiricism in computer/computing science. In this study, we trace the most important currents in the history of reflection on computing. The forerunners of Artificial Intelligence, H.A. Simon and A. Newell, started these considerations in their paper Computer Science as Empirical Inquiry (1975). Later the concept of empirical computer science was developed by S.S. Shapiro, P. Wegner, A.H. Eden and P.J. Denning, who showed various empirical aspects of computing. This led to a view of the science of computing (or the science of information processing) as a science of general scope. Some interesting contemporary paths toward a generalized perspective on computation are also shown (e.g. natural computing).
Article
Full-text available
Digital physics claims that the entire universe is, at the very bottom, made out of bits; as a result, all physical processes are intrinsically computational. For that reason, many digital physicists go further and affirm that the universe is indeed a giant computer. The aim of this article is to make explicit the ontological assumptions underlying such a view. Our main concern is to clarify what kind of properties the universe must instantiate in order to perform computations. We analyse the logical form of the two models of computation traditionally adopted in digital physics, namely, cellular automata and Turing machines. These models are computationally equivalent, but we show that they support different ontological commitments about the fundamental properties of the universe. In fact, cellular automata are compatible with a rather traditional form of physicalism, whereas Turing machines support a dualistic ontology, which could be understood as a realism about the laws of nature or, alternatively, as a kind of panpsychism.
Chapter
A cellular automaton collider is a finite state machine built of rings of one-dimensional cellular automata. We show how a computation can be performed on the collider by exploiting interactions between gliders (particles, localisations). The constructions proposed are based on the universality of elementary cellular automaton rule 110, cyclic tag systems, supercolliders, and computing on rings.
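As a flavour of the underlying machinery (a minimal Python sketch, not the authors' construction), the following evolves elementary cellular automaton rule 110 on a ring, i.e. with the cyclic boundary that the collider's rings rely on; glider interactions become visible in the printed space-time diagram.

import random

RULE = 110  # the rule number encodes the output bit for each of the 8 neighbourhoods

def step(cells):
    # One synchronous rule-110 update on a cyclic (ring) configuration.
    n = len(cells)
    return [(RULE >> ((cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n])) & 1
            for i in range(n)]

row = [random.randint(0, 1) for _ in range(64)]  # random initial ring
for _ in range(32):                              # print a short space-time diagram
    print("".join(".#"[c] for c in row))
    row = step(row)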
Chapter
This is a personal, and to a great extent autobiographical, view on natural computing, especially DNA and membrane computing, having as background the author's work in these research areas over the last (more than) two decades. The discussion ranges from precise (though informal) computer science and mathematical issues to very general issues related, e.g., to the history of natural computing, its tendencies, and questions (deemed to remain questions, debatable) of a, say, philosophical flavor.
Article
Full-text available
The idea of obtaining a pilot-wave quantum theory on a lattice with discrete time is presented. The motion of quantum particles is described by a |Ψ|²-distributed Markov chain. Stochastic matrices of the process are found by the discrete version of the least-action principle. Probability currents are the consequence of Hamilton’s principle, and the stochasticity of the Markov process is minimized. As an example, the stochastic motion of single particles in a double-slit experiment is examined.
Chapter
In this chapter we derive numerical methods to solve the first-order differential equation dy/dt = f(t, y) for 0 < t (7.1), where y(0) = α (7.2). This is known as an initial value problem (IVP), and it consists of the differential equation (7.1) along with the initial condition in (7.2). Numerical methods for solving this problem are first derived for the case when there is one differential equation. Afterwards, the methods are extended to problems involving multiple equations.
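By way of illustration (a minimal sketch, not the chapter's own code), the simplest such method is the forward Euler scheme y_{k+1} = y_k + h f(t_k, y_k); the test problem below, y' = -y with y(0) = 1, is an illustrative choice whose exact solution e^{-t} lets the error be checked directly.

import math

def euler(f, alpha, T, n):
    # Approximate y(T) for the IVP y' = f(t, y), y(0) = alpha,
    # using n forward Euler steps of size h = T / n.
    h = T / n
    t, y = 0.0, alpha
    for _ in range(n):
        y += h * f(t, y)   # y_{k+1} = y_k + h f(t_k, y_k)
        t += h
    return y

approx = euler(lambda t, y: -y, 1.0, 1.0, 1000)  # y' = -y, y(0) = 1
print(approx, math.exp(-1.0))                    # agree to about three digits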
Chapter
This chapter provides a brief introduction to the floating-point number system used in most scientific and engineering applications. A few examples are given in the next section illustrating some of the challenges of using finite-precision arithmetic, but it is worth quoting Donald Knuth to get things started. If you are unfamiliar with him, he was instrumental in the development of the analysis of algorithms, and is the creator of TeX. Anyway, here are the relevant quotes (Knuth [1997]):
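Two standard pitfalls of finite-precision arithmetic of the kind the chapter warns about can be seen in a few lines of Python (an illustrative sketch, not taken from the chapter):

import math

# 1. Representation error: 0.1 has no exact binary representation,
#    so decimal-looking sums are not exact.
print(0.1 + 0.2 == 0.3)             # False
print(sum(0.1 for _ in range(10)))  # 0.9999999999999999, not 1.0

# 2. Catastrophic cancellation: subtracting nearly equal numbers
#    wipes out the significant digits.
x = 1e-8
naive = (1.0 - math.cos(x)) / x**2          # cos(x) rounds to exactly 1.0, so this is 0.0
stable = 2.0 * math.sin(x / 2.0)**2 / x**2  # algebraically identical, numerically ~0.5
print(naive, stable)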
Conference Paper
Full-text available
Future progress of new information processing devices capable of dealing with problems such as big data, the Internet of Things, the semantic web, cognitive robotics, neuroinformatics and the like depends on adequate and efficient models of computation. We argue that, defining computation as information transformation, and given that there is no information without representation, the dynamics of information on the fundamental level is physical/intrinsic/natural computation (Dodig-Crnkovic, 2011) (Dodig-Crnkovic, 2014). Intrinsic natural computation occurs on a variety of levels of physical processes, such as the levels of computation of living organisms as well as of designed computational devices. The present article builds on our typology of models of computation as information processing (Burgin & Dodig-Crnkovic, 2013). It indicates future paths for the advancement of the field, expected both as a result of the development of new computational models and of learning from nature how to compute better using the information transformation mechanisms of intrinsic computation.
Conference Paper
Full-text available
This paper presents a taxonomy of models of computation. It includes Existential (Physical, Abstract and Cognitive), Organizational, Temporal, Representational, Domain/Data, Operational, Process-oriented and Level-based taxonomies. It is connected to the more general notion of natural computation, intrinsic to physical systems, and particularly to cognitive computation in living organisms and artificial cognitive systems. Computation is often understood through the Turing machine model, in the fields of computability and computational complexity, and even as a basis for present-day computer hardware and software architectures. However, several aspects of computation, even those present in today's applications, are left out of this model; thus adequate models of real-time, distributed, self-organized, resource-aware, adaptive, learning computational systems are currently being developed.
Conference Paper
After briefly mentioning the motivation and the “dreams” of unconventional computing (with an eye on natural computing, especially on bio-inspired computing), we ask ourselves whether these “dreams” are realistic, and end with a couple of related research issues from the membrane computing area.
Article
Full-text available
> Context • At present, we lack a common understanding of both the process of cognition in living organisms and the construction of knowledge in embodied, embedded cognizing agents in general, including future artifactual cognitive agents under development, such as cognitive robots and softbots. > Purpose • This paper aims to show how the info-computational approach (IC) can reinforce constructivist ideas about the nature of cognition and knowledge and, conversely, how constructivist insights (such as that the process of cognition is the process of life) can inspire new models of computing. > Method • The info-computational constructive framework is presented for the modeling of cognitive processes in cognizing agents. Parallels are drawn with other constructivist approaches to cognition and knowledge generation. We describe how cognition as a process of life itself functions based on info-computation and how the process of knowledge generation proceeds through interactions with the environment and among agents. > Results • Cognition and knowledge generation in a cognizing agent is understood as interaction with the world (potential information), which by processes of natural computation becomes actual information. That actual information after integration becomes knowledge for the agent. Heinz von Foerster is identified as a precursor of natural computing, in particular bio-computing. > Implications • IC provides a framework for the unified study of cognition in living organisms (from the simplest ones, such as bacteria, to the most complex ones) as well as in artifactual cognitive systems. > Constructivist content • It supports the constructivist view that knowledge is actively constructed by cognizing agents and shared in a process of social cognition. IC argues that this process can be modeled as info-computation. > Key words • Constructivism, info-computationalism, computing nature, morphological computing, self-organization, autopoiesis.
Article
Full-text available
The metacomputation system (MS) is proposed to support the hypothesis that the universe is the processing output of a computer simulation. The MS is derived from a 3-tier hierarchy metaphysics model and consists of three faculties: data, program and processor. The MS is the unprocessed existence of creation; the processing output of the MS is the processed existence of creation. The model is developed from the convergence of metaphysical and computational theories. It offers a new perspective and clarity on many important concepts and phenomena that have perplexed humans for millennia, including: consciousness, existence, creation, time, space, multiverse, reality, laws of nature, language, entity, mind, experience, thought, feeling, emotion, sensation and action. The model predicts the existence of powered voxels, the fundamental building blocks of the physical universe.
Data
Full-text available
2012. We offer the reader a series of short notes on the development of artificial intelligence in the West. We do not set ourselves the goal of presenting a comprehensive overview of the main directions in the history of artificial intelligence (AI). These notes are the authors' subjective perception of facts in the field of AI that they find tremendously interesting. In October 2012 alone, some thirty international conferences on AI are scheduled, twenty in November, and thirty in December (see, for example, www.conferencealerts.com). This year alone, an immense number of books on the history of AI have already been published. What accounts for such interest? One reason is the personality and the contribution to science and to the history of humanity of one of the founders of AI theory, Alan Turing (1912–1954), whose centenary is being celebrated around the world: a man who hastened the victory over Nazism, and to whom Gordon Brown, the then Prime Minister of the United Kingdom, officially apologised in 2009. In addition, since humans are, alas, not perfect, the theme of artificial intelligence and the possibility of improving ourselves is especially attractive. Humanity has existed for millennia, and yet the ancient injunction "Know thyself" has still not been fulfilled. We are warned about artificial intelligence (Roman V. Yampolskiy, a researcher at the University of Louisville, Kentucky) with assurances that its development will lead to the destruction of the human race. At the same time, we are persuaded (Warwick, Kurzweil) that the future of humanity, posthumanism, lies in the merging of human and machine. The cyborg is the new superman.
Article
Full-text available
This is an answer to the commentaries on my target article "Info-computational Constructivism and Cognition" for Constructivist Foundations. The variety of commentaries has shown that IC has an impact on many disciplines, from physics to biology, to cognitive science, to ethics. Given its young age, IC still needs to fill in many gaps, some of which were pointed out by the commentators. My goal is both to illuminate some general topics of info-computationalism and to answer specific questions in that context.