Conference Paper

PROCEEDINGS OF THE SYMPOSIUM ON NATURAL/UNCONVENTIONAL COMPUTING AND ITS PHILOSOPHICAL SIGNIFICANCE, AISB/IACAP 2012, 2nd–6th July 2012


Abstract

http://www.mrtc.mdh.se/~gdc/work/AISB-IACAP-2012/NaturalComputingProceedings-2012-06-22.pdf
Figure: Collective link, and colimit of a pattern P.

Memory Evolutive Systems (MES) give a model, based on a 'dynamic' Category Theory incorporating time and durations, for complex multi-scale systems with the following characteristics:

(i) The system is evolutionary: its components and their links vary over time. The few models of complex systems using category theory (e.g., those inspired by [24]) consider only one category, representing the invariant structure of the system. In MES, by contrast, the system is represented not by a unique category but by an Evolutive System consisting of a family of categories K_t, representing the successive configurations of the system at each time t, and partial transition functors from K_t to K_t' accounting for the change from t to t'.

(ii) The system is hierarchical, with a tangled hierarchy of components varying over time. A component C of a given level 'binds' at least one pattern P of interacting components of lower levels, so that C, and P acting collectively, have the same functional role. Modeling this hierarchy raises the Binding Problem: how do simple objects bind together to form "a whole that is greater than the sum of its parts" [1], and how can such "wholes" interact? In the categorical setting, the 'whole' C is represented by the colimit of the pattern P of interacting simple objects, and the interactions between wholes are described in the same setting.

(iii) There is emergence of complex multiform components, with the development of a flexible central memory. Whence the Emergence Problem: how can the 'real' complexity of an object be measured, and what condition makes possible the emergence over time of increasingly complex structures and processes? We characterize this condition as the Multiplicity Principle [12], a kind of 'flexible redundancy' which ensures the existence of multiform components. We prove that it is necessary for the emergence of increasingly complex objects and processes with multiform presentations, constructed by iterated complexification processes [11].

(iv) The system has a multi-agent self-organization. Its global dynamics are modulated by the cooperation/competition of a network of internal functional subsystems, the co-regulators, with the help of a long-term memory. Each co-regulator operates locally with its own rhythm, logic and complexity, but their different commands can conflict and must be harmonized. While the local dynamics are amenable to conventional computations, the problem is different for the global one.

MENS is a MES whose level 0 represents the 'physical' neural system (neurons and synapses), while its higher
… 
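Point (ii) above identifies the "whole" C with the colimit of the pattern P. As a toy illustration only (not the MES formalism itself, which works over evolving categories), a colimit in the category of finite sets can be computed concretely: take the disjoint union of the pattern's objects and glue elements along the links, using a union-find structure. All names in this sketch are invented for the example.

```python
# Toy sketch: colimit of a pattern of finite sets, computed as the
# disjoint union of the objects quotiented by the identifications
# imposed by the links. The resulting quotient plays the role of the
# binding component C of the pattern P. Hypothetical example, for
# illustration of the colimit idea only.

def colimit(objects, links):
    """objects: dict name -> list of elements
       links:   list of (src, tgt, mapping), one per link f: src -> tgt,
                where mapping is a dict elem -> f(elem)
       Returns the equivalence classes of the glued disjoint union,
       as frozensets of (object_name, element) pairs."""
    # Union-find over the disjoint union; each element starts alone.
    parent = {}

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    def union(a, b):
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[ra] = rb

    for name, elems in objects.items():
        for e in elems:
            parent[(name, e)] = (name, e)

    # Glue: each link f: A -> B identifies x in A with f(x) in B.
    for src, tgt, mapping in links:
        for x, fx in mapping.items():
            union((src, x), (tgt, fx))

    # Collect the equivalence classes of the quotient.
    classes = {}
    for node in parent:
        classes.setdefault(find(node), set()).add(node)
    return [frozenset(c) for c in classes.values()]

# Two components A and B interacting through a shared interface S:
# the simplest non-trivial colimit, a pushout.
objects = {"A": [1, 2], "B": [2, 3], "S": [2]}
links = [("S", "A", {2: 2}), ("S", "B", {2: 2})]
glued = colimit(objects, links)
print(len(glued))  # 3 classes: {A1}, {A2, B2, S2}, {B3}
```

The shared element is identified across A, B and S, so the glued object is strictly smaller than the disjoint union: the "whole" records exactly how the parts overlap, which is what the colimit's universal property captures.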
References

Chapter
The dynamics of natural systems, and particularly of organic systems specialized in self-organization and complexity management, present a vast source of ideas for new approaches to computing, such as natural computing and its special case, organic computing. From paninformationalism (the understanding of all physical structures as informational) and pancomputationalism or natural computationalism (the understanding of the dynamics of physical structures as computation), a new approach, info-computational naturalism, emerges as their synthesis. It includes a naturalistic view of mind, and hence a naturalized epistemology, based on the evolution from inanimate to biological systems through the increase in complexity of informational structures by natural computation. Learning at the info-computational level about structures and processes in nature, especially those in intelligent and autonomous biological agents, enables the development of advanced autonomous adaptive intelligent artifacts and makes possible connections (both theoretical and practical) between organic and inorganic systems.
Book
This book presents a comprehensive, non-model-theoretic theory of ontic necessity and possibility within a formal (and formalized) ontology consisting of states of affairs, properties, and individuals. Its central thesis is that all modalities are reducible to intrinsic (or "logical") possibility and necessity if reference is made to certain states of affairs, called "bases of necessity." The viability of this Bases-Theory of Modality is shown also in the case of conditionals, including counterfactual conditionals. Besides the ontological aspects of the philosophy of modality, the book also treats the epistemology of modality, showing that the Bases-Theory of Modality provides a satisfactory solution to the epistemological problem of modality. In addition to developing that theory, the book includes detailed discussions of positions in the philosophy of modality maintained by Alvin Plantinga, David Lewis, Charles Chihara, Graeme Forbes, David Armstrong, and others. Among the themes treated are: possibilism vs. actualism; the theory of essences; conceivability and possibility; the nature of possible worlds; and the nature of logical, nomological, and metaphysical possibility and necessity.
Book
Collision-Based Computing presents a unique overview of computation with mobile self-localized patterns in non-linear media, including computation in optical media, mathematical models of massively parallel computers, and molecular systems. It covers such diverse subjects as conservative computation in billiard-ball models and their cellular-automaton analogues, implementation of computing devices in lattice gases, Conway's Game of Life and discrete excitable media, the theory of particle machines, computation with solitons, the logic of ballistic computing, the phenomenology of computation, and self-replicating universal computers. Collision-Based Computing will be of interest to researchers working on relevant topics in Computing Science, Mathematical Physics and Engineering. It will also be useful background reading for postgraduate courses such as Optical Computing, Nature-Inspired Computing, Artificial Intelligence, Smart Engineering Systems, Complex and Adaptive Systems, Parallel Computation, Applied Mathematics and Computational Physics.
Book
We are living in a world where the complexity of systems created and studied by people grows beyond all imaginable limits. Computers, their software and their networks are among the most complicated systems of our time. Science is the only efficient tool for dealing with this overwhelming complexity, and one of the methodologies developed in science is the axiomatic approach, which has proved very powerful in mathematics. In this book, the authors develop further an axiomatic approach in computer science initiated by Floyd, Manna, Blum and other researchers. In the traditional constructive setting, different classes of algorithms (programs, processes or automata) are studied separately, with some indication of the relations between these classes. In this way, the constructive approach gave birth to the theory of Turing machines, the theory of partial recursive functions, the theory of finite automata, and other theories of constructive models of algorithms. The axiomatic context, by contrast, allows one to study collections of classes of algorithms, automata, and processes; these classes are united in a collection by common properties expressed in the form of axioms. As a result, the axiomatic approach moves higher in the hierarchy of computer and network models, thereby reducing the complexity of their study.
Book
Join the authors on a journey in which they describe the possibility of computers composed of nothing more than chemicals. Unlikely as it sounds, the book introduces the topic of 'reaction-diffusion computing', a topic which could in time revolutionise computing and robotics.