The dynamics of natural systems, and particularly of organic systems specialized in self-organization and complexity management, present a vast source of ideas for new approaches to computing, such as natural computing and its special case, organic computing. Based on paninformationalism (the understanding of all physical structures as informational) and pancomputationalism, or natural computationalism (the understanding of the dynamics of physical structures as computation), a new approach, info-computational naturalism, emerges as their synthesis. This includes a naturalistic view of mind and hence a naturalized epistemology, grounded in the evolution from inanimate to biological systems through the increasing complexity of informational structures under natural computation. Learning at the info-computational level about structures and processes in nature, especially those in intelligent and autonomous biological agents, enables the development of advanced autonomous, adaptive, intelligent artifacts and makes possible a connection, both theoretical and practical, between organic and inorganic systems.
This book presents a comprehensive, non-model-theoretic theory of ontic necessity and possibility within a formal (and formalized) ontology consisting of states of affairs, properties, and individuals. Its central thesis is that all modalities are reducible to intrinsic (or "logical") possibility and necessity if reference is made to certain states of affairs, called "bases of necessity." The viability of this Bases-Theory of Modality is demonstrated also for conditionals, including counterfactual conditionals. The book treats not only the ontological aspects of the philosophy of modality but also its epistemology, showing that the Bases-Theory of Modality provides a satisfactory solution to the epistemological problem of modality. In addition to developing that theory, the book includes detailed discussions of positions in the philosophy of modality maintained by Alvin Plantinga, David Lewis, Charles Chihara, Graeme Forbes, David Armstrong, and others. Among the themes treated are: possibilism vs. actualism; the theory of essences; conceivability and possibility; the nature of possible worlds; and the nature of logical, nomological, and metaphysical possibility and necessity.
Collision-Based Computing presents a unique overview of computation with mobile self-localized patterns in non-linear media, including computation in optical media, mathematical models of massively parallel computers, and molecular systems.
It covers such diverse subjects as conservative computation in billiard ball models and its cellular-automaton analogues, implementation of computing devices in lattice gases, Conway's Game of Life and discrete excitable media, theory of particle machines, computation with solitons, logic of ballistic computing, phenomenology of computation, and self-replicating universal computers.
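As a minimal sketch of the "mobile self-localized patterns" these books study, the snippet below implements Conway's Game of Life in plain Python and checks that a glider, the classic mobile pattern used as a signal carrier in collision-based schemes, reappears translated by one cell after its four-generation period. The code and its names are our illustration, not material from the book itself.

```python
from collections import Counter

def step(cells):
    """Advance a set of live (x, y) cells by one Game of Life generation."""
    neighbour_counts = Counter(
        (x + dx, y + dy)
        for (x, y) in cells
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # A cell is live next generation if it has exactly 3 live neighbours,
    # or 2 live neighbours and is already live (standard B3/S23 rules).
    return {
        cell
        for cell, n in neighbour_counts.items()
        if n == 3 or (n == 2 and cell in cells)
    }

# A glider: the smallest pattern that translates itself across the grid.
glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}

state = glider
for _ in range(4):          # one full glider period
    state = step(state)

# After 4 generations the glider has moved diagonally by (1, 1):
# the "particle" has travelled, which is what collision-based
# computing exploits to carry signals between gates.
assert state == {(x + 1, y + 1) for (x, y) in glider}
```

In collision-based schemes, logical values are encoded in the presence or absence of such travelling patterns, and gates arise where their trajectories intersect.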
Collision-Based Computing will be of interest to researchers working on relevant topics in Computing Science, Mathematical Physics and Engineering. It will also be useful background reading for postgraduate courses such as Optical Computing, Nature-Inspired Computing, Artificial Intelligence, Smart Engineering Systems, Complex and Adaptive Systems, Parallel Computation, Applied Mathematics and Computational Physics.
We live in a world where the complexity of the systems people create and study grows beyond all imaginable limits. Computers, their software, and their networks are among the most complicated systems of our time, and science is the only efficient tool for dealing with this overwhelming complexity. One of the methodologies developed in science is the axiomatic approach, which has proved very powerful in mathematics. In this book, the authors further develop the axiomatic approach in computer science initiated by Floyd, Manna, Blum, and other researchers. In the traditional constructive setting, different classes of algorithms (programs, processes, or automata) are studied separately, with only some indication of the relations between these classes. In this way, the constructive approach gave birth to the theory of Turing machines, the theory of partial recursive functions, the theory of finite automata, and other theories of constructive models of algorithms. The axiomatic context, by contrast, allows one to study whole collections of classes of algorithms, automata, and processes, united by common properties expressed in the form of axioms. As a result, the axiomatic approach moves higher in the hierarchy of computer and network models, thereby reducing the complexity of their study.
Join the authors on a journey as they describe the possibility of computers composed of nothing more than chemicals. Unlikely as it sounds, the book introduces 'reaction-diffusion computing', a field which could in time revolutionise computing and robotics.
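The computational medium here is a chemical wave. As a toy illustration, and not the book's chemistry, the sketch below integrates a one-dimensional reaction-diffusion equation with bistable (Nagumo-type) kinetics and checks that an excited region invades the resting medium as a travelling front; all parameter values are illustrative assumptions of ours.

```python
# Toy 1-D reaction-diffusion front: du/dt = D * d2u/dx2 + u(1-u)(u-a).
# For 0 < a < 0.5 the excited state u=1 invades the resting state u=0,
# producing the propagating wavefront that reaction-diffusion
# computers use to carry information through the medium.

D = 1.0            # diffusion coefficient (illustrative)
a = 0.25           # reaction threshold (illustrative)
dx, dt = 0.5, 0.05  # grid spacing and time step (dt*D/dx**2 = 0.2, stable)
n, steps = 200, 2000

# Excited region on the left, resting medium on the right.
u = [1.0 if i < 20 else 0.0 for i in range(n)]

def front(u):
    """Index of the rightmost cell whose concentration exceeds 0.5."""
    return max(i for i, v in enumerate(u) if v > 0.5)

start = front(u)
for _ in range(steps):
    # Discrete Laplacian with reflecting (zero-flux) boundaries.
    lap = [u[max(i - 1, 0)] - 2 * u[i] + u[min(i + 1, n - 1)]
           for i in range(n)]
    u = [u[i] + dt * (D * lap[i] / dx**2 + u[i] * (1 - u[i]) * (u[i] - a))
         for i in range(n)]

assert front(u) > start  # the chemical wavefront has advanced rightwards
```

Where two such fronts meet, they interact rather than pass through one another, and it is exactly these interactions that reaction-diffusion computing harnesses to implement logic.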