Project

Complex Systems in Physics, Biology, and Medicine: Backgrounds, Understanding, Modeling, and Software

Goal: The first goal is to disseminate knowledge of the core concepts of complex systems, in an easy-to-follow form, to everyone who would like to learn more about them.

The most important goal is to enable everyone to design complex systems that produce a desired output, which is a very difficult task with the currently available tools. It requires the study of literally thousands of complex systems (CSs); only then is it possible to reach the edge of what is known about CSs.

Prior to the publication of a review paper about CSs in biology and medicine, the poster

https://www.researchgate.net/publication/305751754_COMPLEX_SYSTEMS_AND_THEIR_USE_IN_BIOMEDICAL_RESEARCH_MATHEMATICAL_CONCEPTS_OF_SELF-ORGANIZATION_AND_EMERGENCE

serves as a good starting point for all who are interested in the background of CSs in medicine and biology. It is a truly exciting area of research with a great future, one that will very probably lead quickly to the development of personalized medicine. Work on it has already begun (see the project "Classification of ECGs and Prediction of Arrhythmias").

A review paper serving as a concise starting point for study:

https://www.researchgate.net/publication/330546521_Complex_Systems_and_Their_Use_in_Medicine_Concepts_Methods_and_Bio-Medical_Applications

Methods: Biology, Artificial Intelligence, Theory of Computation, Mathematics, Medicine, Physics, Machine Learning, C++, Python, Self-Organization, Ecosystem Functioning, Agent Based Modeling, Complex Networks, Scientific Visualization, Qt, Life Sciences, Cellular Automata, Mathematical Modeling, Computational Modeling, Computer Simulations, Computer Modelling, Emergence, Complex Systems, Massively Parallel Computations, Complexity

Date: 3 April 2017

Updates: 29
Recommendations: 53
Followers: 67
Reads: 1467

Project log

Jiří Kroc
added a research item
The implementation of the OR logic gate within the massively parallel computational environment simulated by John H. Conway's 'Game of Life' cellular automaton. Other, similar codes provide implementations of the AND and NOT gates; all the gates are implemented using the GoL-N24 Python software. It is a well-known fact that the AND, OR, and NOT logic gates suffice to build any processing unit and memory, which are the two basic components in the construction of any computer. Animations of all OR-gate inputs are demonstrated along with the input configurations that simulate them (TBA along with the next improved software version).
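The gates above build on the standard GoL update rule. As a minimal sketch (this is a generic illustration, not the GoL-N24 software itself, which additionally supports extended neighborhoods), one synchronous GoL step on a toroidal grid can be written with numpy:

```python
import numpy as np

def gol_step(world):
    """One synchronous update of Conway's Game of Life on a toroidal
    grid: a live cell survives with 2 or 3 live neighbors, and a dead
    cell is born with exactly 3 live neighbors."""
    # Sum the 8 neighbors using periodic boundary conditions.
    neighbors = sum(
        np.roll(np.roll(world, dy, axis=0), dx, axis=1)
        for dy in (-1, 0, 1) for dx in (-1, 0, 1)
        if (dy, dx) != (0, 0)
    )
    born = (world == 0) & (neighbors == 3)
    survive = (world == 1) & ((neighbors == 2) | (neighbors == 3))
    return (born | survive).astype(np.uint8)

# A "blinker" oscillates with period two. Logic gates are built by
# colliding streams of moving patterns (gliders) so that the outcome
# of the collisions encodes the gate's truth table.
world = np.zeros((6, 6), dtype=np.uint8)
world[2, 1:4] = 1
assert np.array_equal(gol_step(gol_step(world)), world)
```

The assertion at the end verifies the period-two behavior of the blinker, the simplest oscillating emergent in the GoL.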
Jiří Kroc
added a research item
This program, GoL-N24-v1.0.py, is an open-source, advanced, interactive Python implementation of the 'Game of Life' cellular automaton with an extended neighborhood, where 8 neighbors are selected prior to the simulation from the 24 possible cells within a 5 x 5 neighborhood area. This gives a total of 735 471 possible combinations of eight neighboring cells. *** The main idea behind this software is to invite everyone to explore the rich space of all neighborhoods and to find interesting neighbor combinations of this generalized GoL rule. It is expected that, hidden within all the possible combinations, there might be neighborhoods that lead to unique complex emergent structures. *** We have recently learned from biological experiments that biological systems undergo fairly complex internal computations and decisions, which are mostly, and to our surprise, based on the application of very simple local evolution/decision rules. This program allows us to study an example of such a system with a truly huge space of possible neighborhood combinations, which calls for the cooperation of a team of researchers and computer-science enthusiasts. This cellular automaton can help us gain valuable insights into the behavior of such systems. *** Be aware that the released version does not allow loading and saving of CA worlds and neighborhood selections (this software upgrade is already developed, is being tested now, and will be released in a few weeks). Hence, it is up to the user to make sure that exactly 8 neighbors are selected (as in the original GoL). For now, interesting configurations can be saved only in graphical form, using the save option available in the matplotlib window. *** The implementation uses the numpy Python library in the back-end and the matplotlib Python library, with an event-driven design, in the front-end. Please report any development hints, mistakes, and errors to the author. Cite the program as declared in its preamble. Enjoy the program.
(A sample video: https://youtu.be/gc42D2sdL8I).
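The neighborhood count quoted above is simply the number of ways to choose 8 cells out of the 24 non-center cells of a 5 x 5 block, which is easy to verify:

```python
from math import comb

# A 5 x 5 neighborhood has 25 cells; excluding the center leaves
# 24 candidates, from which 8 neighbors are chosen.
print(comb(24, 8))  # → 735471
```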
Jiří Kroc
added an update
Mark Alber, Adrian Buganza Tepole, William R. Cannon, Suvranu De, Salvador Dura-Bernal, Krishna Garikipati, George Karniadakis, William W. Lytton, Paris Perdikaris, Linda Petzold and Ellen Kuhl
npj Digital Medicine (2019) 2:115
DOI: 10.1038/s41746-019-0193-y
Abstract:
Fueled by breakthrough technology developments, the biological, biomedical, and behavioral sciences are now collecting more data than ever before. There is a critical need for time- and cost-efficient strategies to analyze and interpret these data to advance human health. The recent rise of machine learning as a powerful technique to integrate multimodality, multifidelity data, and reveal correlations between intertwined phenomena presents a special opportunity in this regard. However, machine learning alone ignores the fundamental laws of physics and can result in ill-posed problems or non-physical solutions. Multiscale modeling is a successful strategy to integrate multiscale, multiphysics data and uncover mechanisms that explain the emergence of function. However, multiscale modeling alone often fails to efficiently combine large datasets from different sources and different levels of resolution. Here we demonstrate that machine learning and multiscale modeling can naturally complement each other to create robust predictive models that integrate the underlying physics to manage ill-posed problems and explore massive design spaces. We review the current literature, highlight applications and opportunities, address open questions, and discuss potential challenges and limitations in four overarching topical areas: ordinary differential equations, partial differential equations, data-driven approaches, and theory-driven approaches. Towards these goals, we leverage expertise in applied mathematics, computer science, computational biology, biophysics, biomechanics, engineering mechanics, experimentation, and medicine. Our multidisciplinary perspective suggests that integrating machine learning and multiscale modeling can provide new insights into disease mechanisms, help identify new targets and treatment strategies, and inform decision making for the benefit of human health.
###
CONCLUSIONS
Machine learning and multiscale modeling naturally complement and mutually benefit from one another. Machine learning can explore massive design spaces to identify correlations and multiscale modeling can predict system dynamics to identify causality. Recent trends suggest that integrating machine learning and multiscale modeling could become key to better understand biological, biomedical, and behavioral systems. Along those lines, we have identified five major challenges in moving the field forward.
The first challenge is to create robust predictive mechanistic models when dealing with sparse data. The lack of sufficient data is a common problem in modeling biological, biomedical, and behavioral systems. For example, it can result from an inadequate experimental resolution or an incomplete medical history. A critical first step is to systematically identify the missing information. Experimentally, this can guide the judicious acquisition of new data or even the design of new experiments to complement the knowledge base. Computationally, this can motivate supplementing the available training data by performing computational simulations. Ultimately, the challenge is to maximize information gain and optimize efficiency by combining low- and high-resolution data and integrating data from different sources, which, in machine learning terms, introduces a multifidelity, multimodality approach.
The second challenge is to manage ill-posed problems. Unfortunately, ill-posed problems are relatively common in the biological, biomedical, and behavioral sciences and can result from inverse modeling, for example, when identifying parameter values or identifying system dynamics. A potential solution is to combine deterministic and stochastic models. Coupling the deterministic equations of classical physics—the balance of mass, momentum, and energy—with the stochastic equations of living systems—cell-signaling networks or reaction-diffusion equations—could help guide the design of computational models for problems that are otherwise ill-posed. Along those lines, physics-informed neural networks and physics-informed deep learning are promising approaches that inherently use constrained parameter spaces and constrained design spaces to manage ill-posed problems. Beyond improving and combining existing techniques, we could even think of developing entirely novel architectures and new algorithms to understand ill-posed biological problems inspired by biological learning.
The third challenge is to efficiently explore massive design spaces to identify correlations. With the rapid developments in gene sequencing and wearable electronics, personalized biomedical data has become as accessible and inexpensive as never before. However, efficiently analyzing big datasets within massive design spaces remains a logistic and computational challenge. Multiscale modeling allows us to integrate physics-based knowledge to bridge the scales and efficiently pass information across temporal and spatial scales. Machine learning can utilize these insights for efficient model reduction towards creating surrogate models that drastically reduce the underlying parameter space. Ultimately, the efficient analytics of big data, ideally in real time, is a challenging step towards bringing artificial intelligence solutions into the clinic.
The fourth challenge is to robustly predict system dynamics to identify causality. Indeed, this is the actual driving force behind integrating machine learning and multiscale modeling for biological, biomedical, and behavioral systems. Can we eventually utilize our models to identify relevant biological features and explore their interaction in real time? A very practical example of immediate translational value is whether we can identify disease progression biomarkers and elucidate mechanisms from massive datasets, for example, early biomarkers of neurodegenerative disease, by exploiting the fundamental laws of physics. On a more abstract level, the ultimate challenge is to advance data- and theory-driven approaches to create a mechanistic understanding of the emergence of biological function to explain phenomena at a higher scale as a result of the collective action on lower scales.
The fifth challenge is to know the limitations of machine learning and multiscale modeling. Important steps in this direction are analyzing sensitivity and quantifying uncertainty. While machine learning tools are increasingly used to perform sensitivity analysis and uncertainty quantification for biological systems, they are at a high risk of overfitting and generating non-physical predictions. Ultimately, our approaches can only be as good as the underlying models and the data they have been trained on, and we have to be aware of model limitations and data bias. Preventing overfitting, minimizing data bias, and increasing rigor and reproducibility have been and will always remain the major challenges in creating predictive models for biological, biomedical, and behavioral systems.
 
Jiří Kroc
added an update
Massively parallel computations are known to operate in biological systems; they have great potential for the description and understanding of the emergent phenomena observed there. Some models of simple emergent phenomena exist, but they usually lack robustness, which poses a great mathematical challenge.
This became the motivation and main goal of this study: to pave the road towards the development of robust, massively parallel cellular-automata models of natural phenomena observed in biology.
Below, links are provided, in a concise way, to the publication, software, and all simulations related to the following publication.
The link to the paper 'Robust massive parallel information processing environments in biology and medicine: case study':
The simplest Python program of the 'Game of Life', called GoL:
Animations of the GoL program from the paper, and more:
The Python program of the robust GoL, named r-GoL:
Sample animations of the r-GoL program from the paper:
 
Jiří Kroc
added an update
Robust emergent computations are very important to our understanding of the computations and evaluations carried out in all living entities. This task poses a great challenge at the contemporary intersection of biology and mathematics.
One of the three available sample animations, demonstrating the behavior of a robust algorithm that generalizes the 'Game of Life' cellular automaton, can be seen here:
Details about the algorithm and results are published in the paper
###
All animations are available at this link and are free to download:
The code itself will be available soon.
 
Jiří Kroc
added a research item
The current frontiers in the description and simulation of advanced physical and biological phenomena observed within all scientific disciplines point toward the importance of developing robust mathematical descriptions that are error resilient. Complexity research lacks deeper knowledge of the design methodology of processes capable of recreating this robustness, which is studied here on massively parallel computations (MPCs) implemented by cellular automata (CA). A simple, two-state cellular automaton with an extremely simple updating rule, created by John H. Conway and called the 'Game of Life' (GoL), is applied to simulate the AND logic gate using emergents. This is followed by simulations of a robust, generalized GoL, which works with nine states instead of two and is called R-GoL (open-source); the extra states enable higher intercellular communication. The logic gate simulated by the GoL is destroyed by the injection of random faulty evaluations with even the smallest probability. The simulations of the R-GoL are initiated with random values. Several types of emergent structures, robust to the injection of random errors, are observed for different setups of the R-GoL rule. The GoL is not robust. The R-GoL is capable of creating and maintaining oscillating, emergent structures that are robust under the constant injection of random, faulty evaluations with up to 1% of errors. The R-GoL expresses long-range synchronization, which, together with robustness, is facilitated by the designed intercellular communication. ..... (Software & video links are available in the file Software-and-animation-links-to--r-GoL.txt, see below)
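The robustness test described above (constant injection of random faulty evaluations) can be sketched generically: after each synchronous update, every cell's state is corrupted independently with a small probability. The two-state majority rule below is an illustrative stand-in, not the nine-state R-GoL rule itself:

```python
import numpy as np

rng = np.random.default_rng(seed=1)

def majority_step(world):
    # Toy two-state rule: each cell copies the majority state of
    # itself and its four von Neumann neighbors (toroidal grid).
    total = (world
             + np.roll(world, 1, axis=0) + np.roll(world, -1, axis=0)
             + np.roll(world, 1, axis=1) + np.roll(world, -1, axis=1))
    return (total >= 3).astype(np.uint8)

def noisy_step(world, step, p_error):
    """Apply one CA update, then flip each cell's state with
    probability p_error, modeling random faulty evaluations."""
    new = step(world)
    faults = rng.random(new.shape) < p_error
    return np.where(faults, 1 - new, new).astype(np.uint8)

# With p_error = 0 the dynamics are unchanged; with p_error = 0.01
# roughly 1% of cells are corrupted per step, the error regime in
# which the R-GoL emergents are reported to survive.
world = rng.integers(0, 2, size=(50, 50), dtype=np.uint8)
assert np.array_equal(noisy_step(world, majority_step, 0.0),
                      majority_step(world))
```

A rule is robust in this sense when its emergent structures persist over many applications of `noisy_step` with nonzero `p_error`.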
Jiří Kroc
added an update
Carlos Eduardo Maldonado
Proceedings 2022, 81, 19.
DOI: 10.3390/proceedings2022081019
Abstract:
This paper argues that life is best understood in light of a physics of the immaterial. Life is not properly seen or touched, for instance, but conceived, imagined, intuited. In order to rightly grasp life in general, we need not reduce it in any sense, hence its counterintuitive character. The claim is based on five arguments: life is much more a process than a series of components; the first law of thermodynamics is important in thinking about processes; life entails a twofold perspective that opens up the window, so to speak, to the possible rather than only the actual; living beings are not machines in any sense of the word (biological hypercomputation); and life is an autopoietic or self-organized phenomenon. Some conclusions are drawn at the end.
Keywords: quantum mechanics; information theory; origin of life; logics of life; complexity science
###
Comments by J.K.:
The very origins of biological computation and its principles still evade our understanding. It is always good to look at our achievements from a greater distance and to give them some unifying features. Bridging various scientific disciplines is very demanding and requires deep insights into all the disciplines involved.
On the other hand, such a universal overview can spur other researchers to explore directions and ideas that they never thought about before.
 
Jiří Kroc
added an update
CARLOS E. MALDONADO & NELSON A. GOMEZ CRUZ
Complexity 20:4 (2014) 8-18
DOI:10.1002/cplx.21535
Abstract:
This article discusses the meaning and scope of biological hypercomputation (BH), which is to be considered as a new research problem within the sciences of complexity. The framework here is computational, setting out that life is not a standard Turing Machine. Living systems, we claim, hypercompute, and we aim at understanding life not by what it is, but rather by what it does. The distinction is made between classical and nonclassical hypercomputation. We argue that living processes are nonclassical hypercomputation. BH then implies new computational models. Finally, we sketch out the possibilities, stances, and reach of BH.
Key Words: complex systems; biological information processing; nonclassical hypercomputation; theoretical biology; complexification of computation
###
Biological Hypercomputation:
....
Life is a network of intertwined scales from bacteria to archaea to eukarya, and vice versa—on to the biosphere. In other words, from genes to cells to organs to systems on to the organism, and the interplay between homeostatic and homeorhetic processes. Life’s interaction with the environment is a question of resisting to the physical forces of the environment— homeorhesis. This is what Varela [94] dubs as the autonomy of living systems. However, at the same time once other living systems belong to the environment, the interaction is the interplay of cooperation and competition.
Life is a nonalgorithmic phenomenon. An algorithmic system works in three sequential steps, thus: input, processing, output, and they can never be mixed or intertwined. Computation literally happens in the second level, provided the input, and it takes place as a closed system (black box). Being essentially open, life never works in clearly stratified or differentiated sequential levels, for energy, matter, and information are constantly entering, being processed and produced in parallel, distributed, cross-linked dynamics. This is where BH happens and this is precisely what is to be understood. Hence, life does synthesize new information and new mechanisms for processing the new information—new information not present in original data.
...
###
Concluding Remarks:
...
So far, hypercomputation has not been capable of understanding biological processes, and develop a series of tools. In any case, to be sure, there is one more complex arena that arises here, namely how the mind hypercomputes. This issue raises questions and possibilities not just for the computation sciences and biology but also for philosophy, epistemology, and mathematics, if not also for logics. We have aimed at understanding life from a computational point of view; but it should be clear that we do not pretend to reduce life to a computational framework.
 
Jiří Kroc
added an update
Richard Wong, Stefan Geyer, Wolfgang Weninger, Jean-Claude Guimberteau & Jason K. Wong
Experimental Dermatology (2015) 25(2)
DOI: 10.1111/exd.12832
Abstract:
The skin is often viewed as a static barrier that protects the body from the outside world. Emphasis on studying the skin's architecture and biomechanics in the context of restoring skin movement and function is often ignored. It is fundamentally important that, if skin is to be modeled or developed, we do not focus only on the biology of skin but also aim to understand its mechanical properties and structure in living dynamic tissue. In this review, we describe the architecture and patterning of skin as viewed from a surgical perspective, highlight aspects of the microanatomy that have never been fully realized, and provide evidence and concepts that support the importance of studying living skin's dynamic behaviour. We highlight how the structure of the skin has evolved to allow the body dynamic form and function, and how injury, disease or ageing results in dramatic changes to the microarchitecture and changes the physical characteristics of skin. Therefore, appreciating the dynamic microanatomy of skin, from the deep fascia through to the skin surface, is vitally important from a dermatological and surgical perspective. This focus provides an alternative perspective and approach to addressing skin pathologies and skin ageing.
Key words: ageing – architecture – dynamic anatomy – patterning – skin
###
Conclusions:
Knowledge of the microarchitectural patterning seen in skin is vital to understanding how to aesthetically close wounds but also is an important blueprint to understanding how to develop skin replacements with the same mechanical properties so that they appear ‘life like’. The manipulation of skin goes beyond purely understanding the ‘grain’ of skin; hence, to achieve precise restoration and replacement needs to consider the continuum of skin below the surface aesthetics.
The appreciation of the architectural continuum of skin provides us with many concepts that help us better understand how ageing, disease and injury affect the skin health and cosmesis. By studying the physical and temporal dynamism of skin, we can further appreciate, simulate or engineer more realistic skin.
###
This review provides a very important starting point for all researchers working in the area of mathematical modeling of biological structures.
The mechanical nature of the problems related to the functioning of all skin layers makes the mutual identification of real and simulated processes occurring in living skin easier.
This work is recommended to all who are searching for a promising and easy-to-grasp area for multiscale modeling techniques applied to the functioning of biological structures.
 
Jiří Kroc
added an update
Per Bak, Chao Tang & Kurt Wiesenfeld
Physical Review Letters 59 (1987) 381-384
Abstract:
We show that dynamical systems with spatial degrees of freedom naturally evolve into a self-organized critical point. Flicker noise, or 1/f noise, can be identified with the dynamics of the critical state. This picture also yields insight into the origin of fractal objects.
###
There do not exist many research papers that shape our understanding of the Universe so profoundly. This paper, which explains the very principles of self-organized criticality in complex systems, is one of those rare papers.
It is worth knowing the research results presented here by heart, as they help us navigate through the murky waters of complexity in the hard sciences and biology.
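The mechanism behind the paper is easy to reproduce: grains are dropped onto a grid, and any site reaching the threshold of 4 topples, sending one grain to each neighbor and possibly triggering avalanches of every size. A minimal sketch of the Bak-Tang-Wiesenfeld toppling rule (open boundaries: grains falling off the edge are lost) follows:

```python
import numpy as np

def relax(grid):
    """Stabilize a Bak-Tang-Wiesenfeld sandpile: topple every site
    holding >= 4 grains until none remains; return the stable grid
    and the avalanche size (total number of topplings)."""
    grid = grid.copy()
    topplings = 0
    while True:
        unstable = grid >= 4
        if not unstable.any():
            return grid, topplings
        topplings += int(unstable.sum())
        grid[unstable] -= 4
        # Each toppling site donates one grain to each of its four
        # neighbors; grains pushed across the boundary are lost.
        u = unstable.astype(grid.dtype)
        grid[1:, :] += u[:-1, :]
        grid[:-1, :] += u[1:, :]
        grid[:, 1:] += u[:, :-1]
        grid[:, :-1] += u[:, 1:]

# A single overloaded interior site relaxes in exactly one toppling,
# leaving one grain on each of its four neighbors.
pile = np.zeros((5, 5), dtype=int)
pile[2, 2] = 4
stable, size = relax(pile)
assert size == 1 and stable[2, 2] == 0 and stable.sum() == 4
```

Dropping grains at random sites and relaxing after each drop drives the pile toward the critical state, where avalanche sizes follow the power-law distribution reported in the paper.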
 
Jiří Kroc
added an update
Joseph O. Dada and Pedro Mendes
Integrative Biology 3 (2011) 86–96
DOI: 10.1039/c0ib00075b
Abstract:
The aim of systems biology is to describe and understand biology at a global scale where biological functions are recognised as a result of complex mechanisms that happen at several scales, from the molecular to the ecosystem. Modelling and simulation are computational tools that are invaluable for description, prediction and understanding these mechanisms in a quantitative and integrative way. Therefore the study of biological functions is greatly aided by multi-scale methods that enable the coupling and simulation of models spanning several spatial and temporal scales. Various methods have been developed for solving multi-scale problems in many scientific disciplines, and are applicable to continuum based modelling techniques, in which the relationship between system properties is expressed with continuous mathematical equations or discrete modelling techniques that are based on individual units to model the heterogeneous microscopic elements such as individuals or cells. In this review, we survey these multi-scale methods and explore their application in systems biology.
###
Discussion:
Systems biology aims to describe and understand the operation of complex biological systems and ultimately to develop predictive models of human disease [80]. This objective requires integration of knowledge from diverse biological components and data that span several spatial and temporal scales into models of the system as a whole. We described approaches that can be used to tackle multi-scale problems in systems biology including both continuum and ABM based multi-scale modelling approaches. We discussed the emerging multi-scale models for various biological systems and processes. These models use different strategies for modelling at different scales and for coupling between scales.
... continues
###
Multiscale modeling is of huge importance in the modeling and description of biological processes in eukaryotic living creatures. To be able to understand the latest models, it is always good to look at the early stages of the development of the methodology. This review provides a list of different methodological approaches to multiscale modeling in general.
 
Jiří Kroc
added an update
Hiroki Sayama
State University of New York at Binghamton
(2015) Open SUNY, ISBN 13: 9781942341093
Keep up to date on Introduction to Modeling and Analysis of Complex Systems at http://bingweb.binghamton.edu/~sayama/textbook/
About the Book
Introduction to the Modeling and Analysis of Complex Systems introduces students to mathematical/computational modeling and analysis developed in the emerging interdisciplinary field of Complex Systems Science. Complex systems are systems made of a large number of microscopic components interacting with each other in nontrivial ways. Many real-world systems can be understood as complex systems, where critically important information resides in the relationships between the parts and not necessarily within the parts themselves.
This textbook offers an accessible yet technically-oriented introduction to the modeling and analysis of complex systems. The topics covered include: fundamentals of modeling, basics of dynamical systems, discrete-time models, continuous-time models, bifurcations, chaos, cellular automata, continuous field models, static networks, dynamic networks, and agent-based models. Most of these topics are discussed in two chapters, one focusing on computational modeling and the other on mathematical analysis. This unique approach provides a comprehensive view of related concepts and techniques, and allows readers and instructors to flexibly choose relevant materials based on their objectives and needs. Python sample codes are provided for each modeling example.
Table of Contents
• Introduction
• Fundamentals of Modeling
• Basics of Dynamical Systems
• Discrete-Time Models I: Modeling
• Discrete-Time Models II: Analysis
• Continuous-Time Models I: Modeling
• Continuous-Time Models II: Analysis
• Bifurcations
• Chaos
• Interactive Simulation of Complex Systems
• Cellular Automata I: Modeling
• Cellular Automata II: Analysis
• Continuous Field Models I: Modeling
• Continuous Field Models II: Analysis
• Basics of Networks
• Dynamical Networks I: Modeling
• Dynamical Networks II: Analysis of Network Topologies
• Dynamical Networks III: Analysis of Network Dynamics
• Agent-Based Models
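The discrete-time-model chapters can be previewed with the canonical textbook example of such a model, the logistic map; the snippet below is a generic illustration under that assumption, not code taken from the book:

```python
def logistic_map(r, x0, n):
    """Iterate the discrete-time model x_{t+1} = r * x_t * (1 - x_t),
    the standard example for fixed points, bifurcations, and chaos
    as the growth parameter r is varied."""
    xs = [x0]
    for _ in range(n):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

# For r = 2.5 the orbit converges to the fixed point x* = 1 - 1/r = 0.6;
# for r = 4.0 the map is chaotic and nearby orbits diverge quickly.
print(round(logistic_map(2.5, 0.2, 100)[-1], 6))  # → 0.6
```

Sweeping `r` from 2.5 up to 4.0 and plotting the long-run values of the orbit reproduces the bifurcation diagram discussed in the Bifurcations and Chaos chapters.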
 
Jiří Kroc
added an update
Filippo Castiglione, Francesco Pappalardom, Carlo Bianca, Giulia Russo, Santo Motta
BioMed Research International 2014(902545), DOI: 10.1155/2014/902545
Abstract
It is nowadays becoming clearer that, in order to obtain a unified description of the different mechanisms governing the behavior and causality relations among the various parts of a living system, the development of comprehensive computational and mathematical models at different space and time scales is required. This is one of the most formidable challenges of modern biology, characterized by the availability of huge amounts of high-throughput measurements. In this paper we draw attention to the importance of multiscale modeling in the framework of studies of biological systems in general and of the immune system in particular.
###
From 'Concluding Remarks':
...
The goal of computational systems biology is to consider a biological system from a holistic perspective and use both experiments and modeling to reveal how the system behaves [4, 77]. Multiscale models able to exploit laboratory and clinical data at different levels can potentially bridge knowledge gaps between what is observed at the gene/molecular level and the clinical evolution of complex diseases [11].
...
###
This paper is recommended to everyone who wants to understand the basic principles of multiscale modeling.
 
Jiří Kroc
added an update
Jiri Kroc
preprint, February 8, 2022
The link to the paper:
Abstract:
The current frontiers in the description and simulation of advanced physical and biological phenomena observed within all scientific disciplines point directly toward the importance of having robust models that are error resilient. Complexity research lacks deeper knowledge of the design methodology of processes capable of recreating the observed robustness. The first part provides a concise introduction to this difficult research area for non-specialists. This is followed by the second part of the paper, which focuses on a description that provides a certain level of the desired robustness. The introductory part describes the basic principles, demonstrated on a simple cellular automaton with an extremely simple updating rule, created by John Conway: the 'Game of Life' (aka the GoL). This allows us to enter the second, research part, dealing with a simple CA rule that generalizes the GoL and possesses an increased robustness against errors, along with a very interesting set of emergent structures capable of unexpectedly long-range synchronization.
***
The link to the open-source 'Game of Life' cellular automaton software in Python:
***
The link to the open-source generalized, robust 'Game of Life' software in Python:
***
The link to sample runs of the generalized, robust 'Game of Life' software (above):
 
Jiří Kroc
added 2 research items
>>> Read the improved, published version with links to the Python software and animations here: https://www.researchgate.net/publication/361818826_Robust_massive_parallel_information_processing_environments_in_biology_and_medicine_case_study <<< ...... The current frontiers in the description and simulation of advanced physical and biological phenomena observed within all scientific disciplines point directly toward the importance of developing robust mathematical descriptions that are error resilient. Complexity research lacks deeper knowledge of the design methodology of processes capable of recreating this robustness. The first, introductory part of the paper provides a concise introduction to this difficult research area for non-specialists. This is followed by the second part, which focuses on a description that provides a certain level of the desired robustness. The first part describes the basic principles of massively parallel computations (MPC), demonstrated on a simple cellular automaton with an extremely simple updating rule, created by John H. Conway: the 'Game of Life' (aka the GoL). This allows us to enter the second, research part, dealing with a simple CA rule that generalizes the GoL and possesses an increased robustness against errors, along with a very interesting set of emergent structures expressing long-range synchronization.
<<< The code is a part of the paper: https://www.researchgate.net/publication/361818826_Robust_massive_parallel_information_processing_environments_in_biology_and_medicine_case_study >>>
The presented emergent structures were discovered during research on modifications of the 'Game of Life' proposed by John Conway, aiming towards the discovery of robust emergents that are immune to randomly injected faulty evaluations. See details in the paper (link provided below), where these simulations are presented only in figures. Many scientifically and visually appealing simulations arose there, and the observed structures might be interesting to researchers working in biology and other disciplines. Animations always have a great expressive power; see the PNG animations below. Make sure that your viewer can display animated PNG figures! In general, it is observed that emergent structures of various types arise within this specific cellular automaton, all of them having a period of two. What is surprising is the fact that these structures are long-range correlated, and some of them even turn by ninety degrees. From a certain point of view, we can say that the observed emergents possess a very stable binary behavior that deserves a deeper study. Surprisingly, no moving emergents were observed in this type of cellular automaton.
Simulated videos of the code runs are shown at the link: https://www.researchgate.net/publication/357285926_Python_program_simulating_cellular_automaton_r-GoL_that_represents_robust_generalization_of_%27Game_of_Life%27_sample_runs
The software belongs to the paper that is available here: https://www.researchgate.net/publication/361818826_Robust_massive_parallel_information_processing_environments_in_biology_and_medicine_case_study/
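The injection of faulty evaluations can be emulated with a short sketch. This is an illustration only: it uses the standard GoL rule with a per-cell flip probability as a stand-in for the paper's actual r-GoL rule and fault model, which differ.

```python
import numpy as np

def gol_step(grid, fault_rate=0.0, rng=None):
    """One synchronous 'Game of Life' step on a periodic lattice.
    Each freshly computed cell state is flipped with probability
    `fault_rate`, emulating randomly injected faulty evaluations."""
    # count live Moore neighbours via shifted copies of the lattice
    n = sum(np.roll(np.roll(grid, i, axis=0), j, axis=1)
            for i in (-1, 0, 1) for j in (-1, 0, 1)
            if (i, j) != (0, 0))
    new = np.where(grid == 1, (n == 2) | (n == 3), n == 3).astype(int)
    if fault_rate > 0.0:
        rng = np.random.default_rng() if rng is None else rng
        faults = rng.random(grid.shape) < fault_rate  # wrongly evaluated cells
        new = np.where(faults, 1 - new, new)
    return new

# Fault-free sanity check: a blinker oscillates with period two.
g = np.zeros((5, 5), dtype=int)
g[2, 1:4] = 1
g2 = gol_step(gol_step(g))   # two steps return the initial pattern
```

Running many steps with a small `fault_rate` and checking whether a structure survives is the kind of robustness experiment the paper performs with its generalized rule.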
Jiří Kroc
added an update
Robert T. Wainwright
WSC '74: Proceedings of the 7th conference on Winter simulation - Vol. 2 ( January 1974) 449–459, DOI: 10.1145/800290.811303
Abstract:
The game of Life involves forms built out of simple birth and death rules which a computer puts through a series of rapid transformations. This game was invented by John Horton Conway and recently introduced in Scientific American by Martin Gardner. Many computers have been programmed to play the game of Life. In this paper we shall show how to return the compliment by making Life forms that can imitate computers. Then we shall see that many remarkable consequences follow from the existence of such constructions. Further we shall see that in Life there exists the possibility of organisms with the ability to duplicate themselves, to reproduce. It has even been suggested that the universe itself is space-time granular and that the future although completely deterministic is unpredictable, being its own fastest simulation.
###
Many of us are wondering: "Why does a link exist between the theory of complex systems and living entities?" The 'Game of Life' encoded in a primitive computing environment and the paper presented here give the answer to everyone who wants to know more.
This paper represents the simplest possible proof of the capability of complex systems to carry out universal computation equivalent to the Turing machine. It proves that we are able to build computers or computing environments within any complex system that has sufficient flexibility. Living entities possess such, and even greater, levels of complexity.
Those interested in the open-source code of the 'Game of Life' can download it here:
 
Jiří Kroc
added a research item
The whole computer can be simulated using the cellular automaton called the 'Game of Life', designed by John Conway in 1970, using emergent structures of the first order. The presented simulations aim to demonstrate that environments utilizing massive parallel computations (MPC), including living organisms, can carry out Turing-machine-equivalent computations using elementary, localized computations. In other words, MPCs can evaluate everything that is computable. The design of an AND logic gate is demonstrated using the following emergent structures: three glider guns and a stopper. The two leftmost glider guns represent a logical input equal to one when they are present; when missing, the input equals zero. The rightmost glider gun serves as the control. Computations are carried out by collisions of gliders generated by the respective glider guns. The output goes in the bottom-right direction if and only if the inputs 1 and 1 are present. Pathologies: an example of incorrectly shifted glider guns that do not lead to the mutual annihilation of colliding gliders. Next-level emergent structures are observed appearing from collisions of the first-level emergents called gliders; these emerging structures are called butterflies. *** Viewing the PNG animations requires software capable of displaying the APNG format! *** Software used: https://www.researchgate.net/publication/355043921_The_simplest_Python_program_simulating_a_cellular_automaton_model_of_a_complex_system_the_'Game_of_Life'/stats
Jiří Kroc
added an update
Jiri Kroc
Many people are wondering why the GoL has become so popular among complex systems researchers, and what its connections to the theory of computation are. The answer is that we can build the logic gates AND, OR, and NOT within the GoL, which are the building blocks of all currently used human-made computers. This led to the proof of the GoL's equivalence to the Turing machine.
It is fascinating to observe functioning logic gates implemented in a kind of computing medium that has a very high potential once implemented in wet computers.
In these animations, you see an AND logic gate implemented within the GoL. It contains two inputs (A and B), made from glider guns on the left side emitting to the right, and a controlling stream of gliders emitted from the rightmost glider gun. The first animation (A=1 & B=1) demonstrates the case when both inputs equal one (the glider guns are present and emitting gliders) and an output propagates toward the bottom right. Other input combinations emit no output, as expected from a well-behaved AND gate; see, for example, the second animation, where only the input A=1 and the control glider gun are present.
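The universality claim behind the animations, that AND, OR, and NOT suffice to build everything else, can be illustrated at the Boolean level with plain Python stand-ins for the three gates (the glider-gun implementation itself lives in the animations and is not sketched here):

```python
# Boolean stand-ins for the three gates realized by glider streams
# in the GoL; every other circuit can be composed from them alone.
def AND(a, b):
    return a and b

def OR(a, b):
    return a or b

def NOT(a):
    return not a

def XOR(a, b):
    # XOR composed purely from AND, OR, NOT: (a OR b) AND NOT(a AND b)
    return AND(OR(a, b), NOT(AND(a, b)))

def half_adder(a, b):
    # returns (sum bit, carry bit) -- the first step toward arithmetic
    return XOR(a, b), AND(a, b)
```

Chaining half adders gives full adders, and from there registers and a whole arithmetic unit, which is exactly why three gates suffice for Turing-equivalent hardware.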
 
Jiří Kroc
added an update
Joakim Sundnes
Springer, Simula SpringerBriefs on Computing (2020) ISBN 978-3-030-50355-0
from the Preface
...
"The typical reader of the book will be a student of mathematics, physics, chemistry, or other natural science, and many of the examples will be familiar to these readers. However, the rapidly increasing relevance of data science means that computations and scientific programming will be of interest to a growing group of users. No typical data science tools are presented in this book, but the reader will learn tasks such as reading data from files, simple text processing, and programming with mathematics and floating point computations. These are all fundamental building blocks of any data science application, and they are essential to know before diving into more advanced and specialized tools."
"No prior knowledge of programming is needed to read this book. We start with some very simple examples to get started with programming and then move on to introduce fundamental programming concepts such as loops, functions, if-tests, lists, and classes. These generic concepts are supplemented by more specific and practical tools for scientific programming, primarily plotting and array-based computations. The book’s overall purpose is to introduce the reader to programming and, in particular, to demonstrate how programming can be an extremely useful and powerful tool in many branches of the natural sciences."
...
###
Python is a very useful tool for prototyping software, and in many cases it can even be used as the final software product. Python has a huge set of modules enabling literally everyone to start programming quickly and efficiently. Everyone who wants to learn Python quickly from scratch will definitely benefit from this extraordinary introduction to scientific computing in Python!
 
Jiří Kroc
added an update
Andrew Adamatzky
Philosophical Transactions of The Royal Society B Biological Sciences 374(1774) (June 2019)
DOI: 10.1098/rstb.2018.0372
Abstract
A substrate does not have to be solid to compute. It is possible to make a computer purely from a liquid. I demonstrate this using a variety of experimental prototypes where a liquid carries signals, actuates mechanical computing devices and hosts chemical reactions. We show hydraulic mathematical machines that compute functions based on mass transfer analogies. I discuss several prototypes of computing devices that employ fluid flows and jets. They are fluid mappers, where the fluid flow explores a geometrically constrained space to find an optimal way around, e.g. the shortest path in a maze, and fluid logic devices where fluid jet streams interact at the junctions of inlets and results of the computation are represented by fluid jets at selected outlets. Fluid mappers and fluidic logic devices compute continuously valued functions albeit discretized. There is also an opportunity to do discrete operation directly by representing information by droplets and liquid marbles (droplets coated by hydrophobic powder). There, computation is implemented at the sites, in time and space, where droplets collide one with another. The liquid computers mentioned above use liquid as signal carrier or actuator: the exact nature of the liquid is not that important. What is inside the liquid becomes crucial when reaction–diffusion liquid-phase computing devices come into play: there, the liquid hosts families of chemical species that interact with each other in a massive-parallel fashion. I shall illustrate a range of computational tasks, including computational geometry, implementable by excitation wave fronts in nonlinear active chemical medium. The overview will enable scientists and engineers to understand how vast is the variety of liquid computers and will inspire them to design their own experimental laboratory prototypes. This article is part of the theme issue ‘Liquid brains, solid brains: How distributed cognitive architectures process information’.
###
From the Discussion of the paper:
Several contributions to this special issue have demonstrated that a creature does not need a nervous system, in its classical sense, to fuse sensorial inputs, process information and make a decision. The most exciting examples include learning by the slime mould, morphogenetic positional computing, distributed information processing in plant organs, and neuronal-like activity of bacterial biofilms. In the present paper, I show that a substrate does not need to be alive or possess any particular electrical or biochemical properties to compute: even a liquid, which you would be unable to keep in your hands, can perform sophisticated computing circuits.
 
Jiří Kroc
added a research item
This open-source program serves as an introduction to complex systems modeling. The GoL was developed by John Conway in 1970, and it gained huge attention from both researchers and mathematics enthusiasts. Such programs, written in higher-level programming languages, can easily exceed thousands of lines of code. This program has about one hundred lines, within which it defines the updating rule, visualizes the lattice, and even saves the animation. Everyone who is starting to model and understand complex systems will benefit from the simplicity and clarity of the program. The code contains many comments in order to enable newcomers to follow all its functions in detail and to modify them according to their own preferences and needs. The chosen approach allows non-specialists, along with all those who just want to passively understand the methodology, to dive into this non-sequential way of thinking quite easily. Cellular automata modeling is very tightly related to massively parallel thinking, which is a bit tricky for newcomers. All that must be understood about these models is that the surrounding world (the CA lattice) is observed and interacted with from each cell independently. This means that the rule of thumb of all CA modeling is to update all variables within the updated cell centripetally, and not to change any values outside it. Another important feature of CA modeling is the necessity to work with an old and a new layer of the updated lattice, which are periodically switched. This allows us to overcome the otherwise unsolvable problem of updating all cells sequentially within just one lattice: cells would be updated one by one, and cells containing new and old data would become mixed up in the same lattice. This is really something that we do not want in any CA model in the first place.
Each initial configuration must be inserted manually at the beginning of the Python program: coordinates define the live cells, which have a value equal to one. Complicated configurations can be quite difficult to insert, because loading the initial configuration from a file was sacrificed in order to keep the program simple and easy to understand. Literature and all necessary programming resources are listed in the README file. This program is designed as a methodical one. When the program is used, it must be cited as stated in the 'README-GoL.txt' file along with one of the recommended optional papers.
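The two-layer updating scheme described above can be condensed into a short sketch (a minimal illustration in plain Python with periodic boundaries, not the project's actual program):

```python
def step(old):
    """One 'Game of Life' update using two lattice layers: read only
    from `old`, write only into `new`, then hand `new` back (the swap).
    Boundaries are periodic; neighbours form the Moore neighbourhood."""
    rows, cols = len(old), len(old[0])
    new = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            # centripetal update: gather the 8 neighbours of (r, c)
            # from the OLD layer only; never touch other cells of `new`
            n = sum(old[(r + dr) % rows][(c + dc) % cols]
                    for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                    if (dr, dc) != (0, 0))
            new[r][c] = 1 if n == 3 or (n == 2 and old[r][c] == 1) else 0
    return new

# A 2x2 block is a still life: the update maps it onto itself.
block = [[0, 0, 0, 0],
         [0, 1, 1, 0],
         [0, 1, 1, 0],
         [0, 0, 0, 0]]
```

Reading only from `old` and writing only into `new` guarantees that every cell sees the same, consistent previous generation, which is exactly the rule of thumb stated above.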
Jiří Kroc
added an update
C. C. Wood
Philosophical Transactions of the Royal Society B 374 (2019) 20180380
DOI: 10.1098/rstb.2018.0380
Abstract:
The goal of this article is to call attention to, and to express caution about, the extensive use of computation as an explanatory concept in contemporary biology. Inspired by Dennett’s ‘intentional stance’ in the philosophy of mind, I suggest that a ‘computational stance’ can be a productive approach to evaluating the value of computational concepts in biology. Such an approach allows the value of computational ideas to be assessed without being diverted by arguments about whether a particular biological system is ‘actually computing’ or not. Because there is sufficient difference of agreement among computer scientists about the essential elements that constitute computation, any doctrinaire position about the application of computational ideas seems misguided. Closely related to the concept of computation is the concept of information processing. Indeed, some influential computer scientists contend that there is no fundamental difference between the two concepts. I will argue that despite the lack of widely accepted, general definitions of information processing and computation: (1) information processing and computation are not fully equivalent and there is value in maintaining a distinction between them and (2) that such value is particularly evident in applications of information processing and computation to biology.
This article is part of the theme issue ‘Liquid brains, solid brains: How distributed cognitive architectures process information’.
###
Computational biology (including bioinformatics) generally refers to the use of computational techniques in service of various branches of biological science, to ‘the understanding and modeling of the structures and processes of life’ (https://www.britannica.com/science/computational-biology) and to the ‘develop[ment of] algorithms or models to understand biological systems and relationships’ (https://en.wikipedia.org/wiki/Computational_biology). Bioinformatics focuses on the development and application of large-scale databases of biological information, in particular databases in molecular biology where the approach developed with protein and genetic data [4].
 
Jiří Kroc
added an update
Svein Linge & Hans Petter Langtangen
Springer 2020 - Open Access
Part of the Texts in Computational Science and Engineering book series (TCSE, volume 15)
Introduction
This book is published open access under a CC BY 4.0 license.
This book presents computer programming as a key method for solving mathematical problems. This second edition of the well-received book has been extensively revised: All code is now written in Python version 3.6 (no longer version 2.7). In addition, the first two chapters of the previous edition have been extended and split up into five new chapters, thus expanding the introduction to programming from 50 to 150 pages. Throughout the book, the explanations provided are now more detailed, previous examples have been modified, and new sections, examples and exercises have been added. Also, a number of small errors have been corrected.
The book was inspired by the Springer book TCSE 6: A Primer on Scientific Programming with Python (by Langtangen), but the style employed is more accessible and concise, in keeping with the needs of engineering students.
The book outlines the shortest possible path from no previous experience with programming to a set of skills that allows students to write simple programs for solving common mathematical problems with numerical methods in the context of engineering and science courses. The emphasis is on generic algorithms, clean program design, the use of functions, and automatic tests for verification.
Keywords
programming, Python, verification, numerical methods, differential equations
What it provides:
Easy-to-read text offering a gentle introduction to the necessary mathematics and computer science concepts.
Focuses on explaining all details of how to construct programs to solve mathematical problems.
Further emphasizes verification procedures and how to embed them in automatic test frameworks.
###
Scientific computing has gained increasing importance in all scientific fields over the past several decades. Python became one of the workhorses of this development. Python is easier to learn than C/C++ and has a huge number of modules extending its capabilities.
This open access book provides a very useful recipe for designing your own software by demonstrating Python programs on real scientific applications. The book is highly recommended to everyone who is serious about Python programming and wants to write reliable and precise Python software. More books like this one will follow in this project (stay tuned and follow the project if interested).
Good luck with your own programming!
 
Jiří Kroc
added an update
Richard Berlin & Russell Gruen & James Best
Journal of Healthcare Informatics Research (2017) 1:119–137
DOI 10.1007/s41666-017-0002-9
Abstract
This paper presents a brief history of Systems Theory, progresses to Systems Biology, and its relation to the more traditional investigative method of reductionism. The emergence of Systems Medicine represents the application of Systems Biology to disease and clinical issues. The challenges faced by this transition from Systems Biology to Systems Medicine are explained; the requirements of physicians at the bedside, caring for patients, as well as the place of human-human interaction and the needs of the patients are addressed. An organ-focused transition to Systems Medicine, rather than a genomic-, molecular-, or cell-based effort is emphasized. Organ focus represents a middle-out approach to ease this transition and to maximize the benefits of scientific discovery and clinical application. This method manages the perceptions of time and space, the massive amounts of human- and patient-related data, and the ensuing complexity of information.
***
A highly important review of Systems Medicine that points towards the very probable future development of biomedical research and clinical practice. Written by a team of computer scientists and medical doctors. Recommended to everyone who is interested in the near and far future of medicine.
Specialists from both medicine and computer science will find parts of the text that will help them dive deeply into this uneasy subject, which deals with the explosion of data availability in medicine and our inability to generalize over it.
 
Jiří Kroc
added an update
Niklas Boers, Jürgen Kurths, and Norbert Marwan
Journal of Physics: Complexity 2 011001 (8 April 2021)
Abstract
Complex systems can, to a first approximation, be characterized by the fact that their dynamics emerging at the macroscopic level cannot be easily explained from the microscopic dynamics of the individual constituents of the system. This property of complex systems can be identified in virtually all natural systems surrounding us, but also in many social, economic, and technological systems. The defining characteristics of complex systems imply that their dynamics can often only be captured from the analysis of simulated or observed data. Here, we summarize recent advances in nonlinear data analysis of both simulated and real-world complex systems, with a focus on recurrence analysis for the investigation of individual or small sets of time series, and complex networks for the analysis of possibly very large, spatiotemporal datasets. We review and explain the recent success of these two key concepts of complexity science with an emphasis on applications for the analysis of geoscientific and in particular (palaeo-) climate data. In particular, we present several prominent examples where challenging problems in Earth system and climate science have been successfully addressed using recurrence analysis and complex networks. We outline several open questions for future lines of research in the direction of data-based complex system analysis, again with a focus on applications in the Earth sciences, and suggest possible combinations with suitable machine learning approaches. Beyond Earth system analysis, these methods have proven valuable also in many other scientific disciplines, such as neuroscience, physiology, epidemics, or engineering.
***
Those who want to understand the techniques applied in the analysis of both experimental and simulated complex systems will definitely benefit from seeing them explained on geoscientific data. Recurrence analysis and complex networks are among the very useful methods that are good to have in the research toolkit.
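The recurrence analysis reviewed in the paper starts from the recurrence matrix R[i, j] = 1 iff states at times i and j are closer than a threshold. A minimal sketch follows; the Euclidean norm, the threshold value, and the absence of time-delay embedding are simplifying assumptions of this illustration, not choices made by the paper.

```python
import numpy as np

def recurrence_matrix(series, eps):
    """Binary recurrence matrix: R[i, j] = 1 iff the states at times
    i and j lie closer than `eps` (Euclidean distance, no embedding)."""
    x = np.asarray(series, dtype=float)
    if x.ndim == 1:                      # a scalar series -> column vector
        x = x[:, None]
    d = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)
    return (d < eps).astype(int)

# A periodic signal recurs with its own period: lines parallel to the
# main diagonal of R appear at multiples of 25 steps.
t = np.arange(100)
R = recurrence_matrix(np.sin(2 * np.pi * t / 25), eps=0.1)
```

Plotting `R` as an image reveals the diagonal line structures whose statistics (recurrence quantification analysis) the reviewed methods build upon.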
 
Jiří Kroc
added an update
Arieh Ben-Naim
Entropy 21(12):1170 (November 2019)
DOI: 10.3390/e21121170
Abstract
This article is about the profound misuses, misunderstanding, misinterpretations and misapplications of entropy, the Second Law of Thermodynamics and Information Theory. It is the story of the “Greatest Blunder Ever in the History of Science”. It is not about a single blunder admitted by a single person (e.g., Albert Einstein allegedly said in connection with the cosmological constant, that this was his greatest blunder), but rather a blunder of gargantuan proportions whose claws have permeated all branches of science; from thermodynamics, cosmology, biology, psychology, sociology and much more.
***
Only by going to the roots of the definition of entropy can scientists recognize its importance, the scope of its validity, and when it is misused. This paper is very helpful in elucidating the above-mentioned issues using clear, easy-to-follow reasoning that is accessible even to non-specialists.
 
Jiří Kroc
added an update
Johannes Lehmann, Jose Pereira da Silva Jr., Christoph Steiner, Thomas Nehls, Wolfgang Zech, and Bruno Glaser
Plant and Soil 249: 343–357, 2003.
Abstract
Soil fertility and leaching losses of nutrients were compared between a Fimic Anthrosol and a Xanthic Ferralsol from Central Amazônia. The Anthrosol was a relict soil from pre-Columbian settlements with high organic C containing large proportions of black carbon. It was further tested whether charcoal additions among other organic and inorganic applications could produce similarly fertile soils as these archaeological Anthrosols. In the first experiment, cowpea (Vigna unguiculata (L.) Walp.) was planted in pots, while in the second experiment lysimeters were used to quantify water and nutrient leaching from soil cropped to rice (Oryza sativa L.). The Anthrosol showed significantly higher P, Ca, Mn, and Zn availability than the Ferralsol increasing biomass production of both cowpea and rice by 38–45% without fertilization (P < 0.05). The soil N contents were also higher in the Anthrosol but the wide C-to-N ratios due to high soil C contents led to immobilization of N. Despite the generally high nutrient availability, nutrient leaching was minimal in the Anthrosol, providing an explanation for their sustainable fertility. However, when inorganic nutrients were applied to the Anthrosol, nutrient leaching exceeded the one found in the fertilized Ferralsol. Charcoal additions significantly increased plant growth and nutrition. While N availability in the Ferralsol decreased similar to the Anthrosol, uptake of P, K, Ca, Zn, and Cu by the plants increased with higher charcoal additions. Leaching of applied fertilizer N was significantly reduced by charcoal, and Ca and Mg leaching was delayed. In both the Ferralsol with added charcoal and the Anthrosol, nutrient availability was elevated with the exception of N while nutrient leaching was comparatively low.
***
Local, sustainable agriculture can benefit from the information covered in this paper.
From the text:
"The high fertility of the prehistoric Anthrosols may have been more related to nutrient release from successively available soil pools than high ion contents at exchange sites. Without fertilization, the leachate in the Anthrosols had extremely low concentrations of nutrients while nutrient availability was high compared to the Ferralsol. Low leaching at high nutrient availability ensures sustainable soil fertility. These results coincide with observations made by Petersen et al. (2001) who found Anthrosols in Western Amazônia which have been under continuous cultivation without fertilization for 40 years."
"The demonstrated properties of the relict anthropogenic soils have important implications for soil management of Ferralsols indicating that organic applications can be used for sustainable crop production under humid tropical conditions. On the other hand, these results also imply that such Anthrosols should not be fertilized with inorganic nutrients but organic applications be continued."
 
Jiří Kroc
added an update
Research Topic in Front. Astron. Space Sci.
About this Research Topic
The subject of how stars and planets form is one of the most fundamental outstanding questions in astronomy. Many theories have been proposed to explain the various processes involved. One of the key unanswered aspects of this whole question is exactly what role magnetic fields play in the overall process. It has been known for many years that magnetic fields exist in the interstellar medium, although their role is hotly debated.
Some theories have magnetic fields as the key agents of evolution, whilst other theories ignore magnetic fields altogether, as being only a minor perturbation on an otherwise turbulent picture. However, the recent advent of new telescopes that are capable of measuring inter-stellar magnetic fields, with previously unheard-of sensitivity and resolution, such as ALMA, NOEMA, SMA, and new instruments on existing telescopes such as JCMT, Nobeyama and IRAM, has meant that it is now possible to revisit this question with fresh eyes, based on new data. In addition, the huge increase in power of High Performance Computers (HPCs) means that the current generation of theories can include more details of more aspects of astrophysics than ever before.
In this Research Topic we aim to revisit the question of the role of magnetic fields in the star formation process, and bring together the latest observations with the latest theories to see what progress can now be made in answering this question.
Keywords: star formation, interstellar medium, magnetic fields
***
The universe is full of energetic fields that shape it more than we previously thought. Gravity is not the only force acting in the Universe. The topics cover the latest observations and the understanding achieved in this exciting area of research.
 
Jiří Kroc
added an update
Have you ever faced the need to describe the behavior of a natural phenomenon that obviously requires a massively parallel formulation? You have had no chance, so far, to go deeply into all the caveats of this discipline. It does not matter.
You can substantially speed up your learning curve with these two papers.
The first one describes how cellular automata models, as one of many possible massively parallel descriptions, are actually built, demonstrating it on examples from physics and chemical reactions.
The other one reviews all major massively parallel descriptions in use at this time.
 
Jiří Kroc
added a research item
Cellular automaton models of complex systems (CSs) are gaining greater popularity; simultaneously, they have proven capable of solving real scientific and engineering applications. To enable everybody a quick penetration into the core of this type of modeling, three real applications of cellular automaton models, including selected open-source software codes, are studied: laser dynamics, dynamic recrystallization (DRX), and surface catalytic reactions. The paper is written in a way that enables any researcher to reach cutting-edge knowledge of the design principles of cellular automata (CA) models of the observed phenomena in any scientific field. The whole sequence of design steps is demonstrated: definition of the model using the topology and local (transition) rule of a cellular automaton, achieved results, comparison to real experiments, calibration, pathological observations, flow diagrams, software, and discussions. Additionally, the whole paper demonstrates the extreme expressiveness and flexibility of massively parallel computational approaches compared to other computational approaches. The paper consists of introductory parts explaining CSs, self-organization and emergence, entropy, and CA. This allows readers to realize that there is a large variability in definitions and solutions of this class of models. >>> <<< (Remark: open-source software included.)
Jiří Kroc
added an update
Are you modeling the development of biological structures? Do you want to know more about how to design such models? You can download open-source software that simulates pattern formation within epithelial structures.
Those who study all kinds of pattern formation will benefit from knowing this and similar types of software.
Definitely, this software will be very helpful to all who have just entered the field and want to know more about it.
 
Jiří Kroc
added an update
During my exploration of the area of complex systems modeling, I decided to publish the review named "Complex Systems and Their Use in Medicine: Concepts, Methods and Bio-Medical Applications" in cooperation with Karel Balihar, who advised me on how to design the paper so that non-specialists, the leading medical professionals who decide the future course of biomedical research, can understand the very core of complex systems modeling.
We were forced to rewrite the original paper, which used overly specialized mathematical approaches to explain terms that are impenetrable to biomedical researchers. We decided, on the advice of Martin Matejovic, to distill all the mathematics into simple basics and provide a set of vital examples carefully selected from the biomedical applications of complex systems.
It is up to the readers to decide whether we succeeded in our goal or not.
---
Because the paper is funded from public resources, I decided to publish it as a preprint prior to its publication. Details are provided in the comments of the publication.
 
Jiří Kroc
added a research item
This review is aimed mainly at professionals from the fields of clinical medicine and biomedical and experimental research. It aims to deliver a quick starting overview and basic understanding of Complex Systems (CSs), with a citation apparatus enabling readers to efficiently reach cutting-edge knowledge and applications. The paper has two main objectives. First, it builds the core information about CSs, explained using a carefully selected example called the "Game of Life", which expresses self-organization and emergence. The second and most important objective is to provide a wide list of CS computational methods, enabling everybody to achieve a basic overview of the major methods applied in experimental and clinical medicine. These methods are easy to implement in any field of interest.
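The "Game of Life" example used in the review can be reproduced in a few lines of plain Python. The sketch below implements the standard Conway rules on a finite grid (cells outside the border count as dead); it is an independent illustration, not the GoL-N24 software mentioned in this project.

```python
# Conway's Game of Life on a finite grid (dead cells outside the border).
# A live cell survives with 2 or 3 live neighbours; a dead cell is born
# when it has exactly 3 live neighbours.

def life_step(grid):
    """One synchronous update of the Game of Life."""
    rows, cols = len(grid), len(grid[0])
    nxt = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            live = sum(grid[r + dr][c + dc]
                       for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                       if (dr or dc)
                       and 0 <= r + dr < rows and 0 <= c + dc < cols)
            nxt[r][c] = 1 if live == 3 or (grid[r][c] and live == 2) else 0
    return nxt

if __name__ == "__main__":
    # A "blinker": a vertical triple that oscillates with a horizontal one.
    g = [[0, 1, 0],
         [0, 1, 0],
         [0, 1, 0]]
    print(life_step(g))
```

Despite the extreme simplicity of the local rule, iterating it produces gliders, oscillators, and even universal computation; this is the self-organization and emergence the review builds its argument on.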
Jiří Kroc
added 10 research items
This paper continues the previous research focused on two simple questions. The first one reads: "What is the influence of the anisotropy of the computational lattice on simulations of boundary movement?", where grain boundary movement typically appears in simulations of grain boundary migration and static/dynamic recrystallization. The second question reads: "How is the computational anisotropy related to the natural anisotropy of the material lattice itself?" This study focuses on the influence of a change of the computational algorithm and/or lattice on the grain boundary movement. Two algorithms, the majority rule and a simple modification of the Monte Carlo method, are used on two different lattices, namely the square and the hexagonal one.
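As a hedged illustration of the majority rule mentioned above: the sketch below updates each cell of a square lattice to the most frequent grain orientation in its Moore neighbourhood. The neighbourhood choice and the deterministic tie-breaking are assumptions of this sketch and may differ from the paper's algorithm.

```python
from collections import Counter

# Majority-rule update on a square lattice: each cell adopts the most
# frequent grain orientation in its Moore neighbourhood (itself included).
# Ties are broken deterministically by the smallest orientation index;
# the paper's tie-breaking may differ (an assumption of this sketch).

def majority_step(grid):
    """One synchronous majority-rule update of an orientation map."""
    rows, cols = len(grid), len(grid[0])
    nxt = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            counts = Counter(
                grid[r + dr][c + dc]
                for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                if 0 <= r + dr < rows and 0 <= c + dc < cols)
            # Highest count wins; on a tie, the smaller orientation index.
            best = max(counts.items(), key=lambda kv: (kv[1], -kv[0]))
            nxt[r][c] = best[0]
    return nxt
```

Iterating this rule from a random orientation map coarsens it into compact grains; on a square lattice the boundaries tend to align with lattice directions, which is precisely the computational anisotropy the paper investigates.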
It is well known that many macroscopic properties of materials undergoing dynamic recrystallization are determined by their microstructure. Despite a large experimental and theoretical effort, it is not possible to predict the evolution of microstructure with sufficient precision. The reason is that we are dealing with a complex system whose basic components are simple but where many such simple components can act synergistically. Unfortunately, analytical solutions are not available at the present time. Therefore, we look for a method that can fill the gap between the macroscopic and atomistic scales. Cellular automata theory, used in this work, provides such a method. Basically, this method maps the microstructure onto a discrete space and defines local interactions between the elements of this discrete space. The aim of this work is to formulate a cellular automaton model of dynamic recrystallization with special attention to so-called critical experiments. We start with an overview of experiments on static and dynamic recrystallization. Some classic models and most of the cellular automata models of static and dynamic recrystallization are reviewed as well. A new cellular automaton model with probabilistic grain boundary movement is proposed. The influence of the initial mean grain size, the evolution law of dislocation density, and the nucleation rate on deformation curves is studied. Attention is focused on the explanation of several critical experiments, e.g. the constant strain rate test and the strain rate change test. It is recognized that the evolution of the mean grain size represents information complementary to the evolution of stress. A strong dependence of deformation curves on the initial mean grain size and nucleation rate is detected. Critical experiments such as the strain rate change test and the constant strain rate test are reproduced. The model shows a weak sensitivity of the material to the type of evolution law of dislocation density.
A globular grain shape was obtained through probabilistic movement of grain boundary segments. This work shows that a deeper study of every particular process can bring substantial insight into recrystallization behaviour and related processes such as grain growth. A tentative list of future research aims in the field of dynamic recrystallization modelled by cellular automata is presented at the end of this work.
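The interplay between the dislocation-density evolution law and the stress response described above can be sketched with a standard Kocks-Mecking-type law combined with the Taylor relation. All constants below are illustrative assumptions, not the calibrated values used in the thesis.

```python
import math

# Sketch of dislocation-density evolution under a constant strain rate:
#   d(rho)/d(eps) = K1*sqrt(rho) - K2*rho   (hardening minus recovery),
# with the flow stress given by the Taylor relation
#   sigma = ALPHA * MU * B * sqrt(rho).
# All constants below are illustrative assumptions, not calibrated values.

K1, K2 = 1.0e8, 10.0                   # hardening / recovery coefficients
ALPHA, MU, B = 0.5, 4.0e10, 2.5e-10    # Taylor constant, shear modulus [Pa],
                                       # Burgers vector magnitude [m]

def evolve_rho(rho, d_eps):
    """One explicit (Euler) strain step of the Kocks-Mecking law."""
    return rho + (K1 * math.sqrt(rho) - K2 * rho) * d_eps

def taylor_stress(rho):
    """Flow stress [Pa] from the Taylor relation."""
    return ALPHA * MU * B * math.sqrt(rho)

if __name__ == "__main__":
    rho, d_eps = 1.0e12, 1.0e-3        # initial density [1/m^2], strain step
    for _ in range(2000):              # total strain of 2.0
        rho = evolve_rho(rho, d_eps)
    # rho saturates near the fixed point (K1/K2)^2 = 1.0e14 1/m^2.
    print(rho, taylor_stress(rho))
```

With these constants the density saturates at (K1/K2)^2, so the flow stress approaches a steady-state plateau, mimicking the saturation part of a deformation curve; the DRX model adds nucleation and grain boundary migration on top of such a local law.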
Properties of a model performing dynamic mesh partitioning into computationally equivalent mesh parts on regular square lattices using the diffusion controlled cellular automaton (DCCA) are studied. Every processor is assigned a domain seed at the beginning of each simulation; seeds grow into competing domains with migrating boundaries. Computations with different numbers of seeds and/or different boundary topologies are compared. Additionally, it is demonstrated that new seeds can be inserted dynamically during a simulation, which mimics the dynamic addition of new processors of a supercomputer operating on a given mesh. Solutions are dynamically stable configurations with equal domain sizes.
Jiří Kroc
added a research item
A model performing mesh partitioning into computationally equivalent mesh parts on regular square lattices using a diffusion controlled cellular automaton (DCCA) is proposed and studied in this article. Every processor is assigned a domain seed at the beginning of a simulation. The algorithm works through the growth of seeds and the migration of domain borders; the latter is triggered when the difference of diffusive agents on the two sides of a border exceeds a given threshold. The model is built using self-organization principles that ensure convergence. Solutions are dynamically stable configurations reached from any initial configuration.
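A heavily simplified, hypothetical sketch of the balancing idea above: seeds grow into domains, and a boundary cell defects to a neighbouring domain when the size difference exceeds a threshold. The real DCCA drives this decision with a locally diffusing agent; the global size counts used here are a shortcut assumed only for brevity.

```python
from collections import Counter

# Simplified domain balancing on a square lattice. 0 marks unassigned cells.
# Growth phase: an unassigned cell joins any assigned 4-neighbour.
# Balancing phase: a boundary cell moves to a smaller neighbouring domain
# when the size difference exceeds THRESHOLD (a stand-in for the
# diffusive-agent test of the real DCCA).

THRESHOLD = 1

def dcca_step(grid):
    """One synchronous growth-and-balance update of the domain map."""
    rows, cols = len(grid), len(grid[0])
    sizes = Counter(v for row in grid for v in row if v)
    nxt = [row[:] for row in grid]
    for r in range(rows):
        for c in range(cols):
            nbrs = [grid[r + dr][c + dc]
                    for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1))
                    if 0 <= r + dr < rows and 0 <= c + dc < cols]
            if grid[r][c] == 0:                      # growth phase
                assigned = [v for v in nbrs if v]
                if assigned:
                    nxt[r][c] = assigned[0]
            else:                                    # balancing phase
                for v in nbrs:
                    if v and v != grid[r][c] and \
                       sizes[grid[r][c]] - sizes[v] > THRESHOLD:
                        nxt[r][c] = v
                        break
    return nxt
```

Iterated from a few seeds on an empty lattice, the rule first floods the lattice and then keeps nibbling at the borders of oversized domains until the partition sizes equalize, which is the self-organized load balancing the paper aims at.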
Jiří Kroc
added an update
BACHELOR THESIS of Jakub Tkac:
"Design and Implementation of Cellular Automaton Simulating Dynamic Recrystallization [in czech]"
Abstract
The aim of this thesis is to design, implement, and visualize a cellular automaton simulating dynamic recrystallization. The implementation of the algorithm was based on a scientific publication, a doctoral thesis, and existing code in CellLang. The created software serves as a springboard for programming complex systems using a suitable example. The graphical user interface is implemented in C++ using the Qt framework. The resulting application displays the microstructure and visualizes the calculated results for the user: the mechanical stress curve, the mean grain size curve, and the dislocation density. The simulation results can be stored. It is possible to load an initial microstructure or to generate one in the program, based on defined parameters, by pseudo-static recrystallization. Optional parameters guarantee the simulation of material deformed under different conditions. The program package ( https://www.researchgate.net/publication/316989956_Cellular_Automaton_Simulation_of_Dynamic_Recrystallization_Introduction_into_Self-Organization_and_Emergence ) also includes a Programmer’s manual and a User’s manual in English. You can also watch the video where the program is presented ( https://www.researchgate.net/publication/317013011_Self-Organization_Video_Sequence_Depicting_Numerical_Experiments_with_Cellular_Automaton_Model_of_Dynamic_Recrystallization_with_source-code_link ). Link to the publication: ( https://www.researchgate.net/publication/225670793_Application_of_Cellular_Automata_Simulations_to_Modeling_of_Dynamic_Recrystallization ) Link to the doctoral thesis: ( https://www.researchgate.net/publication/311733496_Simulation_of_Dynamic_Recrystallization_by_Cellular_Automata ) Design and Implementation of Cellular Automaton Simulating Dynamic Recrystallization [in Czech]. Available from: https://www.researchgate.net/publication/317236033_Design_and_Implementation_of_Cellular_Automaton_Simulating_Dynamic_Recrystallization_in_czech [accessed May 31, 2017].
 
Jiří Kroc
added a research item
This video presents numerical experiments done with the software "Cellular Automaton Simulation of Dynamic Recrystallization: Introduction" (https://www.researchgate.net/publication/316989956_Cellular_Automaton_Simulation_of_Dynamic_Recrystallization_Introduction). This software was developed by Jakub Tkac according to the original software programmed in the language Cellular/Cellang (by Jiri Kroc), which is no longer maintained. This led to the decision to write a new version of the software for all who would like to study it. Everybody is welcome to study the provided software and use it for research and non-commercial purposes. You should cite this software (above) and the original research listed below. * The Ph.D. thesis of the original research: https://www.researchgate.net/publication/311733496_Simulation_of_Dynamic_Recrystallization_by_Cellular_Automata * The paper containing the main results from the Ph.D. thesis: https://www.researchgate.net/publication/225670793_Application_of_Cellular_Automata_Simulations_to_Modeling_of_Dynamic_Recrystallization * [Link to the identical video on YouTube: https://youtu.be/_v0tQnbLW-4 ] *** Link to the Bachelor Thesis of Jakub Tkac: https://www.researchgate.net/publication/317236033_Design_and_Implementation_of_Cellular_Automaton_Simulating_Dynamic_Recrystallization_in_czech
Jiří Kroc
added a research item
The purposes of this software are manifold: (i) It serves as an introduction to the modeling of dynamic recrystallization by cellular automata, and to complex systems modeling by cellular automata in general. (ii) There is a strong demand for open-source software that can serve as an example for similar software that might be developed in various scientific fields. (iii) Everybody is welcome to develop non-commercial applications in various scientific fields using this template. (iv) The software was originally developed by JK during his Ph.D. studies using software called Cellular/Cellang, which is no longer maintained. (v) JT developed the software according to best software-engineering practice. (vi) It and its descendants can be used to study complex systems modeling in various scientific disciplines, including the biosciences, biology, and medicine. This software was developed during the bachelor studies of JT and published as his bachelor thesis project. PS: When used, the work must be properly cited according to the requirements given in the software.
Remark: An introduction to complex systems modelling by cellular automata can be found at https://www.researchgate.net/publication/225413484_Introduction_to_Modeling_of_Complex_Systems_Using_Cellular_Automata (this provides the necessary background for understanding the software). *** Link to the Bachelor Thesis of Jakub Tkac: https://www.researchgate.net/publication/317236033_Design_and_Implementation_of_Cellular_Automaton_Simulating_Dynamic_Recrystallization_in_czech *** Link to a video depicting a run of the code: https://www.researchgate.net/publication/317013011_Self-Organization_Video_Sequence_Depicting_Numerical_Experiments_with_Cellular_Automaton_Model_of_Dynamic_Recrystallization_with_source-code_link *** Link to a quick physical & computational introduction: https://www.researchgate.net/publication/338019707_BUILDING_EFFICIENT_COMPUTATIONAL_CELLULAR_AUTOMATA_MODELS_OF_COMPLEX_SYSTEMS_BACKGROUND_APPLICATIONS_RESULTS_SOFTWARE_AND_PATHOLOGIES
Jiří Kroc
added 4 research items
The animation represents a simulation generated by the Qt front-end of the program published on the RG page under the name "Design and Implementation of Cellular Automaton Simulating Domain Growth - Source Code" (https://www.researchgate.net/publication/303805779_Design_and_Implementation_of_Cellular_Automaton_Simulating_Domain_Growth_-_Source_Code), which was developed within Martin Muzak's thesis "Design and Implementation of Cellular Automaton Simulating Domain Growth" (https://www.researchgate.net/publication/303549530_Design_and_Implementation_of_Cellular_Automaton_Simulating_Domain_Growth). The thesis is based on the paper "Diffusion Controlled Cellular Automaton Performing Mesh Partitioning" (https://www.researchgate.net/publication/220977420_Diffusion_Controlled_Cellular_Automaton_Performing_Mesh_Partitioning?ev=prf_pub).
The bachelor thesis focuses on the design and implementation of a Qt graphical user interface for a cellular automaton simulating domain decomposition by employing a self-organization principle. The main goal is to enable everybody to achieve a better understanding of complex systems modelling principles, which are explained on an example dealing with domain decomposition, and simultaneously to stimulate their own research motivated by the provided C++ code. The application offers freedom in configuring the simulation and allows each user to define an arbitrary number of initial locations of all domains and their respective growth factors. During the run of each simulation, a user can save the image to a file and control the run of the simulation. The application is free to use or modify under the given conditions. The user’s and programmer’s manuals are located in the appendix of the bachelor thesis. The application is compatible with the Windows and Linux operating systems. The text of the thesis and instructions on how to execute the application are available at https://www.researchgate.net/publication/303549530_Design_and_Implementation_of_Cellular_Automaton_Simulating_Domain_Growth
This is a quick introduction to the topic in animated form. It gives you the impression of a living thing. The original idea for this software arose when the author was thinking about cell growth; similar effects are observed in all living organisms. For details, see the paper, book chapter, and software, which is open under the GPL license. (To view the animated GIF, a specialized picture-viewing tool must be used; otherwise, only a static picture of eight seeds that do not grow is seen.)
Jiří Kroc
added 6 research items
The theory of Complex Systems (CSs) offers new mathematical concepts for the definition, description, and evaluation of biological systems. A number of vital mathematical descriptions that are constantly being developed within CSs increasingly assist biomedical research. The main goal of the presentation is to increase the awareness of medical researchers of novel CS computational methods and their applicability in biology. We can say that modern mathematical research is developing novel computational tools that are comparable to classical hardware devices, such as the microscope or mass and light spectroscopy, and that have no vital alternative. The poster has a continuation in the paper "Complex Systems and Their Use in Medicine: Concepts, Methods and Bio-Medical Applications" [https://www.researchgate.net/publication/330546521_Complex_Systems_and_Their_Use_in_Medicine_Concepts_Methods_and_Bio-Medical_Applications ]. (Remark: More details on complexity can be found in the projects of the author.)
The INTRODUCTION to one of the best-selling books in the field (over 30,000 ebooks sold). Abstract: Since the sixteenth century there have been two main paradigms in the methodology of doing science. The first one is referred to as "the experimental" paradigm. During an experiment we observe, measure, and quantify natural phenomena in order to solve a specific problem, answer a question, or decide whether a hypothesis is true or false. The second paradigm is known as "the theoretical" paradigm. A theory is generally understood as a fundamental, for instance logical and/or mathematical, explanation of an observed natural phenomenon. A theory can be supported or falsified through experimentation. Link to the online version of the book: http://www.springer.com/gp/book/9783642122026
The best seller in the field: over 30,000 ebooks sold as of 2018. From the Back Cover: Deeply rooted in fundamental research in Mathematics and Computer Science, Cellular Automata (CA) are recognized as an intuitive modeling paradigm for Complex Systems. Even very basic CA with extremely simple micro dynamics, such as the Game of Life, show an almost endless display of complex emergent behavior. Conversely, CA can also be designed to produce a desired emergent behavior, using either theoretical methodologies or evolutionary techniques. Meanwhile, beyond the original realm of applications - Physics, Computer Science, and Mathematics - CA have also become workhorses in very different disciplines such as epidemiology, immunology, sociology, and finance. In this context of fast and impressive progress, spurred further by the enormous attraction these topics have for students, this book emerges as a welcome overview of the field for its practitioners, as well as a good starting point for detailed study at the graduate and post-graduate level. The book consists of three parts: two major parts on theory and applications, and a smaller part on software. The theory part contains fundamental chapters on how to design and/or apply CA in many different areas. The applications part provides a number of representative examples of really using CA in a broad range of disciplines; this part will give the reader a good idea of the real strength of this kind of modeling as well as an incentive to apply CA in their own field of study. Download or buy at Springer: http://www.springer.com/gp/book/9783642122026
Jiří Kroc
added a project goal
The first goal is to disseminate knowledge of the core of complex systems, in an easy-to-follow form, among all who would like to know more about it.
The most important goal is to enable everybody to reach the point where they can design complex systems that produce a desired output, which is a very difficult task using the currently available tools. It is necessary to study literally thousands of complex systems (CSs); only then is it possible to reach the edge of all that is known about CSs.
Prior to the publication of a review paper about CSs in biology and medicine, the poster
serves as a good starting point for all who are interested in the backgrounds of CSs in medicine and biology. It is a really exciting area of research, which has a great future and will very probably quickly lead to the development of personalized medicine. The work on it has already begun (see the project "Classification of ECGs and Prediction of Arrhythmias").
Review paper serving as a concise starting point to study