Complexity

Published by Hindawi Publishing Corporation
Online ISSN: 1099-0526
Discipline: Nonlinear and Complex Systems
Aims and scope

The purpose of Complexity is to report important advances in the scientific study of complex systems. Complex systems are characterized by interactions between their components that produce new information, present in neither the initial nor boundary conditions, which limits their predictability. Given the amount of information processing required to study complexity, the use of computers has been central to complex systems research.

This Open Access journal publishes high-quality original research, as well as rigorous review articles, across a broad range of disciplines. Studies can have a theoretical, methodological, or practical focus, but submissions must always make a significant contribution to the study of complex systems.

 


Publications
Article
Time irreversibility (asymmetry with respect to time reversal) is an important property of many time series derived from processes in nature. Some time series (e.g., healthy heart rate dynamics) demonstrate even more complex, multiscale irreversibility, such that not only the original but also coarse-grained time series are asymmetric over a wide range of scales. Several indices to quantify multiscale asymmetry have been introduced. However, there has been no simple generator of model time series with "tunable" multiscale asymmetry to test such indices. We introduce an asymmetric Weierstrass function W(A) (constructed from asymmetric sawtooth functions instead of cosine waves) that can be used to construct time series with any given value of the multiscale asymmetry. We show that multiscale asymmetry appears to be independent of other multiscale complexity indices, such as fractal dimension and multiscale entropy. We further generalize the concept of multiscale asymmetry by introducing time-dependent (local) multiscale asymmetry and provide examples of such time series. The W(A) function combines two essential features of complex fluctuations, namely fractality (self-similarity) and irreversibility (multiscale time asymmetry); moreover, each of these features can be tuned independently. The proposed family of functions can be used to compare and refine multiscale measures of time series asymmetry.
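The construction lends itself to a quick sketch. Below is a minimal illustration (not the authors' code) of a Weierstrass-type sum in which SciPy's asymmetric sawtooth stands in for the cosine; the values of a, b, the number of terms, and the use of the sawtooth width as the asymmetry parameter A are our assumptions.

```python
# Minimal sketch, assuming the sawtooth "width" plays the role of the
# asymmetry parameter A in (0, 1); a, b and n_terms are illustrative.
import numpy as np
from scipy.signal import sawtooth

def asymmetric_weierstrass(t, A=0.8, a=0.6, b=3.0, n_terms=20):
    """Weierstrass-type sum of asymmetric sawtooth waves (A=0.5 is symmetric)."""
    w = np.zeros_like(t)
    for n in range(n_terms):
        # sawtooth(x, width=A) rises for a fraction A of each 2*pi period
        w += a**n * sawtooth(b**n * 2 * np.pi * t, width=A)
    return w

t = np.linspace(0.0, 1.0, 10_000)
series = asymmetric_weierstrass(t, A=0.8)   # time-asymmetric fractal series
```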
 
Article
Based on the consideration of Boolean dynamics, it has been hypothesized that cell types may correspond to alternative attractors of a gene regulatory network. Recent stochastic Boolean network analysis, however, raised an important question concerning the stability of such attractors. In this paper, a detailed numerical analysis is performed within the framework of Langevin dynamics. While the present results confirm that noise is indeed an important dynamical element, cell types represented by attractors can still be a viable hypothesis: we find that the stability of an attractor depends on the strength of the noise relative to the system's distance from the bifurcation point, and that an attractor can be exponentially stable for suitable biological parameters.
 
Article
Attempts to understand how information content can be included in an accounting of the energy flux of the biosphere have led to the conclusion that, in information transmission, one component, the semantic content, or "the meaning of the message," adds no thermodynamic burden over and above costs arising from coding, transmission and translation. In biology, semantic content has two major roles. For all life forms, the message of the genotype encoded in DNA specifies the phenotype, and hence the organism that is tested against the real world through the mechanisms of Darwinian evolution. For human beings, communication through language and similar abstractions provides an additional supra-phenotypic vehicle for semantic inheritance, which supports the cultural heritages around which civilizations revolve. The following three postulates provide the basis for discussion of a number of themes that demonstrate some important consequences. (i) Information transmission through either pathway has thermodynamic components associated with data storage and transmission. (ii) The semantic content adds no additional thermodynamic cost. (iii) For all semantic exchange, meaning is accessible only through translation and interpretation, and has a value only in context. (1) For both pathways of semantic inheritance, translational and copying machineries are imperfect. As a consequence both pathways are subject to mutation and to evolutionary pressure by selection. Recognition of semantic content as a common component allows an understanding of the relationship between genes and memes, and a reformulation of Universal Darwinism. (2) The emergent properties of life are dependent on a processing of semantic content. The translational steps allow amplification in complexity through combinatorial possibilities in space and time. Amplification depends on the increased potential for complexity opened by 3D interaction specificity of proteins, and on the selection of useful variants by evolution. The initial interpretational steps include protein synthesis, molecular recognition, and catalytic potential that facilitate structural and functional roles. Combinatorial possibilities are extended through interactions of increasing complexity in the temporal dimension. (3) All living things show a behavior that indicates awareness of time, or chronognosis. The ∼4 billion years of biological evolution have given rise to forms with increasing sophistication in sensory adaptation. This has been linked to the development of an increasing chronognostic range, and an associated increase in combinatorial complexity. (4) Development of a modern human phenotype and the ability to communicate through language, led to the development of archival storage, and invention of the basic skills, institutions and mechanisms that allowed the evolution of modern civilizations. Combinatorial amplification at the supra-phenotypical level arose from the invention of syntax, grammar, numbers, and the subsequent developments of abstraction in writing, algorithms, etc. The translational machineries of the human mind, the "mutation" of ideas therein, and the "conversations" of our social intercourse, have allowed a limited set of symbolic descriptors to evolve into an exponentially expanding semantic heritage. (5) The three postulates above open interesting epistemological questions. 
An understanding of topics such as dualism, the élan vital, the status of hypothesis in science, memetics, the nature of consciousness, the role of semantic processing in the survival of societies, and Popper's three worlds, requires recognition of an insubstantial component. By recognizing a necessary linkage between semantic content and a physical machinery, we can bring these perennial problems into the framework of a realistic philosophy. It is suggested, following Popper, that the ∼4 billion years of evolution of the biosphere represent an exploration of the nature of reality at the physicochemical level, which, together with the conscious extension of this exploration through science and culture, provides a firm epistemological underpinning for such a philosophy.
 
Article
We demonstrate that distributions of human response times have power-law tails and, among closed-form distributions, are best fit by the generalized inverse gamma distribution. We speculate that the task difficulty tracks the half-width of the distribution and show that it is related to the exponent of the power-law tail.
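For readers who want to experiment: SciPy has no generalized-inverse-gamma law built in, but if 1/T follows a generalized gamma distribution then T follows a generalized inverse gamma, so one can fit scipy.stats.gengamma to the reciprocals of the response times. A sketch on synthetic placeholder data (the data and parameter values are assumptions):

```python
# Hedged sketch: fit a generalized inverse gamma by fitting scipy's
# gengamma to the reciprocals 1/T. The "response times" below are fake.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
rt = stats.invgamma(a=3.0, scale=1.5).rvs(size=5000, random_state=rng)

# If 1/T ~ generalized gamma, then T ~ generalized inverse gamma.
a, c, loc, scale = stats.gengamma.fit(1.0 / rt, floc=0)
print(f"gengamma fit of 1/T: a={a:.3f}, c={c:.3f}, scale={scale:.3f}")
```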
 
Article
Using the elementary cellular automaton (ECA) Rule 126, we demonstrate that a chaotic discrete system, when enriched with memory, exhibits complex dynamics, opening up an ample universe of periodic patterns induced from the original information of the ahistorical system. First we analyse the classic ECA Rule 126 to identify its basic characteristics using mean field theory, basins, and de Bruijn diagrams. To derive the complex dynamics, we endow Rule 126 with a kind of memory; from here, interactions between gliders are studied to detect stationary patterns and glider guns, and to simulate specific simple computable functions produced by glider collisions.
 
Article
We examine the complete dataset of baby name popularity collected by the U.S. Social Security Administration over the last 131 years (1880-2010). The ranked baby name popularity can be fitted empirically by a piecewise function consisting of a Beta function for high-ranking names and a power-law function for low-ranking names, but not by a power law (Zipf's law) or a Beta function alone.
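A sketch of this kind of piecewise rank fit is below; the specific beta-like rank-ordering form assumed for the head, the crossover rank, and the synthetic counts are illustrative guesses, not the paper's fitted model.

```python
# Piecewise rank fit: a beta-like form for top ranks, a power law for the tail.
import numpy as np
from scipy.optimize import curve_fit

def beta_like(r, A, a, b, N):
    return A * (N + 1 - r)**b / r**a      # assumed two-exponent head form

def power_law(r, C, alpha):
    return C * r**(-alpha)                # Zipf-like tail

def fit_piecewise(counts, r_c):
    """Fit ranks [1, r_c] with beta_like and ranks > r_c with a power law."""
    N = len(counts)
    r = np.arange(1, N + 1, dtype=float)
    head, tail = r <= r_c, r > r_c
    p_head, _ = curve_fit(lambda r, A, a, b: beta_like(r, A, a, b, N),
                          r[head], counts[head], p0=(counts[0], 0.5, 0.5))
    p_tail, _ = curve_fit(power_law, r[tail], counts[tail], p0=(counts[0], 1.0))
    return p_head, p_tail

N = 1000
r = np.arange(1, N + 1, dtype=float)
counts = beta_like(r, 1e5, 1.0, 0.3, N)   # synthetic stand-in data
p_head, p_tail = fit_piecewise(counts, r_c=100)
print("head (A, a, b):", np.round(p_head, 3), "tail (C, alpha):", np.round(p_tail, 3))
```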
 
Article
Models of continuous opinion dynamics under bounded confidence show a sharp transition between a consensus and a polarization phase at a critical global bound of confidence. In this paper, heterogeneous bounds of confidence are studied. The surprising result is that a society of agents with two different bounds of confidence (open-minded and closed-minded agents) can find consensus even when both bounds are significantly below the critical bound of confidence of a homogeneous society. The phenomenon is shown by examples of agent-based simulation and by numerical computation of the time evolution of the agent density. The result holds for the bounded confidence model of Deffuant, Weisbuch and others (Weisbuch, G. et al.; Meet, discuss, and segregate!, Complexity, 2002, 7, 55-63), as well as for the model of Hegselmann and Krause (Hegselmann, R., Krause, U.; Opinion Dynamics and Bounded Confidence: Models, Analysis and Simulation, Journal of Artificial Societies and Social Simulation, 2002, 5, 2). Thus, given an average level of confidence, diversity of bounds of confidence enhances the chances for consensus. The drawback of this enhancement is that the opinion dynamics becomes susceptible to severe drifts of clusters, where open-minded agents can pull closed-minded agents toward another cluster of closed-minded agents. A final consensus might thus not lie in the center of the opinion interval, as it does for uniform initial opinion distributions under homogeneous bounds of confidence; it can be located at extremal locations, as demonstrated by example. This also shows that the extension to heterogeneous bounds of confidence enriches the complexity of the dynamics tremendously.
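The heterogeneous-bound update is easy to sketch in an agent-based style. In this minimal version each agent moves only when the opinion gap is within its own bound; that choice, the convergence parameter mu, and the mix of open- and closed-minded agents are illustrative assumptions rather than the paper's exact setup.

```python
# Deffuant-style pairwise updates with two confidence bounds (a sketch).
import numpy as np

rng = np.random.default_rng(1)
N, steps, mu = 500, 200_000, 0.5
eps = np.where(rng.random(N) < 0.3, 0.25, 0.10)  # open- vs closed-minded bounds
x = rng.random(N)                                # opinions in [0, 1]

for _ in range(steps):
    i, j = rng.integers(N, size=2)
    gap = x[j] - x[i]
    if abs(gap) < eps[i]:        # i trusts j: i moves toward j
        x[i] += mu * gap
    if abs(gap) < eps[j]:        # j trusts i: j moves toward i
        x[j] -= mu * gap

print("opinion clusters (rounded):", np.unique(np.round(x, 2)))
```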
 
Article
Based on a recent model of evolving viruses competing with an adapting immune system [1], we study the conditions under which a viral quasispecies can maximize its growth rate. The range of mutation rates that allows viruses to thrive is limited from above by genomic information deterioration, and from below by insufficient sequence diversity, which leads to quick eradication of the virus by the immune system. The mutation rate that optimally balances these two requirements depends, to first order, on the ratio of the inverse of the virus's growth rate to the time the immune system needs to develop a specific response to an antigen. We find that a virus is most viable if it generates exactly one mutation within the time it takes the immune system to adapt to a new viral epitope. Experimental viral mutation rates, in particular for HIV (human immunodeficiency virus), suggest that many viruses have achieved their optimal mutation rate. [1] C. Kamp and S. Bornholdt, Phys. Rev. Lett. 88, 068104 (2002).
 
Article
In this article we study the dynamics of coupled oscillators, using mechanical metronomes placed on a rigid base. The base is moved by a motor along one dimension, and its movement is controlled to follow some function of the phases of the metronomes. Because of the motor and the feedback, the phases of the metronomes affect the movement of the base, while the movement of the base in turn affects the phases of the metronomes. For a simple base-movement function (such as $y = \gamma_{x} [r \theta_1 + (1 - r) \theta_2]$, in which $y$ is the velocity of the base, $\gamma_{x}$ is a multiplier, $r$ is a proportion, and $\theta_1$ and $\theta_2$ are the phases of the metronomes), we show the effects on the dynamics of the oscillators. We then study how this function changes in time when its parameters adapt through feedback. Using numerical simulations and experimental tests, we show that the dynamics of the set of oscillators and the base tend to evolve toward a certain region. This region is close to a transition in the dynamics of the oscillators, where more frequencies start to appear in the frequency spectra of the phases of the metronomes.
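A toy caricature of the quoted feedback law can be simulated in a few lines. This is not the paper's metronome model: real metronomes are nonlinear escapement oscillators, whereas here the phases are simple rotators wrapped to keep the feedback bounded, and gamma, r, the coupling k, and the frequencies are all illustrative.

```python
# Toy phase-oscillator caricature of base-velocity feedback
# y = gamma * (r*theta1 + (1-r)*theta2).
import numpy as np

wrap = lambda a: (a + np.pi) % (2 * np.pi) - np.pi   # keep angles bounded

dt, T = 1e-3, 60.0
gamma, r, k = 0.05, 0.5, 1.0
omega = np.array([2 * np.pi * 1.0, 2 * np.pi * 1.1])  # native frequencies
theta = np.array([0.0, 1.0])

for _ in range(int(T / dt)):
    y = gamma * (r * wrap(theta[0]) + (1 - r) * wrap(theta[1]))  # base velocity
    theta = theta + dt * (omega + k * y)                         # Euler step

print("final phase difference:", wrap(theta[0] - theta[1]))
```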
 
Article
We formulate a flexible micro-to-macro kinetic model which is able to explain the emergence of income profiles from a multitude of individual economic interactions. The model is expressed by a system of nonlinear differential equations whose parameters are defined by probabilities. Society is described as an ensemble of individuals divided into income classes; the individuals exchange money through binary and ternary interactions, leaving the total wealth unchanged, with the ternary interactions representing taxation and redistribution effects. The dynamics is investigated through computational simulations, focusing on the effects that different fiscal policies and differently weighted welfare policies have on the long-run income distributions. The model provides a tool which may contribute to identifying the most effective actions towards reducing economic inequality. We find, for instance, that under certain hypotheses the Gini index is more affected by a policy that reduces welfare and subsidies for the rich classes than by an increase of the upper tax rate; such a policy also slightly increases the total tax revenue.
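The micro level of such models is simple to sketch: random binary exchanges that conserve total wealth, with a flat levy on each transfer standing in for the ternary taxation-and-redistribution terms, and a Gini-index readout. All parameters below are illustrative, not the paper's calibration.

```python
# Binary money exchanges with a flat, uniformly redistributed levy (a sketch).
import numpy as np

def gini(w):
    w = np.sort(w)
    n = len(w)
    return (2 * np.arange(1, n + 1) - n - 1) @ w / (n * w.sum())

rng = np.random.default_rng(2)
N, steps, tax = 1000, 500_000, 0.05
w = np.ones(N)                                # equal initial wealth

for _ in range(steps):
    i, j = rng.integers(N, size=2)
    amount = rng.random() * min(w[i], w[j])   # i pays j; wealth is conserved
    levy = tax * amount                       # taxed slice of the transfer
    w[i] -= amount
    w[j] += amount - levy
    w += levy / N                             # uniform redistribution

print("Gini index:", round(gini(w), 3))
```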
 
[Table] Best-fit parameters for the power-law scaling $T(N) = aN^{b}$ for all simulated values of average noise power $P$.
Article
We numerically simulate the effects of noise-induced sampling of alternative Hamiltonian paths on the ability of quantum adiabatic search (QuAdS) to solve randomly generated instances of the NP-complete problem N-bit Exact Cover 3. The noise-averaged median runtime is determined as the noise power and the number of bits N are varied, and power-law and exponential fits are made to the data. Noise is seen to slow down QuAdS, though a downward shift in the scaling exponent is found for N > 12 over a range of noise-power values. We discuss whether this shift might be connected to arguments in the literature suggesting that altering the Hamiltonian path might benefit QuAdS performance.
 
Article
Advancing the state of the art of simulation in the social sciences requires appreciating the unique value of simulation as a third way of doing science, in contrast to both induction and deduction. Simulation can be an effective tool for discovering surprising consequences of simple assumptions. This essay offers advice for doing simulation research, focusing on programming a simulation model, analyzing the results, sharing the results, and replicating other people's simulations. Finally, suggestions are offered for building a community of social scientists who do simulation.
 
Article
Agent-based computational modeling is changing the face of social science. In Generative Social Science, Joshua Epstein argues that this powerful, novel technique permits the social sciences to meet a fundamentally new standard of explanation, in which one "grows" the phenomenon of interest in an artificial society of interacting agents: heterogeneous, boundedly rational actors, represented as mathematical or software objects. After elaborating this notion of generative explanation in a pair of overarching foundational chapters, Epstein illustrates it with examples chosen from such far-flung fields as archaeology, civil conflict, the evolution of norms, epidemiology, retirement economics, spatial games, and organizational adaptation. In elegant chapter preludes, he explains how these widely diverse modeling studies support his sweeping case for generative explanation. This book represents a powerful consolidation of Epstein's interdisciplinary research activities in the decade since the publication of his and Robert Axtell's landmark volume, Growing Artificial Societies. Beautifully illustrated, Generative Social Science includes a CD that contains animated movies of core model runs, and programs allowing users to easily change assumptions and explore models, making it an invaluable text for courses in modeling at all levels.
 
Article
This paper introduces a novel approach to clustering based on group consensus of dynamic linear high-order multi-agent systems. The graph topology is associated with a selected multi-agent system, with each agent corresponding to one vertex. In order to reveal the cluster structure, agents belonging to the same cluster are expected to aggregate together. As a theoretical foundation, a necessary and sufficient condition is given to check group consensus. Two numerical instances illustrate the approach.
 
[Figure] Master and individual utility functions: a master utility function (M = 10, built from a representative set of n = 10 idea utilities) and an individual utility function obtained by adding noise with ξ = 0.2; ideas are indexed along the x-axis by interpreting bit strings as binary integers.
[Figure] Effects of within-group noise (ξ) and group-level bias (β) on the level of convergence and on the true utility of the most supported idea; each dot averages 500 independent simulation runs.
[Figure] Effects of the balance between selection-oriented and variation-oriented behaviors (p) and group-level bias (β) on the same two outcomes; each dot averages 500 independent simulation runs.
[Figure] Effects of group size (N, log scale) and social network topology (random, small-world, or scale-free) on the same two outcomes; each dot averages 500 independent simulation runs.
[Figure] Distributions of utilities of the most supported ideas at the end of simulation for the three network topologies at N = 640; the small-world topology reached the maximal utility value (1.0) most often.
Article
Collective, especially group-based, managerial decision making is crucial in organizations. Using an evolutionary theory approach to collective decision making, agent-based simulations were conducted to investigate how collective decision making would be affected by the agents' diversity in problem understanding and/or behavior in discussion, as well as by their social network structure. Simulation results indicated that groups with consistent problem understanding tended to produce higher utility values of ideas and displayed better decision convergence, but only if there was no group-level bias in collective problem understanding. Simulation results also indicated the importance of balance between selection-oriented (i.e., exploitative) and variation-oriented (i.e., explorative) behaviors in discussion to achieve quality final decisions. Expanding the group size and introducing non-trivial social network structure generally improved the quality of ideas at the cost of decision convergence. Simulations with different social network topologies revealed that collective decision making on small-world networks with high local clustering tended to achieve the highest decision quality more often than on random or scale-free networks. Implications of this evolutionary theory and simulation approach for future managerial research on collective, group, and multi-level decision making are discussed.
 
Article
A long sequence of tosses of a classical coin produces an apparently random bit string, but classical randomness is an illusion: the algorithmic information content of a classically-generated bit string lies almost entirely in the description of initial conditions. This letter presents a simple argument that, by contrast, a sequence of bits produced by tossing a quantum coin is, almost certainly, genuinely (algorithmically) random. This result can be interpreted as a strengthening of Bell's no-hidden-variables theorem, and relies on causality and quantum entanglement in a manner similar to Bell's original argument.
 
Article
This material was presented in a series of lectures at the Santa Fe Institute, the Los Alamos National Laboratory, and the University of New Mexico, during a one-month visit to the Santa Fe Institute, April 1995.
 
Article
Quantum computers use the quantum interference of different computational paths to enhance correct outcomes and suppress erroneous outcomes of computations. In effect, they follow the same logical paradigm as (multi-particle) interferometers. We show how most known quantum algorithms, including quantum algorithms for factorising and counting, may be cast in this manner. Quantum searching is described as inducing a desired relative phase between two eigenvectors to yield constructive interference on the sought elements and destructive interference on the remaining terms.
 
Article
We aim to show that reductionism and emergence play complementary roles in understanding natural processes and in the dynamics of scientific explanation. In particular, we show that the renormalization group, one of the most refined tools of theoretical physics, allows us to understand the importance of emergent processes in Nature by identifying them as universal organization processes, which is to say they are scale independent. We can use the syntaxes of Quantum Field Theory (QFT) and Spontaneous Symmetry Breaking as a trans-disciplinary theoretical scenario for many other forms of complexity, especially biological and cognitive ones.
 
Article
This paper presents a novel analysis and visualization of English Wikipedia data. Our specific interest is the analysis of basic statistics, the identification of the semantic structure and age of the categories in this free online encyclopedia, and the content coverage of its highly productive authors. The paper starts with an introduction of Wikipedia and a review of related work. We then introduce a suite of measures and approaches to analyze and map the semantic structure of Wikipedia. The results show that co-occurrences of categories within individual articles have a power-law distribution, and when mapped reveal the nicely clustered semantic structure of Wikipedia. The results also reveal the content coverage of the article's authors, although the roles these authors play are as varied as the authors themselves. We conclude with a discussion of major results and planned future work.
 
Article
We present a method for approximating a fitness landscape as a superposition of "elementary" landscapes. Given a correlation function of the landscape in question, we show that the relative amplitudes of contributions with P-ary interactions can be computed. We show an application to RNA free-energy landscapes.
 
Article
Since its application to systems, emergence has been explained in terms of levels of observation. This approach has led to confusion, contradiction, incoherence and at times mysticism. When the idea of level is replaced by a framework of scope, resolution and state, this confusion is dissolved. We find that emergent properties are determined by the relationship between the scope of macrostate and microstate descriptions. This establishes a normative definition of emergent properties and emergence that makes sense of previous descriptive definitions of emergence. In particular, this framework sheds light on which classes of emergent properties are epistemic and which are ontological, and identifies fundamental limits to our ability to capture emergence in formal systems.
 
Article
We compared the entropy of texts written in natural languages (English, Spanish) and artificial languages (computer software) based on a simple expression for the entropy as a function of message length and specific word diversity. Code text written in artificial languages showed higher entropy than text of similar length expressed in natural languages. Spanish texts exhibit more symbolic diversity than English ones. The results show that algorithms based on complexity measures differentiate artificial from natural languages, and that text analysis based on complexity measures unveils important aspects of their nature. We propose specific expressions to examine entropy-related aspects of texts and to estimate the values of entropy, emergence, self-organization and complexity based on specific diversity and message length.
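A minimal sketch of a word-level entropy of the kind discussed, reported together with message length and word diversity, is below; the paper's specific expressions are not reproduced here.

```python
# Word-level Shannon entropy H = -sum p_i log2 p_i (a sketch).
import math
from collections import Counter

def word_entropy(text):
    words = text.lower().split()
    counts = Counter(words)
    n = len(words)
    h = -sum((c / n) * math.log2(c / n) for c in counts.values())
    return h, n, len(counts)   # entropy, message length, word diversity

h, length, diversity = word_entropy("the quick brown fox jumps over the lazy dog")
print(f"H = {h:.3f} bits/word, length = {length}, diversity = {diversity}")
```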
 
Article
The existence of risky choices makes the study of attitudes toward risk important. In this article we ask the following questions: Do risk-neutral preferences maximize utility? Are there other measures of social welfare that can explain risk aversion in society? What effect does evolution have on the distribution of risk attitudes and the measures of social welfare? In a static environment risk-neutral risk attitudes maximize utilitarian measures of social welfare, and risk-averse attitudes maximize Rawlsian measures. In a dynamic system agents will tend toward risk, preferring greater affinity for risk when they can accumulate wealth.
 
Article
There have been several highway traffic models proposed based on cellular automata. The simplest one is elementary cellular automaton rule 184. We extend this model to city traffic with cellular automata coupled at intersections using only rules 184, 252, and 136. The simplicity of the model offers a clear understanding of the main properties of city traffic and its phase transitions. We use the proposed model to compare two methods for coordinating traffic lights: a green-wave method that tries to optimize phases according to expected flows and a self-organizing method that adapts to the current traffic conditions. The self-organizing method delivers considerable improvements over the green-wave method. For low densities, the self-organizing method promotes the formation and coordination of platoons that flow freely in four directions, i.e. with a maximum velocity and no stops. For medium densities, the method allows a constant usage of the intersections, exploiting their maximum flux capacity. For high densities, the method prevents gridlocks and promotes the formation and coordination of "free-spaces" that flow in the opposite direction of traffic.
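The highway building block is compact enough to show directly. The sketch below runs ECA Rule 184 on a ring, where a car advances exactly when the cell ahead is empty; the coupling at intersections via rules 252 and 136 is omitted.

```python
# Elementary cellular automaton Rule 184 as single-lane traffic (a sketch).
import numpy as np

def rule184_step(road):
    left, right = np.roll(road, 1), np.roll(road, -1)
    # occupied next step if a car moves in (left=1, cell=0)
    # or the car here is blocked (cell=1, right=1)
    return ((left == 1) & (road == 0)) | ((road == 1) & (right == 1))

rng = np.random.default_rng(3)
road = rng.random(80) < 0.4            # density 0.4 on a periodic road
for _ in range(20):
    road = rule184_step(road)
    print("".join(".#"[int(c)] for c in road))
```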
 
Article
It is widely believed that theory is useful in physics because it describes simple systems, and that strictly empirical phenomenological approaches are necessary for complex biological and social systems. Here we prove, based upon an analysis of the information that can be obtained from experimental observations, that theory is even more essential in the understanding of complex systems. Implications of this proof revise the general understanding of how we can understand complex systems, including the behaviorist approach to human behavior, problems with testing engineered systems, and medical experimentation for evaluating treatments and the FDA approval of medications. Each of these approaches is inherently limited in its ability to characterize real-world systems due to the large number of conditions that can affect their behavior. Models are necessary, as they can help to characterize behavior without requiring observations for all possible conditions. The testing of models by empirical observations enhances the utility of those observations. For systems for which adequate models have not been developed, or are not practical, the limitations of empirical testing lead to uncertainty in our knowledge and to risks in individual, organizational and social policy decisions. These risks should be recognized and should inform our decisions.
 
[Figure] Two typical realizations for N = 100 individuals with initially uniformly distributed opinions in [0, 1]: a single-cluster case (α = 1.0, β = 0.6) and a two-cluster case (α = 1.0, β = 0.9); the final distribution of opinions, showing the number of clusters, appears beside each panel.
[Figure] The 1/(2σ) rule: number of large clusters as a function of the threshold σ in the Deffuant et al. model [1] with N = 1000 agents (inset: N = 100 agents).
[Figure] Cluster formation time as a function of population size in the single-cluster case; the straight line in the log-log plot is the fitted relation (6).
Article
We study the dynamics of public opinion in a model in which agents change their opinions as a result of random binary encounters if the opinion difference is below their individual thresholds that evolve over time. We ground these thresholds in a simple individual cost-benefit analysis with linear benefits of diversity and quadratic communication costs. We clarify and deepen the results of earlier continuous-opinion dynamics models (Deffuant et al., Adv Complex Systems 2000; Weisbuch et al., Complexity 2002) and establish several new results regarding the patterns of opinions in the asymptotic state and the cluster formation time.
 
Article
Coupled Ising models are studied in a discrete choice theory framework, where they can be understood to represent interdependent choice making processes for homogeneous populations under social influence. Two different coupling schemes are considered. The nonlocal or group interdependence model is used to study two interrelated groups making the same binary choice. The local or individual interdependence model represents a single group where agents make two binary choices which depend on each other. For both models, phase diagrams, and their implications in socioeconomic contexts, are described and compared in the absence of private deterministic utilities (zero opinion fields).
 
Article
Biogenesis and Evolution are viewed from the perspective of the universality of the metabolic chart with respect to primary metabolism and the phylogenetic specificity of secondary metabolism. This analysis is developed within the context of the evolution of the universal ancestor through hierarchical networks of chemical reactions.
 
Article
Given an upper triangular matrix $A \in \mathbb{R}^{n \times n}$ and a tolerance $\tau$, we show that the problem of finding a similarity transformation $G$ such that $G^{-1}AG$ is block diagonal with the condition number of $G$ at most $\tau$ is NP-hard. Let $f(n)$ be a polynomial in $n$. We also show that the problem of finding a similarity transformation $G$ such that $G^{-1}AG$ is block diagonal with the condition number of $G$ at most $f(n)$ times larger than the smallest possible is NP-hard.
 
Article
We develop a new approach to the study of the dynamics of link utilization in complex networks using records of communication in a large social network. Counter to the perspective that nodes have particular roles, we find roles change dramatically from day to day. "Local hubs" have a power law degree distribution over time, with no characteristic degree value. Our results imply a significant reinterpretation of the concept of node centrality in complex networks, and among other conclusions suggest that interventions targeting hubs will have significantly less effect than previously thought.
 
Article
This is the transcript of a lecture given at UMass-Lowell in which I compare and contrast the work of Gödel and of Turing and my own work on incompleteness. I also discuss randomness in physics vs randomness in pure mathematics.
 
Article
The importance of statistical patterns of language has been debated for decades. Zipf's law is perhaps the most popular case, but recently Menzerath's law has also come into play. Menzerath's law manifests in language, music and genomes as a tendency of the mean size of the parts to decrease as the number of parts increases. This statistical regularity emerges in the context of genomes, for instance, as a tendency of species with more chromosomes to have a smaller mean chromosome size. It has been argued that the instantiation of this law in genomes is not indicative of any parallel between language and genomes because (a) the law is inevitable and (b) non-coding DNA dominates genomes. Here the mathematical, statistical and conceptual challenges of these criticisms are discussed. Two major conclusions are drawn: the law is not inevitable, and languages also have a correlate of non-coding DNA. However, the wide range of manifestations of the law in and outside genomes suggests that the striking similarities between non-coding DNA and certain linguistic units could be anecdotal for understanding the recurrence of this statistical law.
 
Article
A number of observations are made on Hofstadter's integer sequence defined by $Q(n) = Q(n - Q(n-1)) + Q(n - Q(n-2))$ for $n > 2$, with $Q(1) = Q(2) = 1$. On short scales the sequence looks chaotic. It turns out, however, that the $Q(n)$ can be grouped into a sequence of generations: the $k$-th generation has $2^k$ members whose "parents" lie mostly in generation $k-1$, with a few in generation $k-2$. In this sense the series becomes Fibonacci-like on a logarithmic scale. The mean square size of $S(n) = Q(n) - n/2$, averaged over generations, grows like $2^{\alpha k}$ with exponent $\alpha = 0.88(1)$. The probability distribution $p^*(x)$ of $x = R(n) = S(n)/n^{\alpha}$, $n \gg 1$, is well defined and strongly non-Gaussian. The probability distribution of $x_m = R(n) - R(n-m)$ is given by $p_m(x_m) = \lambda_m p^*(x_m/\lambda_m)$. It is conjectured that $\lambda_m \to \sqrt{2}$ for large $m$.
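The sequence itself takes only a few lines to compute, directly from the definition quoted above.

```python
# Hofstadter's Q: Q(1) = Q(2) = 1, Q(n) = Q(n-Q(n-1)) + Q(n-Q(n-2)).
def hofstadter_q(n_max):
    q = [0, 1, 1]                      # 1-indexed; q[0] unused
    for n in range(3, n_max + 1):
        q.append(q[n - q[n - 1]] + q[n - q[n - 2]])
    return q

q = hofstadter_q(20)
print([q[n] for n in range(1, 21)])
# -> [1, 1, 2, 3, 3, 4, 5, 5, 6, 6, 6, 8, 8, 8, 10, 9, 10, 11, 11, 12]
print([q[n] - n / 2 for n in range(1, 11)])   # deviation S(n) = Q(n) - n/2
```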
 
Article
We show how a simple scheme of symbolic dynamics distinguishes a chaotic from a random time series and how it can be used to detect structural relationships in coupled dynamics. This is relevant for the question at which scale in complex dynamics regularities and patterns emerge.
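One concrete scheme of this flavor, shown here as an illustration rather than as the authors' method, uses ordinal patterns: deterministic chaos leaves some order patterns unvisited, while white noise visits them all. For the fully chaotic logistic map x -> 4x(1-x), a decreasing step requires x > 3/4, but every point above 3/4 maps below 3/4, so the strictly decreasing length-3 pattern can never occur.

```python
# Ordinal-pattern symbolization: chaos has forbidden patterns, noise does not.
import itertools
import numpy as np

def ordinal_patterns(series, m=3):
    seen = set()
    for k in range(len(series) - m + 1):
        seen.add(tuple(np.argsort(series[k:k + m])))  # rank order of the window
    return seen

x, chaos = 0.3, []
for _ in range(10_000):
    x = 4 * x * (1 - x)                # fully chaotic logistic map
    chaos.append(x)

rng = np.random.default_rng(4)
all_patterns = set(itertools.permutations(range(3)))
print("missing in chaos:", all_patterns - ordinal_patterns(np.array(chaos)))
print("missing in noise:", all_patterns - ordinal_patterns(rng.random(10_000)))
```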
 
Article
A source of unpredictability is equivalent to a source of information: unpredictability means not knowing which of a set of alternatives is the actual one; determining the actual alternative yields information. The degree of unpredictability is neatly quantified by the information measure introduced by Shannon. This perspective is applied to three kinds of unpredictability in physics: the absolute unpredictability of quantum mechanics, the unpredictability of the coarse-grained future due to classical chaos, and the unpredictability of open systems. The incompatibility of the first two of these is the root of the difficulty in defining quantum chaos, whereas the unpredictability of open systems, it is suggested, can provide a unified characterization of chaos in classical and quantum dynamics.
 
[Figure] Logistic-model fits to name-incidence time series from the French (left), Dutch (middle), and American (right) databases. Rows show: histograms of goodness of fit R²; R² versus cumulative incidence (names with high cumulative incidence tend to be well fit, and fit quality remains high for many less common names); fitted carrying capacity K versus imitation coefficient q for the well-fit series (K spans several orders of magnitude while q stays between 0.04 and 1, with no strong correlation); and example fits for a high- and a low-popularity name per database (Philippe/Francisco, Ingrid/Moniek, Diane/Seymour).
[Figure] Predictive power of the logistic model, illustrated with the French name Florine: parameters fitted on data up to 1999, with 95% confidence intervals estimated by wild bootstrap, predict the 2000-2008 incidence well.
[Figure S1] Regimes of adoption and abandonment of a name whose incidence dynamics follow the logistic model.
Article
Goods, styles, ideologies are adopted by society through various mechanisms. In particular, adoption driven by innovation is extensively studied by marketing economics. Mathematical models are currently used to forecast the sales of innovative goods. Inspired by the theory of diffusion processes developed for marketing economics, we propose, for the first time, a predictive framework for the mechanism of fashion, which we apply to first names. Analyses of French, Dutch and US national databases validate our modelling approach for thousands of first names, covering, on average, more than 50% of the yearly incidence in each database. In these cases, it is thus possible to forecast how popular the first names will become and when they will run out of fashion. Furthermore, we uncover a clear distinction between popularity and fashion: less popular names, typically not included in studies of fashion, may be driven by fashion, as well.
 
Article
Describing the dynamics of a city is a crucial step to both understanding the human activity in urban environments and to planning and designing cities accordingly. Here we describe the collective dynamics of New York City and surrounding areas as seen through the lens of Twitter usage. In particular, we observe and quantify the patterns that emerge naturally from the hourly activities in different areas of New York City, and discuss how they can be used to understand the urban areas. Using a dataset that includes more than 6 million geolocated Twitter messages we construct a movie of the geographic density of tweets. We observe the diurnal "heartbeat" of the NYC area. The largest scale dynamics are the waking and sleeping cycle and commuting from residential communities to office areas in Manhattan. Hourly dynamics reflect the interplay of commuting, work and leisure, including whether people are preoccupied with other activities or actively using Twitter. Differences between weekday and weekend dynamics point to changes in when people wake and sleep, and engage in social activities. We show that by measuring the average distances to the heart of the city one can quantify the weekly differences and the shift in behavior during weekends. We also identify locations and times of high Twitter activity that occur because of specific activities. These include early morning high levels of traffic as people arrive and wait at air transportation hubs, and on Sunday at the Meadowlands Sports Complex and Statue of Liberty. We analyze the role of particular individuals who have large impacts on overall Twitter activity. Our analysis points to the opportunity to develop insight into both geographic social dynamics and attention through social media analysis.
 
Article
We study the micromechanics of collagen-I gel with the goal of bridging the gap between theory and experiment in the study of biopolymer networks. Three-dimensional images of fluorescently labeled collagen are obtained by confocal microscopy and the network geometry is extracted using a 3D network skeletonization algorithm. Each fiber is modeled as a worm-like chain that resists stretching and bending, and each cross-link is modeled as a torsional spring. The stress-strain curves of networks at three different densities are compared to rheology measurements. The model shows good agreement with experiment, confirming that strain stiffening of collagen can be explained entirely by geometric realignment of the network, as opposed to entropic stiffening of individual fibers. The model also suggests that at small strains, cross-link deformation is the main contributor to network stiffness, whereas at large strains, fiber stretching dominates. Since this modeling effort uses networks with realistic geometries, this analysis can ultimately serve as a tool for understanding how the mechanics of fibers and cross-links at the microscopic level produce the macroscopic properties of the network. While the focus of this paper is on the mechanics of collagen, we demonstrate a framework that can be applied to many biopolymer networks.
 
Article
We report on experiments with many small motors (cell phone vibrators) glued to, and interacting through, a resonant plate. We find that individual motors interacting with the plate demonstrate hysteresis in their steady-state frequency due to interactions with plate resonances. For multiple motors running simultaneously, the degree of synchronization between motors increases when the motors' frequencies are near a resonance of the plate, and the frequency at which the motors synchronize shows a history dependence.
 
Article
The importance of collective social action in current events is manifest in the Arab Spring and Occupy movements. Electronic social media have become a pervasive channel for social interactions, and a basis of collective social response to information. The study of social media can reveal how individual actions combine to become the collective dynamics of society. Characterizing the groups that form spontaneously may reveal both how individuals self-identify and how they will act together. Here we map the social, political, and geographical properties of news-sharing communities on Twitter, a popular micro-blogging platform. We track user-generated messages that contain links to New York Times online articles and we label users according to the topic of the links they share, their geographic location, and their self-descriptive keywords. When users are clustered based on who follows whom in Twitter, we find social groups separate by whether they are interested in local (NY), national (US) or global (cosmopolitan) issues. The national group subdivides into liberal, conservative and other, the latter being a diverse but mostly business oriented group with sports, arts and other splinters. The national political groups are based across the US but are distinct from the national group that is broadly interested in a variety of topics. A person who is cosmopolitan associates with others who are cosmopolitan, and a US liberal / conservative associates with others who are US liberal / conservative, creating separated social groups with those identities. The existence of "citizens" of local, national and cosmopolitan communities is a basis for dialog and action at each of these levels of societal organization.
 
Article
We discuss a model of an economic community consisting of $N$ interacting agents. The state of each agent at any time is characterized, in general, by a mixed strategy profile drawn from a space of $s$ pure strategies. The community evolves as agents update their strategy profiles in response to payoffs received from other agents. The evolution equation is a generalization of the replicator equation. We argue that when $N$ is sufficiently large and the payoff matrix elements satisfy suitable inequalities, the community evolves to retain the full diversity of available strategies even as individual agents specialize to pure strategies.
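For reference, the baseline that this paper generalizes, standard replicator dynamics, can be sketched as follows; the random payoff matrix and the Euler integration are illustrative choices, not the paper's generalized equation.

```python
# Standard replicator dynamics xdot_i = x_i * ((A x)_i - x . (A x)) (a sketch).
import numpy as np

rng = np.random.default_rng(5)
s, dt, steps = 4, 0.01, 20_000
A = rng.random((s, s))                        # payoff matrix (illustrative)
x = np.full(s, 1.0 / s) + 0.01 * rng.random(s)
x /= x.sum()                                  # start near the simplex center

for _ in range(steps):
    fitness = A @ x
    x = x + dt * x * (fitness - x @ fitness)  # Euler step of the replicator ODE
    x = np.clip(x, 0.0, None)
    x /= x.sum()                              # stay on the simplex

print("asymptotic strategy mix:", np.round(x, 3))
```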
 
Article
We derive a general theorem relating the energy, momentum and velocity of any solitary wave solution of the generalized KdV equation which enables us to relate the amplitude, width, and momentum to the velocity of these solutions. We obtain the general condition for linear and Lyapunov stability. We then obtain a two parameter family of exact solutions to these equations which include elliptic and hyper-elliptic compacton solutions. For this general family we explicitly verify both the theorem and the stability criteria.
 
Article
Different methods are used to determine the scaling exponents associated with a time series describing a complex dynamical process, such as those observed in geophysical systems. Many of these methods are based on the numerical evaluation of the variance of a diffusion process whose step increments are generated by the data. An alternative method focuses on the direct evaluation of the scaling coefficient of the Shannon entropy of the same diffusion distribution. The combined use of these methods can efficiently distinguish between fractal Gaussian and Lévy-walk time series and help to discern between alternative underlying complex dynamics.
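The entropy-based method admits a short sketch: slide windows of length t over the increments, histogram the resulting displacements, and read the scaling exponent delta from S(t) ≈ A + delta ln t. The bin count and window lengths below are illustrative choices.

```python
# Diffusion entropy scaling (a sketch): Gaussian increments should give ~0.5.
import numpy as np

def diffusion_entropy(xi, window_sizes, bins=50):
    s = []
    for t in window_sizes:
        disp = np.convolve(xi, np.ones(t), mode="valid")  # all window sums
        p, edges = np.histogram(disp, bins=bins, density=True)
        dx = edges[1] - edges[0]
        p = p[p > 0]
        s.append(-np.sum(p * np.log(p) * dx))             # Shannon entropy
    return np.array(s)

rng = np.random.default_rng(6)
xi = rng.standard_normal(100_000)
ts = np.array([4, 8, 16, 32, 64, 128])
delta = np.polyfit(np.log(ts), diffusion_entropy(xi, ts), 1)[0]
print("estimated delta:", round(delta, 2))   # ~0.5 for ordinary diffusion
```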
 
Article
Education at all levels is facing several challenges in most countries, such as low quality, high costs, lack of educators, and unsatisfied student demand. Traditional approaches are becoming unable to deliver the required education. Several causes for this inefficiency can be identified. I argue that beyond specific causes, the lack of effective education is related to complexity. However, information technology is helping us overcome this complexity.
 
Article
News items, book and journal announcements, software and hardware modeling tools, web sites, meetings and workshops, and miscellaneous announcements.
 
Article
This commentary discusses a recently proposed measure of heterogeneity of DNA sequences and compares it with measures of complexity.
 
Article
A discussion of some of the problems in the utilization of game theoretic solution concepts is given. It is suggested that a considerable broadening of solution concepts is called for to take into account sufficient context. Mass agent simulations appear to offer promise for some economic and societal problems.
 
Article
This paper constructs a tree structure for musical rhythm using the L-system. It models the structure as an automaton and derives its complexity, and it also solves the complexity for the L-system. This complexity can resolve the similarity between trees and serves as a measure of psychological complexity for rhythms. It resolves the musical complexity of various compositions, including the Mozart effect K488. Keywords: music perception, psychological complexity, rhythm, L-system, automata, temporal associative memory, inverse problem, rewriting rule, bracketed string, tree similarity.
 
Article
News items, book and journal announcements, software and hardware announcements, websites, and miscellaneous news items related to the study of complex systems are provided.
 
Journal metrics
Submission to first decision: 9 days
Submission to final decision: 66 days
Acceptance to publication: 21 days
Acceptance rate: 33%
APC: $2,500
Journal Impact Factor™: 2.121 (2021)
CiteScore: 3.5 (2021)
Top-cited authors
Robert Axelrod
  • University of Michigan
Noradin Ghadimi
  • Islamic Azad University, Ardabil Branch
Yaneer Bar-Yam
  • New England Complex Systems Institute
Ricard Sole
  • University Pompeu Fabra
Stuart A. Kauffman
  • Institute for Systems Biology, Seattle WA United States