Peter Grassberger
University of Calgary · Department of Physics and Astronomy
About
295 Publications
38,135 Reads
39,854 Citations
Publications (295)
Based on extensive simulations, we conjecture that critically pinned interfaces in 2-dimensional isotropic random media with short range correlations are always in the universality class of ordinary percolation. Thus, in contrast to interfaces in $>2$ dimensions, there is no distinction between fractal (i.e., percolative) and rough but non-fractal...
Although the title seems self-contradictory, it does not contain a misprint. The model we study is a seemingly minor modification of the "true self-avoiding walk" (TSAW) model of Amit, Parisi, and Peliti in two dimensions. The walks in it are self-repelling up to a characteristic time $T^*$ (which depends on various parameters), but spontaneously (...
We study a generalization of site percolation on a simple cubic lattice, where not only single sites are removed randomly, but also entire parallel columns of sites. We show that typical clusters near the percolation transition are very anisotropic, with different scaling exponents for the sizes parallel and perpendicular to the columns. Below the...
We present high statistics simulation data for the average time $\langle T_{\rm cover}(L)\rangle$ that a random walk needs to cover completely a 2-dimensional torus of size $L\times L$. They confirm the mathematical prediction that $\langle T_{\rm cover}(L)\rangle \sim (L \ln L)^2$ for large $L$, but the prefactor seems to {\it deviate significantl...
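As an illustration of the quantity being measured (not the authors' high-statistics code), here is a minimal Python sketch that estimates ⟨T_cover(L)⟩ for a small torus and compares it to (L ln L)^2; the lattice size and sample count are arbitrary choices.

```python
import math
import random

def cover_time(L, rng):
    """Steps a simple random walk needs to visit every site of an L x L torus."""
    visited = [[False] * L for _ in range(L)]
    visited[0][0] = True
    x = y = 0
    remaining = L * L - 1
    t = 0
    while remaining:
        dx, dy = rng.choice(((1, 0), (-1, 0), (0, 1), (0, -1)))
        x, y = (x + dx) % L, (y + dy) % L
        t += 1
        if not visited[x][y]:
            visited[x][y] = True
            remaining -= 1
    return t

rng = random.Random(0)
L, runs = 32, 200                      # small values; the paper uses far larger L
mean = sum(cover_time(L, rng) for _ in range(runs)) / runs
print(mean / (L * math.log(L)) ** 2)   # this ratio approaches a constant for large L
```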
Percolation is the paradigm for random connectivity and has been one of the most applied statistical models. With simple geometrical rules a transition is obtained which is related to magnetic models. This transition is, in all dimensions, one of the most robust continuous transitions known. We present a very brief overview of more than 60 years of...
The theory of percolation deals with the appearance of large connected domains between randomly interlinked nodes and with the spreading of non-conserved ‘agents’ over these domains. Near threshold, when such domains just barely exist, they are typically sparse and fractal, making the transition from disconnectedness to connectedness a “second orde...
A Reply to the Comment by Y. Berezin et al.
We extend a recent study of susceptible-infected-removed epidemic processes with long range infection (referred to as I in the following) from 1-dimensional lattices to lattices in two dimensions. As in I we use hashing to simulate very large lattices for which finite size effects can be neglected, in spite of the assumed power law $p({\bf x})\sim...
We present numerical results for various information theoretic properties of the square lattice Ising model. First, using a bond propagation algorithm, we find the difference 2H_{L}(w)-H_{2L}(w) between entropies on cylinders of finite lengths L and 2L with open end cap boundaries, in the limit L→∞. This essentially quantifies how the finite length...
We study epidemic processes with immunization on very large 1-dimensional lattices, where at least some of the infections are non-local, with rates decaying as power laws $p(x) \sim x^{-\sigma-1}$ for large distances x. When starting with a single infected site, the cluster of infected sites stays always bounded if $\sigma >1$ (and dies with probability...
We review possible measures of complexity which might in particular be applicable to situations where the complexity seems to arise spontaneously. We point out that not all of them correspond to the intuitive (or "naive") notion, and that one should not expect a unique observable of complexity. One of the main problems is to distinguish complex fro...
In J. Shao et al., PRL 103, 108701 (2009) the authors claim that a model with majority rule coarsening exhibits in d=2 a percolation transition in the universality class of invasion percolation with trapping. In the present comment we give compelling evidence, including high statistics simulations on much larger lattices, that this is not correct....
Ordinary bond percolation (OP) can be viewed as a process where clusters grow by joining them pairwise, adding links chosen randomly one by one from a set of predefined virtual links. In contrast, in agglomerative percolation (AP) clusters grow by choosing randomly a target cluster and joining it with all its neighbors, as defined by the same set o...
We study the corrections to scaling for the mass of the watershed, the bridge line, and the optimal path crack in two and three dimensions. We disclose that these models have numerically equivalent fractal dimensions and leading correction-to-scaling exponents. We conjecture all three models to possess the same fractal dimension, namely, $d_f=1.216...
Discontinuous percolation transitions and the associated tricritical points are manifest in a wide range of both equilibrium and non-equilibrium cooperative phenomena. To demonstrate this, we present and relate the continuous and first order behaviors in two different classes of models: The first are generalized epidemic processes (GEP) that descri...
PageRank (PR) is an algorithm originally developed by Google to evaluate the importance of web pages. Considering how deeply rooted Google's PR algorithm is to gathering relevant information or to the success of modern businesses, the question of rank-stability and choice of the damping factor (a parameter in the algorithm) is clearly important. We...
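For readers unfamiliar with PR, a minimal power-iteration sketch with damping factor d follows; the value 0.85 is only the conventional default, not taken from the paper, and the 3-node graph is a made-up example.

```python
import numpy as np

def pagerank(adj, d=0.85, tol=1e-12, max_iter=1000):
    """Power iteration for PageRank; adj[i][j] = 1 for a link i -> j.
    Dangling nodes (no out-links) are redistributed uniformly."""
    A = np.asarray(adj, dtype=float)
    n = A.shape[0]
    out = A.sum(axis=1, keepdims=True)
    P = np.where(out > 0, A / np.where(out > 0, out, 1.0), 1.0 / n)
    r = np.full(n, 1.0 / n)
    for _ in range(max_iter):
        r_new = (1.0 - d) / n + d * P.T @ r
        if np.abs(r_new - r).sum() < tol:
            break
        r = r_new
    return r

# toy graph: 0 -> 1, 1 -> 2, 2 -> 0 and 2 -> 1
print(pagerank([[0, 1, 0], [0, 0, 1], [1, 1, 0]]))
```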
For many real-world networks only a small "sampled" version of the original network may be investigated; those results are then used to draw conclusions about the actual system. Variants of breadth-first search (BFS) sampling, which are based on epidemic processes, are widely used. Although it is well established that BFS sampling fails, in most ca...
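A bare-bones BFS sampler of the kind discussed; the adjacency-list graph is a hypothetical toy example, whereas actual studies run this on large networks and compare, e.g., the degree distribution of the sample with that of the full graph.

```python
from collections import deque

def bfs_sample(neighbors, start, budget):
    """Visit nodes in breadth-first order from `start` until `budget` nodes
    are collected; returns the visited nodes (the sampled subnetwork).
    BFS samples are known to over-represent high-degree nodes."""
    seen = {start}
    queue = deque([start])
    sample = []
    while queue and len(sample) < budget:
        v = queue.popleft()
        sample.append(v)
        for w in neighbors[v]:
            if w not in seen:
                seen.add(w)
                queue.append(w)
    return sample

# toy graph given as an adjacency list
g = {0: [1, 2, 4], 1: [0, 2], 2: [0, 1, 3], 3: [2, 4], 4: [0, 3]}
print(bfs_sample(g, start=0, budget=4))
```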
We study the statistical behavior under random sequential renormalization (RSR) of several network models including Erdös-Rényi (ER) graphs, scale-free networks, and an annealed model related to ER graphs. In RSR the network is locally coarse grained by choosing at each renormalization step a node at random and joining it to all its neighbors. Comp...
Clustering, assortativity, and communities are key features of complex networks. We probe dependencies between these features and find that ensembles of networks with high clustering display both high assortativity by degree and prominent community structure, while ensembles with high assortativity show much less enhancement of the clustering or co...
We study a model for coupled networks introduced recently by Buldyrev et al., [Nature (London) 464, 1025 (2010)], where each node has to be connected to others via two types of links to be viable. Removing a critical fraction of nodes leads to a percolation transition that has been claimed to be more abrupt than that for uncoupled networks. Indeed,...
We consider the mass-dependent aggregation process (k+1)X→X, given a fixed number of unit mass particles in the initial state. One cluster is chosen with probability proportional to its mass and is merged either with k of its neighbors in one dimension, or--in the well-mixed case--with k other clusters picked randomly. We find the same combinatorial exact soluti...
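A toy simulation of the well-mixed (k+1)X→X process just described; the system size and k are arbitrary, and this sketch only illustrates the dynamics, not the combinatorial exact solution discussed in the paper.

```python
import random

def aggregate(n, k, rng=random.Random(0)):
    """Well-mixed (k+1)X -> X aggregation: start with n unit-mass clusters;
    each step picks one cluster with probability proportional to its mass and
    merges it with k other clusters chosen uniformly at random."""
    masses = [1] * n
    while len(masses) > k:                       # each step removes k clusters
        i = rng.choices(range(len(masses)), weights=masses)[0]
        others = rng.sample([j for j in range(len(masses)) if j != i], k)
        merged = masses[i] + sum(masses[j] for j in others)
        masses = [m for j, m in enumerate(masses) if j != i and j not in others]
        masses.append(merged)
    return sorted(masses, reverse=True)

print(aggregate(n=22, k=2))   # final cluster masses (they sum to 22)
```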
We consider percolation on interdependent locally treelike networks, recently introduced by Buldyrev et al., Nature 464, 1025 (2010), and demonstrate that the problem can be simplified conceptually by deleting all references to cascades of failures. Such cascades do exist, but their explicit treatment just complicates the theory -- which is a strai...
Irreversible aggregation is revisited in view of recent work on renormalization of complex networks. Its scaling laws and phase transitions are related to percolation transitions seen in the latter. We illustrate our points by giving the complete solution for the probability to find any given state in an aggregation process (k+1)X→X, given a fixed...
In this review, we describe applications of the pruned-enriched Rosenbluth method (PERM), a sequential Monte Carlo algorithm with resampling, to various problems in polymer physics. PERM produces samples according to any given prescribed weight distribution, by growing configurations step by step with controlled bias, and correcting "bad" configura...
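A heavily simplified PERM-style sketch for 2D self-avoiding walks, meant only to convey the grow/prune/enrich idea; the thresholds, chain length, and tour count are ad hoc choices, not those used in the review.

```python
import random

N_MAX = 30                      # target chain length (small, for illustration)
STEPS = ((1, 0), (-1, 0), (0, 1), (0, -1))
Z = [0.0] * (N_MAX + 1)         # running sums of weights at each length
counts = [0] * (N_MAX + 1)
samples = []                    # (weight, squared end-to-end distance) at N_MAX
rng = random.Random(1)

def grow(pos, walk, weight, n):
    """Grow a self-avoiding walk with Rosenbluth weights; prune low-weight
    configurations and clone ("enrich") high-weight ones."""
    Z[n] += weight
    counts[n] += 1
    if n == N_MAX:
        samples.append((weight, pos[0] ** 2 + pos[1] ** 2))
        return
    free = [(pos[0] + dx, pos[1] + dy) for dx, dy in STEPS
            if (pos[0] + dx, pos[1] + dy) not in walk]
    if not free:                # trapped: attrition
        return
    nxt = rng.choice(free)
    w = weight * len(free)      # Rosenbluth weight update
    avg = Z[n + 1] / counts[n + 1] if counts[n + 1] else w
    if w > 2.0 * avg:           # enrich: two copies, half weight each
        for _ in range(2):
            grow(nxt, walk | {nxt}, w / 2.0, n + 1)
    elif w < 0.5 * avg:         # prune: kill half, double the survivors
        if rng.random() < 0.5:
            grow(nxt, walk | {nxt}, 2.0 * w, n + 1)
    else:
        grow(nxt, walk | {nxt}, w, n + 1)

for _ in range(2000):           # "tours" started from the origin
    grow((0, 0), {(0, 0)}, 1.0, 0)

w_tot = sum(w for w, _ in samples)
print("SAW count estimate:", Z[N_MAX] / counts[0])
print("<R^2> estimate:", sum(w * r2 for w, r2 in samples) / w_tot)
```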
We study four Achlioptas-type processes with "explosive" percolation transitions. All transitions are clearly continuous, but their finite size scaling functions are not entirely holomorphic. The distributions of the order parameter, i.e., the relative size s(max)/N of the largest cluster, are double humped. But-in contrast to first-order phase tra...
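As a sketch of one such Achlioptas process, the classic "product rule" is shown below with a union-find data structure; whether this particular rule is among the four processes studied is not claimed here, it only illustrates the edge-competition idea.

```python
import random

class UnionFind:
    def __init__(self, n):
        self.parent = list(range(n))
        self.size = [1] * n
    def find(self, x):
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]   # path halving
            x = self.parent[x]
        return x
    def union(self, a, b):
        ra, rb = self.find(a), self.find(b)
        if ra == rb:
            return ra
        if self.size[ra] < self.size[rb]:
            ra, rb = rb, ra
        self.parent[rb] = ra
        self.size[ra] += self.size[rb]
        return ra

def product_rule(n, rng=random.Random(2)):
    """At each step two random candidate edges compete; the one whose endpoint
    clusters have the smaller size product is added. Returns s_max/N vs. time."""
    uf, smax, curve = UnionFind(n), 1, []
    def score(u, v):
        return uf.size[uf.find(u)] * uf.size[uf.find(v)]
    for _ in range(n):
        e1 = (rng.randrange(n), rng.randrange(n))
        e2 = (rng.randrange(n), rng.randrange(n))
        u, v = e1 if score(*e1) <= score(*e2) else e2
        root = uf.union(u, v)
        smax = max(smax, uf.size[root])
        curve.append(smax / n)
    return curve

c = product_rule(100_000)
print("s_max/N at t=0.8:", c[int(0.8 * len(c)) - 1], "and at t=1.0:", c[-1])
```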
We introduce the concept of random sequential renormalization (RSR) for arbitrary networks. RSR is a graph renormalization procedure that locally aggregates nodes to produce a coarse grained network. It is analogous to the (quasi)parallel renormalization schemes introduced by C. Song et al. [C. Song et al., Nature (London) 433, 392 (2005)] and stud...
DOI:https://doi.org/10.1103/PhysRevE.83.019903
We correct claims about lower bounds on mutual information (MI) between real-valued random variables made by Kraskov et al., Phys. Rev. E 69, 066138 (2004). We show that non-trivial lower bounds on MI in terms of linear correlations depend on the marginal (single variable) distributions. This is so in spite of the invariance of MI under reparametri...
Additional figures, tables and discussion. (2.82 MB PDF)
Background: Existing sequence alignment algorithms use heuristic scoring schemes based on biological expertise, which cannot be used as objective distance metrics. As a result one relies on crude measures, like the p- or log-det distances, or makes explicit, and often too simplistic, a priori assumptions about sequence evolution. Information theor...
We study a process termed "agglomerative percolation" (AP) in two dimensions. Instead of adding sites or bonds at random, in AP randomly chosen clusters are linked to all their neighbors. As a result the growth process involves a diverging length scale near a critical point. Picking target clusters with probability proportional to their mass leads...
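A slow but transparent toy implementation of the AP rule just described, on a small periodic square lattice; the lattice size and step count are illustrative only.

```python
import random

def agglomerative_percolation(L, steps, rng=random.Random(4)):
    """Agglomerative percolation on an L x L periodic lattice: repeatedly pick a
    target cluster with probability proportional to its mass (via a random site)
    and merge it with every cluster adjacent to it. Returns s_max / L^2."""
    label = {(x, y): (x, y) for x in range(L) for y in range(L)}   # site -> cluster id

    def neighbours(x, y):
        return (((x + 1) % L, y), ((x - 1) % L, y), (x, (y + 1) % L), (x, (y - 1) % L))

    smax = 1
    for _ in range(steps):
        site = (rng.randrange(L), rng.randrange(L))
        target = label[site]
        members = [s for s, c in label.items() if c == target]
        absorbed = {label[n] for m in members for n in neighbours(*m)}
        for s, c in label.items():
            if c in absorbed:
                label[s] = target
        sizes = {}
        for c in label.values():
            sizes[c] = sizes.get(c, 0) + 1
        smax = max(sizes.values())
        if smax == L * L:
            break
    return smax / (L * L)

print(agglomerative_percolation(L=32, steps=150))
```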
We describe innovation in terms of a generalized branching process. Each new invention pairs with any existing one to produce a number of offspring, which is Poisson distributed with mean p. Existing inventions die with probability $p/\tau$ at each generation. In contrast to mean field results, no phase transition occurs; the chance for survival is f...
Directed networks are ubiquitous and are necessary to represent complex systems with asymmetric interactions--from food webs to the World Wide Web. Despite the importance of edge direction for detecting local and community structure, it has been disregarded in studying a basic type of global diversity in networks: the tendency of nodes with similar...
Ensembles of networks are used as null models in many applications. However, simple null models often show much less clustering than their real-world counterparts. In this paper, we study a "biased rewiring model" where clustering is enhanced by means of a fugacity as in the Strauss (or "triangle") model, but where the number of links attached to e...
Networks to infer causal structure from spatiotemporal data are constructed making minimal a priori assumptions about the underlying dynamics. The elementary concept of recurrence for a point process in time is generalized to recurrent events in space and time. An event is defined to be a recurrence of any previous event if it is closer to it in sp...
We reconsider the problem of local persistence in directed site percolation. We present improved estimates of the persistence exponent in all dimensions from 1+1 to 7+1, obtained by new algorithms and by improved implementations of existing ones. We verify the strong corrections to scaling for 2+1 and 3+1 dimensions found in previous analyses, but...
We simulate directed site percolation on two lattices with four spatial and one timelike dimensions (simple and body-centered hypercubic in space) with the standard single cluster spreading scheme. For efficiency, the code uses the same ingredients (hashing, histogram reweighing, and improved estimators) as described by Grassberger [Phys. Rev. E 67...
We check claims for a generalized central limit theorem holding at the Feigenbaum (infinite bifurcation) point of the logistic map made recently by Tirnakli, Phys. Rev. E 75, 040106(R) (2007); this issue, Phys. Rev. E 79, 056209 (2009). We show that there is no obvious way that these claims can be made consistent with high statistics simulations. Instea...
We simulate loop-erased random walks on simple (hyper-)cubic lattices of dimensions 2, 3 and 4. These simulations were mainly motivated to test recent two loop renormalization group predictions for logarithmic corrections in d=4, simulations in lower dimensions were done for completeness and in order to test the algorithm. In d=2, we verify with hi...
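A minimal on-the-fly loop-erasure sketch on Z^2; note that the underlying walk length is fixed here rather than stopped at a given radius, which differs from typical production setups for measuring the LERW exponents.

```python
import random

def lerw_2d(n_steps, rng=random.Random(3)):
    """Loop-erased random walk on Z^2: run a simple random walk for n_steps and
    erase loops in chronological order as they are formed."""
    path = [(0, 0)]
    index = {(0, 0): 0}           # position -> index in current loop-erased path
    steps = ((1, 0), (-1, 0), (0, 1), (0, -1))
    for _ in range(n_steps):
        dx, dy = rng.choice(steps)
        x, y = path[-1]
        nxt = (x + dx, y + dy)
        if nxt in index:          # a loop closed: erase it
            cut = index[nxt]
            for p in path[cut + 1:]:
                del index[p]
            path = path[:cut + 1]
        else:
            index[nxt] = len(path)
            path.append(nxt)
    return path

p = lerw_2d(100_000)
x, y = p[-1]
print("loop-erased length:", len(p), " end-to-end R^2:", x * x + y * y)
```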
In probability theory, reinforced walks are random walks on a lattice (or more generally a graph) that preferentially revisit neighboring `locations' (sites or bonds) that have been visited before. In this paper, we consider walks with one-step reinforcement, where one preferentially \emph{revisits} locations irrespective of the number of visits. P...
Alignment of biological sequences such as DNA, RNA or proteins is one of the most widely used tools in computational bioscience. While the accomplishments of sequence alignment algorithms are undeniable, the fact remains that these algorithms are based upon heuristic scoring schemes. Therefore, these algorithms do not provide model independent and o...
Clustering is a concept used in a huge variety of applications. We review a conceptually very simple algorithm for hierarchical clustering called in the following the {\it mutual information clustering} (MIC) algorithm. It uses mutual information (MI) as a similarity measure and exploits its grouping property: The MI between three objects X, Y, and...
Background: Alignment of biological sequences such as DNA, RNA or proteins is one of the most widely used tools in computational bioscience. All existing alignment algorithms rely on heuristic scoring schemes based on biological expertise. Therefore, these algorithms do not provide model independent and objective measures for how similar two (or mo...
We propose a biologically motivated quantity, twinness, to evaluate local similarity between nodes in a network. The twinness of a pair of nodes is the number of connected, labeled subgraphs of size n in which the two nodes possess identical neighbours. The graph animal algorithm is used to estimate twinness for each pair of nodes (for subgraph siz...
We propose a method to search for signs of causal structure in spatiotemporal data making minimal a priori assumptions about the underlying dynamics. To this end, we generalize the elementary concept of recurrence for a point process in time to recurrent events in space and time. An event is defined to be a recurrence of any previous event if it is...
We apply complex network analysis to the state spaces of random Boolean networks (RBNs). An RBN contains $N$ Boolean elements each with $K$ inputs. A directed state space network (SSN) is constructed by linking each dynamical state, represented as a node, to its temporal successor. We study the heterogeneity of an SSN at both local and global scale...
We show that the predictability of letters in written English texts depends strongly on their position in the word. The first letters are usually the least easy to predict. This agrees with the intuitive notion that words are well defined subunits in written languages, with much weaker correlations across these units than within them. It implies th...
The simplest null models for networks, used to distinguish significant features of a particular network from a priori expected features, are random ensembles with the degree sequence fixed by the specific network of interest. These "fixed degree sequence" (FDS) ensembles are, however, famously resistant to analytic attack. In this paper we introduc...
We generalize a sampling algorithm for lattice animals (connected clusters on a regular lattice) to a Monte Carlo algorithm for "graph animals," i.e., connected subgraphs in arbitrary networks. As with the algorithm in [N. Kashtan et al., Bioinformatics 20, 1746 (2004)], it provides a weighted sample, but the computation of the weights is much fast...
We study random walks on large random graphs that are biased towards a randomly chosen but fixed target node. We show that a critical bias strength bc exists such that most walks find the target within a finite time when b > bc. For b < bc, a finite fraction of walks drift off to infinity before hitting the target. The phase transition at b=bc is a...
We present simulations of a random copolymer model introduced by Kantor and Kardar (Europhys. Lett., 28 (1994) 169). In this model there are two types A, B of monomers. Neighbouring monomers interact with a potential +v0 if they are of the same type, and with −v0 else. When NA ≈ NB, we find a θ-transition at Tθ close to that found by Kantor and Kar...
We study networks representing the dynamics of elementary 1D cellular automata (CA) on finite lattices. We analyze scaling behaviors of both local and global network properties as a function of system size. The scaling of the largest node in-degree is obtained analytically for a variety of CA including rules 22, 54, and 110. We further define the p...
We present a conceptually simple method for hierarchical clustering of data called mutual information clustering (MIC) algorithm. It uses mutual information (MI) as a similarity measure and exploits its grouping property: The MI between three objects X, Y, and Z is equal to the sum of the MI between X and Y, plus the MI between Z and the combined o...
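The grouping property can be checked directly in the jointly Gaussian case, where MI has the closed form -1/2 ln det of the correlation matrix; the MIC algorithm itself relies on nonparametric MI estimators, which are not reproduced in this toy check, and the correlation matrix below is a made-up example.

```python
import numpy as np

def gaussian_mi(corr):
    """Mutual information (total correlation) of jointly Gaussian variables
    with unit variances and correlation matrix `corr`: -1/2 * ln det(corr)."""
    return -0.5 * np.log(np.linalg.det(corr))

# example correlation matrix for (X, Y, Z)
C = np.array([[1.0, 0.6, 0.3],
              [0.6, 1.0, 0.2],
              [0.3, 0.2, 1.0]])

I_xyz = gaussian_mi(C)                 # I(X, Y, Z)
I_xy = gaussian_mi(C[:2, :2])          # I(X, Y)
# I((X,Y); Z) = H(X,Y) + H(Z) - H(X,Y,Z) for Gaussians:
I_xy_z = 0.5 * (np.log(np.linalg.det(C[:2, :2])) - np.log(np.linalg.det(C)))

print(I_xyz, I_xy + I_xy_z)            # the two values agree (grouping property)
```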
The investigation of synchronization phenomena on measured experimental data such as biological time series has recently become an increasing focus of interest. Different approaches for measuring synchronization have been proposed that rely on certain characteristic features of the dynamical system under investigation. For experimental data the und...
We study a single flexible chain molecule grafted to a membrane which has pores of size slightly larger than the monomer size. On both sides of the membrane there is the same solvent. When this solvent is good, i.e. when the polymer is described by a self-avoiding walk, it can fairly easily penetrate the membrane, so that the average number of memb...
The dimensional crossover phenomenon of heat conduction is studied by a two-dimensional (2D) Fermi-Pasta-Ulam lattice. The 2D divergence law of the thermal conductivity is confirmed by the simulation results. The divergence law of the thermal conductivity will change from the 2D class to the 1D class as $\delta = N_y/N_x$ decreases, here $N_y$ is the size i...
We discuss several questions related to the predictability of chaotic systems. We first review the information flow through systems with few degrees of freedom and with sensitive dependence on initial conditions, and ways of measuring this flow. We then ask how the knowledge obtained in this way can be used in improving forecasts, and propose one w...
We study a single flexible chain molecule grafted to a membrane which has pores of size slightly larger than the monomer size. On both sides of the membrane there is the same solvent. When this solvent is good, i.e. when the polymer is described by a self avoiding walk, it can fairly easily penetrate the membrane, so that the average number of memb...
We propose a simulated annealing algorithm (stochastic non-negative independent component analysis, SNICA) for blind decomposition of linear mixtures of non-negative sources with non-negative coefficients. The demixing is based on a Metropolis-type Monte Carlo search for least dependent components, with the mutual information between recovered comp...
We discuss ways of defining complexity in physics, and in particular for symbol sequences typically arising in autonomous dynamical systems. We stress that complexity should be distinct from randomness. This leads us to consider the difficulty of making optimal forecasts as one (but not the only) suitable measure. This difficulty is discussed in de...
We present detailed simulations of a generalization of the Domany-Kinzel model to 2+1 dimensions. It has two control parameters $p$ and $q$ which describe the probabilities $P_k$ of a site to be wetted, if exactly $k$ of its "upstream" neighbours are already wetted. If $P_k$ depends only weakly on $k$, the active/adsorbed phase transition is in the...
We show that recent claims for the nonstationary behavior of the logistic map at the Feigenbaum point based on nonextensive thermodynamics are either incorrect or can be easily deduced from well-known properties of the Feigenbaum attractor. In particular, there is no generalized Pesin identity for this system, the existing attempts at proofs being...
Extending the central concept of recurrence times for a point process to recurrent events in space-time allows us to characterize seismicity as a record breaking process using only spatiotemporal relations among events. Linking record breaking events with edges between nodes in a graph generates a complex dynamical network isolated from any length,...
The scaling behavior of randomly branched polymers in a good solvent is studied in two to nine dimensions, modeled by lattice animals on simple hypercubic lattices. For the simulations, we use a biased sequential sampling algorithm with re-sampling, similar to the pruned-enriched Rosenbluth method (PERM) used extensively for linear polymers. We obt...
Lattice animals are one of the few critical models in statistical mechanics violating conformal invariance. We present here simulations of two-dimensional site animals on square and triangular lattices in nontrivial geometries. The simulations are done with the pruned-enriched Rosenbluth method (PERM) algorithm, which gives very precise estimates o...
We present high statistics simulations of weighted lattice bond animals and lattice trees on the square lattice, with fugacities for each non-bonded contact and for each bond between two neighbouring monomers. The simulations are performed using a newly developed sequential sampling method with resampling, very similar to the pruned-enriched Rosenb...
A recently proposed mutual information based algorithm for decomposing data into least dependent components (MILCA) is applied to spectral analysis, namely to blind recovery of concentrations and pure spectra from their linear mixtures. The algorithm is based on precise estimates of mutual information between measured spectra, which allows to asses...
We propose to use precise estimators of mutual information (MI) to find the least dependent components in a linearly mixed signal. On the one hand, this seems to lead to better blind source separation than with any other presently available algorithm. On the other hand, it has the advantage, compared to other implementations of "independent" compon...
We study long polymer chains in a poor solvent, confined to the space between two parallel hard walls. The walls are energetically neutral and pose only a geometric constraint which changes the properties of the coil-globule (or "$\theta$-") transition. We find that the $\theta$ temperature increases monotonically with the width $D$ between the wal...
We demonstrate by means of a simple example that the arbitrariness of defining a phase from an aperiodic signal is not just an academic problem, but is more serious and fundamental. Decomposition of the signal into components with positive phase velocities is proposed as an old solution to this new problem.
We present improved simulations of three-dimensional self avoiding walks with one end attached to an impenetrable surface on the simple cubic lattice. This surface can either be a-thermal, having thus only an entropic effect, or attractive. In the latter case we concentrate on the adsorption transition. We find clear evidence for the cross-over exp...
We describe a class of growth algorithms for finding low energy states of heteropolymers. These polymers form toy models for proteins, and the hope is that similar methods will ultimately be useful for finding native states of real proteins from heuristic or a priori determined force fields. These algorithms share with standard Markov chain Monte C...
Treating realistically the ambient water is one of the main difficulties in applying Monte Carlo methods to protein folding. The solvent-accessible area method, a popular method for treating water implicitly, is investigated by means of Metropolis simulations of the brain peptide Met-Enkephalin. For the phenomenological energy function ECEPP/2 nine...
Obtaining the most independent components from a mixture (under a chosen model) is only the first part of an ICA analysis. After that, it is necessary to measure the actual dependency between the components and the reliability of the decomposition. We have to identify one- and multidimensional components (i.e., clusters of mutually dependent compon...
The scaling behaviour of randomly branched polymers in a good solvent is studied in two to nine dimensions, using as microscopic models lattice animals and lattice trees on simple hypercubic lattices. As a stochastic sampling method we use a biased sequential sampling algorithm with re-sampling, similar to the pruned-enriched Rosenbluth method (PER...