Conference Paper

Inconsistencies among Spectral Robustness Metrics


Abstract

Network robustness plays a critical role in the proper functioning of modern society. It is common practice to use spectral metrics to quantify the robustness of networks. In this paper we compare eight different spectral metrics that quantify network robustness. Four of the metrics are derived from the adjacency matrix; the others follow from the Laplacian spectrum. We found that the metrics can give inconsistent indications when comparing the robustness of different synthetic networks. We then calculate and compare the spectral metrics for a number of real-world networks, where inconsistencies still occur, but to a lesser extent. Finally, we indicate how the concept of the R*-value, a weighted sum of robustness metrics, can be used to resolve the found inconsistencies.
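As a concrete illustration of the kind of adjacency- and Laplacian-based metrics being compared, the sketch below computes a few common spectral robustness metrics with numpy. The selection of five metrics is illustrative; the paper's exact set of eight is not reproduced in this abstract.

```python
import numpy as np

def spectral_metrics(A):
    """A few common spectral robustness metrics for an undirected,
    connected graph with adjacency matrix A (illustrative selection)."""
    N = A.shape[0]
    L = np.diag(A.sum(axis=1)) - A              # Laplacian
    lam = np.sort(np.linalg.eigvalsh(A))[::-1]  # adjacency spectrum, descending
    mu = np.sort(np.linalg.eigvalsh(L))         # Laplacian spectrum, ascending
    return {
        "spectral_radius": lam[0],
        "spectral_gap": lam[0] - lam[1],
        "natural_connectivity": np.log(np.mean(np.exp(lam))),
        "algebraic_connectivity": mu[1],
        "effective_graph_resistance": N * np.sum(1.0 / mu[1:]),
    }

# Complete graph K4 versus path P4: every metric ranks K4 as more robust.
K4 = np.ones((4, 4)) - np.eye(4)
P4 = np.diag(np.ones(3), 1); P4 = P4 + P4.T
mK, mP = spectral_metrics(K4), spectral_metrics(P4)
```

On such an extreme pair all metrics agree; the paper's point is that on less clear-cut pairs of networks the metrics can rank robustness inconsistently.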




... Networks or graphs are a powerful tool to describe a large variety of real systems, such as power grids and computer networks, criminal organizations, and terrorist groups. Most realistic networks are subject to natural failures or malicious attacks, which can pose a serious risk to the functioning of societies, with regard to costs, security and disruption [1], [2]. For example, in 2017 the 'WannaCry' ransomware attack on the U.K. NHS network eventually infected around 230,000 computers across 150 countries and caused damages of billions of dollars [3]. Other examples include breakdowns in power grids or water supply networks, equipment failures in communication networks, traffic congestion in transportation networks, and so on. ...
Preprint
The function or performance of a network is strongly dependent on its robustness, which quantifies the ability of the network to continue functioning under perturbations. While a wide variety of robustness metrics have been proposed, each has its limitations. In this paper, we propose the forest index as a measure of network robustness, which overcomes the deficiencies of existing metrics. Using this measure as an optimization criterion, we propose and study the problem of breaking down a network by attacking some key edges. We show that the objective function of the problem is monotonic but not submodular, which makes the problem more challenging. We thus resort to greedy algorithms extended to non-submodular functions, iteratively deleting the most promising edges. We first propose a simple greedy algorithm with a proven approximation-ratio bound and cubic time complexity. To address the computational challenge for large networks, we further propose an improved nearly-linear-time greedy algorithm, which significantly speeds up edge selection while sacrificing little accuracy. Extensive experimental results for a large set of real-world networks verify the effectiveness and efficiency of our algorithms, demonstrating that they outperform several baseline schemes.
... Even quantifying the structural features of an interaction network is not an easy task (Newman, 2003; Wang et al., 2018). Many different metrics have been proposed for the task, often describing similar features of the network, and only some are directly related to specific dynamics on the network. ...
Thesis
Full-text available
Collective dynamics underpin many diverse complex systems, e.g., animal, human, and robotic multi-agent systems where a large number of individuals cooperate. The information that allows such collective behaviors to emerge flows through an interaction network specific to the ongoing process. Naturally, this yields an intricate interplay between the collective dynamics on the one hand and the features of the interaction network on the other. Therefore, it is of paramount importance to understand the effects that network topological features have on the collective dynamics, to understand how these systems can produce an effective collective response in the presence of changes in their environment, i.e., swarm intelligence. To investigate the influence of the network topology on the collective response, we consider the archetypal leader-follower linear consensus---a distributed decision-making model. Here, we find---through comparing various deterministic graphs and an optimization framework---a nontrivial relationship between the frequency of the driving signal and the optimal network topology. As the pace of the leader increases, the optimal connectivity of the network, in terms of its degree, decreases monotonically. We further demonstrate this rich phenomenology through the use of a swarm of land robots performing a nonlinear heading consensus. As a next step, we go beyond the effects of network degree and study the intricate impact of network clustering and network distance. We find that the flow of information in the leader-follower linear consensus experiences a transition from simple contagion---i.e., based on pairwise interactions---to a complex one---i.e., involving social influence and reinforcement---when the pace of the leader increases.
We uncover this rich phenomenology---so far limited to threshold-based decision-making processes---and show theoretically that it can be characterized as simple or complex by analyzing the correlations between specific network metrics and the performance of the collective behaviors---here measured as the collective frequency response. We also uncover the complex contagion in a swarm of land robots performing a nonlinear heading consensus. However, limitations with the number of agents led us to develop a large-scale internet-of-things testbed. This large-scale testbed allows us to more fully explore the richness found in the interplay between collective dynamics and network topology as well as tackle some of the challenges that arise in large networked systems. The results presented in this thesis greatly expand our understanding of the effects of network topology on collective dynamics. They have significant ramifications for our understanding of a range of social and animal group behaviors, as well as for the design of cooperative robot systems.
Chapter
In this work, we consider the following problem: given a graph, the addition of which single edge minimises the effective graph resistance of the resulting (or augmented) graph. A graph's effective graph resistance is inversely proportional to its robustness, which means the graph augmentation problem is relevant to, in particular, applications involving the robustness and augmentation of complex networks. On a classical computer, the best known algorithm for a graph with N vertices has time complexity . We show that it is possible to do better: Dürr and Høyer's quantum algorithm solves the problem in time . We conclude with a simulation of the algorithm and solve ten small instances of the graph augmentation problem on the Quantum Inspire quantum computing platform.
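The classical brute-force baseline that the quantum algorithm improves upon can be sketched directly: try every absent edge and keep the one that minimises the effective graph resistance. This is a hedged numpy sketch, not the chapter's implementation.

```python
import numpy as np

def best_edge_to_add(A):
    """Exhaustively test every absent edge and return the one whose
    addition minimises the effective graph resistance (brute force)."""
    N = A.shape[0]
    best, best_R = None, np.inf
    for i in range(N):
        for j in range(i + 1, N):
            if A[i, j]:
                continue
            A[i, j] = A[j, i] = 1              # tentatively add the edge
            mu = np.sort(np.linalg.eigvalsh(np.diag(A.sum(axis=1)) - A))
            R = N * np.sum(1.0 / mu[1:])       # effective graph resistance
            if R < best_R:
                best, best_R = (i, j), R
            A[i, j] = A[j, i] = 0              # undo
    return best, best_R
```

On the path 0-1-2-3, the sketch picks the edge closing the cycle, since that shortcut lowers the resistance between the most distant pair the most.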
Article
Full-text available
The increasing power of computer technology does not dispense with the need to extract meaningful information out of data sets of ever growing size, and indeed typically exacerbates the complexity of this task. To tackle this general problem, two methods have emerged, at chronologically different times, that are now commonly used in the scientific community: data mining and complex network theory. Not only do complex network analysis and data mining share the same general goal, that of extracting information from complex systems to ultimately create a new compact quantifiable representation, but they also often address similar problems. In the face of that, a surprisingly low number of researchers turn out to resort to both methodologies. One may then be tempted to conclude that these two fields are either largely redundant or totally antithetic. The starting point of this review is that this state of affairs should be put down to contingent rather than conceptual differences, and that these two fields can in fact advantageously be used in a synergistic manner. An overview of both fields is first provided, and some fundamental concepts are illustrated. A variety of contexts in which complex network theory and data mining have been used in a synergistic manner are then presented. Contexts in which the appropriate integration of complex network metrics can lead to improved classification rates with respect to classical data mining algorithms and, conversely, contexts in which data mining can be used to tackle important issues in complex network theory applications are illustrated. Finally, ways to achieve a tighter integration between complex networks and data mining, and open lines of research, are discussed.
Article
Full-text available
Improving robustness of complex networks is a challenge in several application domains, such as power grids and water management networks. In such networks, high robustness can be achieved by optimizing graph metrics such as the effective graph resistance, which is the focus of this paper. An important challenge lies in improving the robustness of complex networks under dynamic topological network changes, such as link addition and removal. This paper contributes theoretical and experimental findings about the robustness of complex networks under two scenarios: (i) selecting a link whose addition maximally decreases the effective graph resistance; (ii) protecting a link whose removal maximally increases the effective graph resistance. Upper and lower bounds of the effective graph resistance under these topological changes are derived. Four strategies that select single links for addition or removal, based on topological and spectral metrics, are evaluated on various synthetic and real-world networks. Furthermore, this paper illustrates a novel comparison method by considering the distance between the added or removed links, optimized according to the effective graph resistance and the algebraic connectivity. The optimal links are different in most cases but in close proximity.
Article
Full-text available
Although the robustness of complex networks has been under extensive study in the last decade, a unifying framework able to embrace all the proposed metrics is still lacking. There are two open issues related to this gap in the literature: (a) how to dimension several metrics to allow their summation and (b) how to weight each of the metrics. In this work we propose a solution to both problems by defining the R*-value and introducing the concept of the robustness surface (Ω). The rationale of our proposal is to make use of Principal Component Analysis (PCA). We first adjust the initial robustness of a network to 1. Second, we find the most informative robustness metric under a specific failure scenario. We then repeat the process for several percentages of failures and combinatorial configurations. Lastly, we join these values to form the robustness surface, which allows the visual assessment of network robustness variability. Results show that a network presents different robustness surfaces (i.e., dissimilar shapes) depending on the failure scenario and the set of metrics. In addition, the robustness surface allows the robustness of different networks to be compared.
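The PCA step of the R*-value construction, finding the most informative metric from a matrix of normalised metric values, can be sketched in a few lines. The metric matrix below is synthetic toy data, not the paper's, with metric 0 deliberately constructed to carry most of the variance.

```python
import numpy as np

rng = np.random.default_rng(3)

# Rows: failure configurations; columns: robustness metrics, assumed
# already normalised. Synthetic toy data: metric 0 dominates the variance.
X = rng.normal(size=(50, 4))
X[:, 0] *= 3.0

Xc = X - X.mean(axis=0)                           # centre before PCA
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
most_informative = int(np.argmax(np.abs(Vt[0])))  # largest loading on PC1
```

The first principal component's loadings identify which metric explains the most variance under the given failure scenario; repeating this across failure percentages yields the robustness surface.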
Article
Full-text available
We study the robustness of networks under node removal, considering random node failure as well as targeted node attacks based on network centrality measures. Whilst both of these have been studied in the literature, existing approaches tend to study random failure in terms of average-case behavior, giving no idea of how badly network performance can degrade purely by chance. Instead of considering average network performance under random failure, we compute approximate network performance probability density functions as functions of the fraction of nodes removed. We find that targeted attacks based on centrality measures give a good indication of the worst-case behavior of a network. We show that many centrality measures produce similar targeted attacks and that a combination of degree centrality and eigenvector centrality may be enough to evaluate the worst-case behavior of networks. Finally, we study the robustness envelopes and targeted attack responses of networks that are rewired to have high and low degree assortativities, discovering that moderate increases in assortativity confer more robustness against targeted attacks, whilst moderate decreases confer more robustness against random uniform attacks.
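A minimal sketch of one of the targeted attacks compared in such studies: disconnect the highest-degree nodes and track the fraction of nodes remaining in the largest connected component. This is illustrative only; the paper works with full performance probability density functions.

```python
import numpy as np

def largest_component_fraction(A):
    """Fraction of (original) nodes in the largest connected component,
    found by depth-first search over the adjacency matrix."""
    N = A.shape[0]
    seen = np.zeros(N, dtype=bool)
    best = 0
    for s in range(N):
        if seen[s]:
            continue
        stack, size = [s], 1
        seen[s] = True
        while stack:
            v = stack.pop()
            for u in np.nonzero(A[v])[0]:
                if not seen[u]:
                    seen[u] = True
                    size += 1
                    stack.append(u)
        best = max(best, size)
    return best / N

def degree_attack(A, frac):
    """Disconnect the top `frac` of nodes by (initial) degree and
    report the surviving largest-component fraction."""
    A = A.copy()
    k = int(frac * A.shape[0])
    targets = np.argsort(-A.sum(axis=1))[:k]
    A[targets, :] = 0
    A[:, targets] = 0
    return largest_component_fraction(A)
```

A star graph makes the worst case vivid: removing just the hub (10% of the nodes) shatters the network into singletons.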
Article
Full-text available
An increasing number of network metrics have been applied in network analysis. If metric relations were known better, we could more effectively characterize networks by a small set of metrics to discover the association between network properties/metrics and network functioning. In this paper, we investigate the linear correlation coefficients between widely studied network metrics in three network models (Barabási–Albert graphs, Erdős–Rényi random graphs and Watts–Strogatz small-world graphs) as well as in functional brain networks of healthy subjects. The metric correlations, which we have observed and theoretically explained, motivate us to propose a small representative set of metrics by including only one metric from each subset of mutually strongly dependent metrics. The following contributions are considered important. (a) A network with a given degree distribution can indeed be characterized by a small representative set of metrics. (b) Unweighted networks, which are obtained from weighted functional brain networks with a fixed threshold, and Erdős–Rényi random graphs follow a similar degree distribution. Moreover, their metric correlations and the resultant representative metrics are similar as well. This verifies the influence of degree distribution on metric correlations. (c) Most metric correlations can be explained analytically. (d) Interestingly, the most studied metrics so far, the average shortest path length and the clustering coefficient, are strongly correlated and, thus, redundant, whereas spectral metrics, though only studied recently in the context of complex networks, seem to be essential in network characterizations. This representative set of metrics tends to both sufficiently and effectively characterize networks with a given degree distribution. In the study of a specific network, however, we have to at least consider the representative set so that important network properties will not be neglected.
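The kind of pairwise linear correlation between metrics tabulated in such studies can be reproduced in miniature: sample Erdős–Rényi graphs of varying density and correlate two spectral metrics across the sample. The metric pair here is an illustrative choice, not the paper's full set.

```python
import numpy as np

rng = np.random.default_rng(0)

def er_adjacency(n, p):
    """Erdős–Rényi G(n, p) adjacency matrix."""
    U = rng.random((n, n)) < p
    A = np.triu(U, 1).astype(float)
    return A + A.T

# Sample random graphs of varying density; measure spectral radius and
# algebraic connectivity for each; correlate the two across the sample.
radius, algconn = [], []
for _ in range(100):
    A = er_adjacency(30, rng.uniform(0.2, 0.8))
    radius.append(np.max(np.linalg.eigvalsh(A)))
    L = np.diag(A.sum(axis=1)) - A
    algconn.append(np.sort(np.linalg.eigvalsh(L))[1])

r = np.corrcoef(radius, algconn)[0, 1]
```

Since both metrics grow with edge density, the linear correlation across this ensemble is strongly positive, which is exactly the kind of redundancy that motivates choosing one representative metric per correlated subset.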
Article
Full-text available
The concept of natural connectivity is reported as a robustness measure of complex networks. The natural connectivity has a clear physical meaning and a simple mathematical formulation. It is shown that the natural connectivity can be derived mathematically from the graph spectrum as an average eigenvalue and that it changes strictly monotonically with the addition or deletion of edges. By comparing the natural connectivity with other typical robustness measures within a scenario of edge elimination, it is demonstrated that the natural connectivity provides acute discrimination that agrees with our intuition.
Article
Full-text available
Power grids are complex dynamical systems, and because of this complexity it is unlikely that we will completely eliminate blackouts. However, there are things that can be done to reduce the average size and cost of these blackouts. In this article we describe two strategies that hold substantial promise for reducing the size and cost of blackouts. Both "reciprocal altruism" and "survivability" respect the necessarily decentralized nature of power grids. Both strategies can be implemented within the context of the existing physical infrastructure of the power grids, which is important because dramatic changes to the physical infrastructure are prohibitively expensive. However, additional engineering and innovation will be needed to bring strategies such as these to implementation and to create power grids with smaller, less costly blackouts.
Article
Full-text available
Spectrum sensing is a fundamental component in a cognitive radio. In this paper, we propose new sensing methods based on the eigenvalues of the covariance matrix of signals received at the secondary users. In particular, two sensing algorithms are suggested, one is based on the ratio of the maximum eigenvalue to minimum eigenvalue; the other is based on the ratio of the average eigenvalue to minimum eigenvalue. Using some latest random matrix theories (RMT), we quantify the distributions of these ratios and derive the probabilities of false alarm and probabilities of detection for the proposed algorithms. We also find the thresholds of the methods for a given probability of false alarm. The proposed methods overcome the noise uncertainty problem, and can even perform better than the ideal energy detection when the signals to be detected are highly correlated. The methods can be used for various signal detection applications without requiring the knowledge of signal, channel and noise power. Simulations based on randomly generated signals, wireless microphone signals and captured ATSC DTV signals are presented to verify the effectiveness of the proposed methods.
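The core of such eigenvalue-ratio detectors can be sketched in a few lines: form the sample covariance of the received samples and take the ratio of its extreme eigenvalues. The random-matrix-theory thresholds are omitted, and the signal model below is a simplified, perfectly correlated one.

```python
import numpy as np

rng = np.random.default_rng(1)

def mme_statistic(X):
    """Maximum-to-minimum eigenvalue ratio of the sample covariance of
    the received samples X (sensors x time); values well above 1
    suggest a correlated signal is present."""
    R = X @ X.T / X.shape[1]        # sample covariance
    eig = np.linalg.eigvalsh(R)     # ascending
    return eig[-1] / eig[0]

M, T = 4, 2000
noise = rng.normal(size=(M, T))                    # H0: noise only
signal = np.outer(np.ones(M), rng.normal(size=T))  # correlated across sensors
received = signal + noise                          # H1: signal + noise
```

Under noise only the ratio stays close to 1 (its spread is what the RMT results quantify); a correlated signal inflates the largest eigenvalue and hence the ratio, without requiring knowledge of the noise power.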
Article
Full-text available
This paper studies an interesting graph measure that we call the effective graph resistance. The notion of effective graph resistance is derived from the field of electric circuit analysis, where it is defined as the accumulated effective resistance between all pairs of vertices. The objective of the paper is twofold. First, we survey known formulae of the effective graph resistance and derive other representations as well. The derivation of new expressions is based on the analysis of the associated random walk on the graph and applies tools from Markov chain theory. This approach results in a new method to approximate the effective graph resistance. A second objective of this paper concerns the optimisation of the effective graph resistance for graphs with a given number of vertices and diameter, and for optimal edge addition. A set of analytical results is described, as well as results obtained by exhaustive search. One of the foremost applications of the effective graph resistance we have in mind is the analysis of robustness-related problems. However, with our discussion of this informative graph measure we hope to open up a wealth of possibilities of applying the effective graph resistance to all kinds of network problems.
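Two of the known representations surveyed for this measure, the pairwise definition via the Laplacian pseudoinverse and the spectral form as N times the sum of reciprocal nonzero Laplacian eigenvalues, can be checked against each other numerically:

```python
import numpy as np

def resistance_pairwise(A):
    """Effective graph resistance as the accumulated pairwise effective
    resistance r_ij = Q_ii + Q_jj - 2*Q_ij, with Q the pseudoinverse of
    the Laplacian."""
    N = A.shape[0]
    Q = np.linalg.pinv(np.diag(A.sum(axis=1)) - A)
    return sum(Q[i, i] + Q[j, j] - 2 * Q[i, j]
               for i in range(N) for j in range(i + 1, N))

def resistance_spectral(A):
    """Equivalent spectral formula: N times the sum of reciprocals of
    the nonzero Laplacian eigenvalues."""
    mu = np.sort(np.linalg.eigvalsh(np.diag(A.sum(axis=1)) - A))
    return A.shape[0] * np.sum(1.0 / mu[1:])
```

For a tree the pairwise resistances reduce to path lengths, so the path on four nodes gives 1+1+1+2+2+3 = 10 under both formulas.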
Article
Full-text available
We introduce the concept of natural connectivity as a measure of structural robustness in complex networks. The natural connectivity characterizes the redundancy of alternative routes in a network by quantifying the weighted number of closed walks of all lengths. This definition leads to a simple mathematical formulation that links the natural connectivity to the spectrum of a network. The natural connectivity can be regarded as an average eigenvalue that changes strictly monotonically with the addition or deletion of edges. We calculate both analytically and numerically the natural connectivity of three typical networks: regular ring lattices, random graphs, and random scale-free networks. We also compare the proposed natural connectivity to other structural robustness measures within a scenario of edge elimination and demonstrate that the natural connectivity provides sensitive discrimination of structural robustness that agrees with our intuition.
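The definition translates directly into code: the natural connectivity is the logarithm of the average of exp(λ_i) over the adjacency eigenvalues, and it increases strictly when an edge is added (a minimal numpy sketch):

```python
import numpy as np

def natural_connectivity(A):
    """Natural connectivity ln((1/N) * sum_i exp(lambda_i)) over the
    adjacency spectrum: a weighted count of closed walks of all
    lengths, with longer walks down-weighted by factorials."""
    lam = np.linalg.eigvalsh(A)
    return np.log(np.mean(np.exp(lam)))
```

Closing the path 0-1-2-3 into a cycle adds alternative routes and therefore strictly increases the natural connectivity, matching the monotonicity result.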
Article
Full-text available
Communication data rate and energy constraints are important factors which have to be considered when investigating distributed coordination of multi-agent networks. Although many proposed average-consensus protocols are available, a fundamental theoretic problem remains open, namely, how many bits of information are necessary for each pair of adjacent agents to exchange at each time step to ensure average consensus? In this paper, we consider average-consensus control of undirected networks of discrete-time first-order agents under communication constraints. Each agent has a real-valued state but can only exchange symbolic data with its neighbors. A distributed protocol is proposed based on dynamic encoding and decoding. It is proved that under the protocol designed, for a connected network, average consensus can be achieved with an exponential convergence rate based on merely one bit of information exchange between each pair of adjacent agents at each time step. An explicit form of the asymptotic convergence rate is given. It is shown that as the number of agents increases, the asymptotic convergence rate is related to the scale of the network, the number of quantization levels and the ratio of the second smallest eigenvalue to the largest eigenvalue of the Laplacian of the communication graph. We also give a performance index to characterize the total communication energy to achieve average consensus and show that the minimization of the communication energy leads to a tradeoff between the convergence rate and the number of quantization levels.
Article
Full-text available
Dynamics on networks are often characterized by the second smallest eigenvalue of the Laplacian matrix of the network, which is called the spectral gap. Examples include the threshold coupling strength for synchronization and the relaxation time of a random walk. A large spectral gap is usually associated with high network performance, such as facilitated synchronization and rapid convergence. In this study, we seek to enhance the spectral gap of undirected and unweighted networks by removing nodes because, practically, the removal of nodes often costs less than the addition of nodes, addition of links, and rewiring of links. In particular, we develop a perturbative method to achieve this goal. The proposed method realizes better performance than other heuristic methods on various model and real networks. The spectral gap increases as we remove up to half the nodes in most of these networks.
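An exhaustive (non-perturbative) version of one greedy step, removing the node whose deletion maximises the spectral gap, can be sketched as follows; the paper's contribution is a cheaper perturbative criterion, which is not reproduced here.

```python
import numpy as np

def spectral_gap(A):
    """Second smallest Laplacian eigenvalue (algebraic connectivity)."""
    L = np.diag(A.sum(axis=1)) - A
    return np.sort(np.linalg.eigvalsh(L))[1]

def best_node_to_remove(A):
    """One greedy step: exhaustively try deleting each node and return
    the node whose removal maximises the remaining spectral gap."""
    best, best_gap = None, -np.inf
    for v in range(A.shape[0]):
        keep = [u for u in range(A.shape[0]) if u != v]
        g = spectral_gap(A[np.ix_(keep, keep)])
        if g > best_gap:
            best, best_gap = v, g
    return best, best_gap
```

On a path of five nodes the best move is to prune an endpoint: that shortens the path and raises the gap, whereas removing an interior node disconnects the graph and drives the gap to zero.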
Article
Full-text available
A new family of graphs, entangled networks, with optimal properties in many respects, is introduced. By definition, their topology is such that it optimizes synchronizability for many dynamical processes. These networks are shown to have an extremely homogeneous structure: degree, node distance, betweenness, and loop distributions are all very narrow. Also, they are characterized by a very interwoven (entangled) structure with short average distances, large loops, and no well-defined community structure. This family of nets exhibits an excellent performance with respect to other flow properties such as robustness against errors and attacks, minimal first-passage time of random walks, efficient communication, etc. These remarkable features convert entangled networks in a useful concept, optimal or almost optimal in many senses, and with plenty of potential applications in computer science or neuroscience.
Article
Full-text available
Dynamical properties of complex networks are related to the spectral properties of the Laplacian matrix that describes the pattern of connectivity of the network. In particular we compute the synchronization time for different types of networks and different dynamics. We show that the main dependence of the synchronization time is on the smallest nonzero eigenvalue of the Laplacian matrix, in contrast to other proposals in terms of the spectrum of the adjacency matrix. Then, this topological property becomes the most relevant for the dynamics. Comment: 14 pages, 5 figures, to be published in New Journal of Physics
Article
Full-text available
Many complex systems, such as communication networks, display a surprising degree of robustness: while key components regularly malfunction, local failures rarely lead to the loss of the global information-carrying ability of the network. The stability of these complex systems is often attributed to the redundant wiring of the functional web defined by the systems' components. In this paper we demonstrate that error tolerance is not shared by all redundant systems, but it is displayed only by a class of inhomogeneously wired networks, called scale-free networks. We find that scale-free networks, describing a number of systems, such as the World Wide Web, Internet, social networks or a cell, display an unexpected degree of robustness, the ability of their nodes to communicate being unaffected by even unrealistically high failure rates. However, error tolerance comes at a high price: these networks are extremely vulnerable to attacks, i.e. to the selection and removal of a few nodes that play the most important role in assuring the network's connectivity.
Article
Metros (heavy rail transit systems) are integral parts of urban transportation systems. Failures in their operations can have serious impacts on urban mobility, and measuring their robustness is therefore critical. Moreover, as physical networks, metros can be viewed as topological entities, and as such they possess measurable network properties. In this article, by using network science and graph theory, we investigate ten theoretical and four numerical robustness metrics and their performance in quantifying the robustness of 33 metro networks under random failures or targeted attacks. We find that the ten theoretical metrics capture two distinct aspects of robustness of metro networks. First, several metrics place an emphasis on alternative paths. Second, other metrics place an emphasis on the length of the paths. To account for all aspects, we standardize all ten indicators and plot them on radar diagrams to assess the overall robustness for metro networks. Overall, we find that Tokyo and Rome are the most robust networks. Rome benefits from short transfers, and Tokyo has a significant number of transfer stations, both in the city center and in the peripheral area of the city, promoting both a higher number of alternative paths and overall relatively short path lengths.
Conference Paper
Cascading failures are one of the main reasons for blackouts in electrical power grids. Stable power supply requires a robust design of the power grid topology. Currently, the impact of the grid structure on the grid robustness is mainly assessed by purely topological metrics, which fail to capture fundamental properties of electrical power grids such as power flow allocation according to Kirchhoff's laws. This paper deploys the effective graph resistance as a metric to relate the topology of a grid to its robustness against cascading failures. Specifically, the effective graph resistance is deployed as a metric for network expansions (by means of transmission line additions) of an existing power grid. Four strategies based on network properties are investigated to optimize the effective graph resistance, and thereby improve the robustness, of a given power grid at a low computational complexity. Experimental results suggest the existence of Braess's paradox in power grids: bringing an additional line into the system occasionally results in a decrease of the grid robustness. This paper further investigates the impact of the topology on Braess's paradox, and identifies specific substructures whose existence results in the paradox. Careful assessment of the design and expansion choices of grid topologies, incorporating the insights provided by this paper, optimizes the robustness of a power grid while avoiding Braess's paradox in the system.
Article
The average Watts-Strogatz clustering coefficient and the network transitivity are widely used descriptors for characterizing the transitivity of relations in real-world graphs (networks). These indices are bounded between zero and one, with low values indicating poor transitivity and large ones indicating a high proportion of closed triads in the graphs. Here, we prove that these two indices diverge for windmill graphs when the number of nodes tends to infinity. We also give evidence that this divergence occurs in many real-world networks, especially in citation and collaboration graphs. We obtain analytic expressions for the eigenvalues and eigenvectors of the adjacency and the Laplacian matrices of the windmill graphs. Using this information we show the main characteristics of two dynamical processes when taking place on windmill graphs: synchronization and epidemic spreading. Finally, we show that many of the structural and dynamical properties of a real-world citation network are well reproduced by the appropriate windmill graph, showing the potential of these graphs as models for certain classes of real-world networks.
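The divergence of the two indices is easy to reproduce on the friendship (windmill) graph, n triangles sharing a single hub, where the average clustering coefficient tends to 1 while the transitivity tends to 0 as n grows (a numpy sketch):

```python
import numpy as np

def friendship_adjacency(n):
    """Windmill/friendship graph: n triangles sharing one hub (node 0)."""
    N = 2 * n + 1
    A = np.zeros((N, N))
    for t in range(n):
        a, b = 1 + 2 * t, 2 + 2 * t
        A[0, a] = A[a, 0] = A[0, b] = A[b, 0] = A[a, b] = A[b, a] = 1
    return A

def avg_clustering(A):
    """Average Watts-Strogatz clustering coefficient: mean over nodes
    of 2*T_i / (d_i * (d_i - 1)), with T_i triangles through node i."""
    d = A.sum(axis=1)
    tri = np.diag(A @ A @ A) / 2
    return float(np.mean(2 * tri / np.maximum(d * (d - 1), 1)))

def transitivity(A):
    """Network transitivity: 3 * triangles / connected triples, i.e.
    trace(A^3) / sum_i d_i * (d_i - 1)."""
    d = A.sum(axis=1)
    return float(np.trace(A @ A @ A) / (d * (d - 1)).sum())
```

With 50 triangles the peripheral nodes all have clustering 1 while the hub sits in n(2n-1) open triples, so the average clustering is near 1 but the transitivity equals 3/(2n+1).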
Article
We give basic definitions and some results related to the theory of graph spectra. We present a short survey of applications of this theory. In addition, selected bibliographies on applications to particular branches of science are given.
Article
Seminal paper on algebraic connectivity of a network
Article
We report the current state of the graph isomorphism problem from the practical point of view. After describing the general principles of the refinement-individualization paradigm and proving its validity, we explain how it is implemented in several of the key programs. In particular, we bring the description of the best known program nauty up to date and describe an innovative approach called Traces that outperforms the competitors for many difficult graph classes. Detailed comparisons against saucy, Bliss and conauto are presented.
Book
This concise and self-contained introduction builds up the spectral theory of graphs from scratch, with linear algebra and the theory of polynomials developed in the later parts. The book focuses on properties and bounds for the eigenvalues of the adjacency, Laplacian and effective resistance matrices of a graph. The goal of the book is to collect spectral properties that may help to understand the behavior or main characteristics of real-world networks. The chapter on spectra of complex networks illustrates how the theory may be applied to deduce insights into real-world networks. The second edition contains new chapters on topics in linear algebra and on the effective resistance matrix, and treats the pseudoinverse of the Laplacian. The latter two matrices and the Laplacian describe linear processes, such as the flow of current, on a graph. The concepts of spectral sparsification and graph neural networks are included.
Article
The influence of the network characteristics on the virus spread is analyzed in a new model, the N-intertwined Markov chain model, whose only approximation lies in the application of mean field theory. The mean field approximation is quantified in detail. The N-intertwined model has been compared with the exact 2^N-state Markov model and with previously proposed "homogeneous" or "local" models. The sharp epidemic threshold τc, which is a consequence of mean field theory, is rigorously shown to be equal to τc = 1/λmax(A), where λmax(A) is the largest eigenvalue (the spectral radius) of the adjacency matrix A. A continued fraction expansion of the steady-state infection probability at node j is presented, as well as several upper bounds.
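The threshold formula itself is a one-liner over the adjacency spectrum (sketch):

```python
import numpy as np

def epidemic_threshold(A):
    """NIMFA epidemic threshold tau_c = 1 / lambda_max(A), the
    reciprocal of the spectral radius of the adjacency matrix."""
    return 1.0 / np.max(np.linalg.eigvalsh(A))
```

For the complete graph K5 the spectral radius is 4, giving a threshold of 0.25; a star on 10 nodes has spectral radius 3, giving 1/3, so the denser graph sustains an epidemic at a lower effective infection rate.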
Conference Paper
Distributed decision making in networked systems depends critically on the timely availability of critical fresh information. Performance of networked systems, from the perspective of achieving goals and objectives in a timely and efficient manner, is constrained by their collaboration and communication structures and their interplay with the networked system's dynamics. In most cases achieving the system objectives requires many agent-to-agent communications. A reasonable measure for system robustness to communication topology change is the number of spanning trees in the graph abstraction of the communication system. We address the problem of network formation with robustness and connectivity constraints. Solutions to this problem also have applications in trust and the relationship of trust to control. We show that the general combinatorial problem can be relaxed to a convex optimization problem. We solve the special case of adding a shortcut to a given structure and provide insights for derivation of heuristics for the general case. We also analyze the small world effect in the context of abrupt increases in the number of spanning trees as a result of adding a few shortcuts to a base lattice in the Watts-Strogatz framework, and thereby relate efficient topologies to small world and expander graphs.
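The number of spanning trees can be computed from the Laplacian via the matrix-tree theorem, which makes the effect of adding a shortcut easy to observe (a minimal sketch):

```python
import numpy as np

def spanning_trees(A):
    """Matrix-tree theorem: the number of spanning trees equals the
    determinant of any (N-1)x(N-1) principal minor of the Laplacian."""
    L = np.diag(A.sum(axis=1)) - A
    return round(np.linalg.det(L[1:, 1:]))
```

Adding a single shortcut strictly increases the count, since every old spanning tree survives and new ones using the shortcut appear; a ring of six nodes has only 6 spanning trees, and one chord already multiplies that.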
Article
The Kuramoto model describes a large population of coupled limit-cycle oscillators whose natural frequencies are drawn from some prescribed distribution. If the coupling strength exceeds a certain threshold, the system exhibits a phase transition: some of the oscillators spontaneously synchronize, while others remain incoherent. The mathematical analysis of this bifurcation has proved both problematic and fascinating. We review 25 years of research on the Kuramoto model, highlighting the false turns as well as the successes, but mainly following the trail leading from Kuramoto’s work to Crawford’s recent contributions. It is a lovely winding road, with excursions through mathematical biology, statistical physics, kinetic theory, bifurcation theory, and plasma physics.
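The phase transition described above is easy to observe numerically. A minimal sketch of the mean-field Kuramoto model with Euler integration (the parameter values and seed are illustrative assumptions, not from the review):

```python
import numpy as np

def kuramoto_step(theta, omega, K, dt):
    """One Euler step of the mean-field Kuramoto model:
    dtheta_i/dt = omega_i + (K/N) * sum_j sin(theta_j - theta_i)."""
    N = len(theta)
    coupling = (K / N) * np.sin(theta[None, :] - theta[:, None]).sum(axis=1)
    return theta + dt * (omega + coupling)

def order_parameter(theta):
    """r = |mean of exp(i*theta)|: 0 for incoherence, 1 for full synchrony."""
    return abs(np.exp(1j * theta).mean())

rng = np.random.default_rng(0)
N = 200
theta = rng.uniform(0.0, 2 * np.pi, N)   # random initial phases
omega = rng.normal(0.0, 0.1, N)          # narrow Gaussian frequency distribution
for _ in range(2000):                    # K = 2 is well above the critical coupling
    theta = kuramoto_step(theta, omega, K=2.0, dt=0.05)
r = order_parameter(theta)
print(r > 0.8)  # a strongly coupled population synchronizes
```

Sweeping K from below to above the critical coupling reproduces the bifurcation in the order parameter r that the review analyzes.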
Article
In this paper, we shall give a survey of applications of the theory of graph spectra to computer science. Eigenvalues and eigenvectors of several graph matrices appear in numerous papers on various subjects relevant to information and communication technologies. In particular, we survey applications in modeling and searching the Internet, in computer vision, data mining, multiprocessor systems, statistical databases, and in several other areas. Some related new mathematical results are included, together with several comments on perspectives for future research. In particular, we claim that balanced subdivisions of cubic graphs are good models for virus-resistant computer networks, and point out some advantages in using integral graphs as multiprocessor interconnection networks.
Article
A novel approach to describing the 3D structure of small/medium-sized and large molecules is introduced. A vector and an index are defined on the basis of second line graphs with edges weighted by the dihedral angles of the molecule. They measure the 3D 'compactness' or folding of the molecular structures, giving maximum values for the most folded structures. We have ranked five protein models according to their degree of folding. The similarity among these proteins has been determined, showing that the most folded proteins are not similar to one another, while the less folded ones are similar to each other.
Conference Paper
The second smallest eigenvalue of the Laplacian matrix, also known as the algebraic connectivity, is a remarkable measure to unfold the robustness of complex networks. In this paper we study the asymptotic behavior of the algebraic connectivity in the Erdős-Rényi random graph, the simplest model to describe a complex network. We estimate analytically the mean and the variance of the algebraic connectivity by approximating it with the minimum nodal degree. The resulting estimate improves a known expression for the asymptotic behavior of the algebraic connectivity [19]. Simulations emphasize the accuracy of the analytical estimation. Further, we study the algebraic connectivity in relation to the graph's robustness to node and link failures, i.e. the number of nodes and links that have to be removed in order to disconnect a graph, called the node and the link connectivity. Extensive simulations show that the node connectivity, the link connectivity and the minimum nodal degree converge to an identical distribution already at small graph sizes.
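The algebraic connectivity in this abstract is a one-line computation once the Laplacian is formed. A minimal sketch (the example graph is illustrative):

```python
import numpy as np

def algebraic_connectivity(A):
    """Second-smallest eigenvalue of the Laplacian L = D - A.
    It is strictly positive if and only if the graph is connected."""
    A = np.asarray(A, dtype=float)
    L = np.diag(A.sum(axis=1)) - A
    return np.linalg.eigvalsh(L)[1]  # eigenvalues in ascending order; [0] is 0

# Path graph P3 (0-1-2): Laplacian eigenvalues are 0, 1, 3.
P3 = np.array([[0, 1, 0],
               [1, 0, 1],
               [0, 1, 0]])
mu = algebraic_connectivity(P3)
print(round(mu, 6))  # 1.0
```

The chain of Fiedler's bounds, algebraic connectivity ≤ node connectivity ≤ link connectivity ≤ minimum degree, is what motivates the minimum-degree approximation the abstract describes.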
Article
In a recent work [Proc. Natl. Acad. Sci. USA 108, 3838 (2011)], Schneider et al. proposed a new measure for network robustness and investigated optimal networks with respect to this quantity. For networks with a power-law degree distribution, the optimized networks have an onion structure: high-degree vertices form a core with radially decreasing degrees, and edges are over-represented within the same radial layer. In this paper we relate the onion structure to graphs with good expander properties (another characterization of robust networks) and argue that networks of skewed degree distributions with large spectral gaps (and thus good expander properties) are typically onion structured. Furthermore, we propose a generative algorithm producing synthetic scale-free networks with onion structure, circumventing the optimization procedure of Schneider et al. We validate the robustness of our generated networks against malicious attacks and random removals.
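The spectral gap invoked above, the difference between the two largest adjacency eigenvalues, is straightforward to compute. A minimal sketch (graph and function names are illustrative):

```python
import numpy as np

def spectral_gap(A):
    """Difference between the two largest adjacency eigenvalues;
    a large gap indicates good expander (and hence robustness) properties."""
    lam = np.linalg.eigvalsh(np.asarray(A, dtype=float))
    return lam[-1] - lam[-2]

# Complete graph K5: eigenvalues are 4 and -1 (with multiplicity 4),
# so the gap is 4 - (-1) = 5, the largest possible for 5 nodes.
K5 = np.ones((5, 5)) - np.eye(5)
gap = spectral_gap(K5)
print(round(gap, 6))  # 5.0
```

Comparing the gap of a generated onion-structured network against a degree-preserving rewiring of it is one way to check the expander claim empirically.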
Article
The discovery of community structure is a common challenge in the analysis of network data. Many methods have been proposed for finding community structure, but few have been proposed for determining whether the structure found is statistically significant or whether, conversely, it could have arisen purely as a result of chance. In this paper we show that the significance of community structure can be effectively quantified by measuring its robustness to small perturbations in network structure. We propose a suitable method for perturbing networks and a measure of the resulting change in community structure, and use them to assess the significance of community structure in a variety of networks, both real and computer-generated.
Article
Continuous-time analog neural networks with symmetric connections will always converge to fixed points when the neurons have infinitely fast response, but can oscillate when a small time delay is present. Sustained oscillation resulting from time delay is relevant to hardware implementations of neural networks where delay due to the finite switching speed of amplifiers can be appreciable compared to the network relaxation time. We analyze the dynamics of continuous-time analog networks with delay, and show that there is a critical delay above which a symmetrically connected network will oscillate. Two different stability analyses are presented for low and high neuron gain. The results are useful as design criteria for building fast but stable electronic networks. We find that for some connection topologies, a delay much smaller than the relaxation time can lead to oscillation, whereas for other topologies, including associative memory networks, even long delays will not produce oscillation. The most oscillation-prone network configuration is the all-inhibitory network; in this configuration, the critical delay for oscillation is smaller than the network relaxation time by a factor of N, the size of the network. Theoretical results are compared with numerical simulations and with experiments performed on a small (eight neurons) electronic network with controllable delay.
Article
We introduce a new centrality measure that characterizes the participation of each node in all subgraphs of a network. Smaller subgraphs are given more weight than larger ones, which makes this measure appropriate for characterizing network motifs. We show that the subgraph centrality [C(S)(i)] can be obtained mathematically from the spectra of the adjacency matrix of the network. This measure is better able to discriminate the nodes of a network than alternative measures such as degree, closeness, betweenness, and eigenvector centralities. We study eight real-world networks for which C(S)(i) displays useful and desirable properties, such as clear ranking of nodes and scale-free characteristics. Compared with the number of links per node, the ranking introduced by C(S)(i) (for the nodes in the protein interaction network of S. cerevisiae) is more highly correlated with the lethality of individual proteins removed from the proteome.
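The spectral formula behind this measure is CS(i) = (e^A)_{ii} = Σ_j (v_ij)² e^{λ_j}, the weighted count of closed walks starting and ending at node i. A minimal sketch (the star-graph example is illustrative):

```python
import numpy as np

def subgraph_centrality(A):
    """CS(i) = (e^A)_{ii} = sum_j (v_ij)^2 * exp(lambda_j), computed from
    the eigendecomposition of the (symmetric) adjacency matrix."""
    lam, V = np.linalg.eigh(np.asarray(A, dtype=float))
    return (V ** 2) @ np.exp(lam)  # row i gives the diagonal entry (e^A)_{ii}

# Star graph: the hub (node 0) participates in more closed walks than any leaf.
star = np.array([[0, 1, 1, 1],
                 [1, 0, 0, 0],
                 [1, 0, 0, 0],
                 [1, 0, 0, 0]])
cs = subgraph_centrality(star)
print(cs.argmax())  # 0 -- the hub ranks highest
```

Closed walks of length k contribute with weight 1/k!, which is the sense in which smaller subgraphs are weighted more heavily than larger ones.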
Barahona, M., Pecora, L.M.: Synchronization in small-world systems. Physical Review Letters 89(5), 054101 (2002)
Knight, S., Nguyen, H.X., Falkner, N., Bowden, R., Roughan, M.: The Internet topology zoo. IEEE Journal on Selected Areas in Communications 29(9), 1765-1775 (2011)
Ellens, W., Kooij, R.E.: Graph measures and network robustness. arXiv preprint arXiv:1311.5064 (2013)