Article

The dynamic correlation between degree and betweenness of complex network under attack


Abstract

Complex networks are often subjected to failure and attack. Recent work has addressed the resilience of complex networks to either random or intentional deletion of nodes or links. Here we simulate the breakdown of small-world and scale-free networks under node failure or attack. We analyze and discuss the dynamic correlation between degree and betweenness during the attack process. The simulation results show that the correlation for the scale-free network obeys a power law until the network collapses, whereas it behaves irregularly for the small-world network.
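The simulation protocol summarized above (grow a scale-free network, delete nodes in decreasing degree order, track the surviving giant component) can be sketched in pure Python. This is an illustrative reconstruction under our own assumptions, not the authors' code; the generator, parameters, and function names are ours.

```python
import random
from collections import deque

def ba_graph(n, m, seed=1):
    """Barabasi-Albert-style preferential attachment: each new node links to m targets."""
    rng = random.Random(seed)
    adj = {i: set() for i in range(n)}
    targets = list(range(m))   # the first new node connects to all initial nodes
    repeated = []              # node list in which each node appears once per degree
    for v in range(m, n):
        for t in set(targets):
            adj[v].add(t); adj[t].add(v)
        repeated.extend(targets); repeated.extend([v] * m)
        targets = [rng.choice(repeated) for _ in range(m)]  # degree-biased picks
    return adj

def giant_component(adj, removed):
    """Size of the largest connected component after deleting the nodes in `removed`."""
    seen, best = set(removed), 0
    for s in adj:
        if s in seen:
            continue
        q, size = deque([s]), 0
        seen.add(s)
        while q:
            u = q.popleft(); size += 1
            for w in adj[u]:
                if w not in seen:
                    seen.add(w); q.append(w)
        best = max(best, size)
    return best

adj = ba_graph(200, 2)
by_degree = sorted(adj, key=lambda v: len(adj[v]), reverse=True)  # hubs first
for frac in (0.0, 0.05, 0.2):
    cut = int(frac * len(adj))
    print(f"removed {frac:.0%}: giant component = {giant_component(adj, by_degree[:cut])}")
```

Replacing `by_degree` with a random permutation of the nodes gives the corresponding random-failure curve for comparison.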


... Other node attack strategies are based on different topological properties of the networks, such as eigenvector centrality [5,8], closeness centrality [9,10], and clustering coefficient [5]. In addition, researchers have investigated variations of these attack strategies [4,11,12]. By analyzing the impact of these attack strategies, we can identify node importance in the network. ...
Article
Full-text available
In this study, we investigate the effect of weight thresholding (WT) on the robustness of real-world complex networks. We assess the robustness of networks after WT against various node attack strategies, performing WT by removing a fixed fraction of the weakest links. The size of the largest connected component indicates the network's robustness. We find that real-world networks subjected to WT retain a robust connectivity structure under node attack even for higher WT values. In addition, we analyze the change in the top 30% of central nodes with WT and find a positive correlation in the ranking of central nodes for weighted node centralities. In contrast, binary node centralities show a lower correlation when networks are subjected to WT. This result indicates that weighted node centralities are more stable indicators of node importance in real-world networks subjected to link sparsification.
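The WT procedure described here (drop a fixed fraction of the weakest links, then read off the largest connected component) can be sketched as follows. The random graph, weights, and function names are illustrative assumptions, not the study's data.

```python
import random
from collections import deque

def largest_cc(nodes, edges):
    """Largest connected component of an undirected weighted edge list (u, v, w)."""
    adj = {v: [] for v in nodes}
    for u, v, _w in edges:
        adj[u].append(v); adj[v].append(u)
    seen, best = set(), 0
    for s in adj:
        if s in seen:
            continue
        q, size = deque([s]), 0
        seen.add(s)
        while q:
            u = q.popleft(); size += 1
            for w in adj[u]:
                if w not in seen:
                    seen.add(w); q.append(w)
        best = max(best, size)
    return best

def weight_threshold(edges, frac):
    """WT: keep only the strongest (1 - frac) fraction of links."""
    strongest_first = sorted(edges, key=lambda e: e[2], reverse=True)
    return strongest_first[: int(len(edges) * (1 - frac))]

rng = random.Random(0)
nodes = list(range(100))
edges = [(rng.randrange(100), rng.randrange(100), rng.random()) for _ in range(300)]
for frac in (0.0, 0.3, 0.6):
    print(frac, largest_cc(nodes, weight_threshold(edges, frac)))
```

Since WT only removes edges, the largest component can never grow with increasing `frac`; the interesting empirical finding is how slowly it shrinks in real networks.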
... It appears that, in the delay propagation networks of both China and the USA, airports with higher degree have a larger closeness coefficient. The relationship between degree and betweenness coefficient is shown in Fig. 4(f); one can see that the betweenness coefficient is positively correlated with degree in both the China and USA networks, which is also a feature of scale-free networks [56]. In particular, the range of betweenness values is relatively large when the degree is below 40. ...
... These networks are characterized by several important structural, emergent properties, such as degree distribution, correlation coefficient, average nearest neighbor, average path length, clustering coefficient, and community structure. The modeling and statistical aspects of such emergent structural properties therefore remain an important research area in the study of large-scale real-world complex networks (Zarandi and Rafsanjani 2018; Newman 2003; Cui et al. 2014; Nie et al. 2016; Shakibian and Charkari 2018). In this regard, the node degree distribution has been well studied and is viewed as an important structural characteristic of real-world networks (Muchnik et al. 2013). ...
Article
Full-text available
Real-world networks are generally claimed to be scale-free, meaning that their degree distributions follow the classical power law, at least asymptotically. However, closer observation shows that the classical power-law distribution is often inadequate for the data, owing to an identifiable nonlinearity in the entire degree distribution on the log-log scale. The present paper proposes a new variant of the popular heavy-tailed Lomax distribution, which we name the modified Lomax (MLM) distribution, that can efficiently capture the heavy-tailed behavior of the entire degree distribution of real-world complex networks. The proposed MLM model, derived from a hierarchical family of Lomax distributions, can efficiently fit the entire degree distribution of real-world networks without removing lower-degree nodes, as opposed to classical power-law-based fitting. The MLM distribution belongs to the maximum domain of attraction of the Fréchet distribution and is right-tail equivalent to the Pareto distribution. Various statistical properties, including characteristics of the maximum likelihood estimates and asymptotic distributions, are also derived for the proposed MLM model. Finally, the effectiveness of the proposed MLM model is demonstrated through rigorous experiments on fifty real-world complex networks from diverse applied domains.
... The vulnerability of electric power grids, for example, is related to the number of hubs where multiple connections are concentrated. The broken subgroup symmetries in cancer cell networks are related to the level of attack tolerance at the sites of reduced symmetry in cancer networks [105-111]. Identifying broken symmetries in subgroup cancer networks is a viable strategy for directing targeted therapeutics to these sites [84,89,90]. ...
Article
The problem of cancer is examined from the metaphysical standpoint of essence and ground. An essentialist definition of cancer is assumed that would be valid in all possible worlds in which cancer could logically exist. The grounds of cancer are then examined and elucidated. Two grounding cancer properties are identified and discussed: symmetry breaking and computational intelligence. Each examination leads to concrete conclusions for novel therapeutic approaches and a more fundamental understanding of what cancer is at bottom. Other possible cancer grounding properties related to evolution, adaptability and stochastic features are identified for future work. This approach is novel and offers new solutions to the problem of cancer.
... Regarding the anti-injury function of robustness in human brain networks, Saeedeh et al. [33] found in biological experiments that human brain networks have a certain degree of anti-injury ability against targeted attacks on hub nodes. Regarding the anti-injury function in ANNs, Nie et al. [34] evaluated the robustness of complex networks through the variation of network characteristics (maximal degree, average degree, and betweenness), concluding that the SFN has a certain anti-injury function under node failures or attacks. These studies of the anti-injury function were conducted on ANNs without nerve electrophysiological characteristics. ...
Article
Full-text available
With the continuous improvement of automation and informatization, the electromagnetic environment has become increasingly complex, and traditional protection methods for electronic systems face serious challenges. Biological nervous systems have self-adaptive advantages under neural regulation, so it is worth exploring a new approach to electromagnetic protection that draws on this self-adaptivity. In this study, a scale-free spiking neural network (SFSNN) is constructed in which the Izhikevich neuron model is employed as a node and a synaptic plasticity model including excitatory and inhibitory synapses is employed as an edge. Under white Gaussian noise, the noise suppression abilities of SFSNNs with a high average clustering coefficient (ACC) and SFSNNs with a low ACC are studied comparatively, and the noise suppression mechanism of the SFSNN is explored. The experimental results demonstrate the following. (1) The SFSNN has a certain degree of noise suppression ability, and SFSNNs with a high ACC have higher noise suppression performance than SFSNNs with a low ACC. (2) The neural information processing of the SFSNN is a linkage effect of dynamic changes in neuron firing, synaptic weight, and topological characteristics. (3) Synaptic plasticity is the intrinsic factor behind the noise suppression ability of the SFSNN.
... From a study of RFID patents, Hung and Wang [33] concluded that patents with high betweenness centrality in the network play an important role in the transfer of knowledge. Nie et al. [57] concluded that, when testing the robustness of small-world networks, a node removal strategy based on betweenness centrality is the most damaging. Since our research aims to assess the knowledge flow resulting from knowledge spillovers, the best strategy would hence be to attack the nodes that control this flow. ...
Article
This article sets out to investigate the relationship between knowledge spillovers and the potential for high-value inventions to emerge. We study knowledge spillovers through the lens of knowledge networks: a web of interrelated knowledge elements with structural patterns. Knowledge spillovers influence the productivity of a technology sector, and hence their pattern should impact the pathways for knowledge appropriation in the sector. It is hypothesized that a robust knowledge network is associated with knowledge appropriation leading to valuable inventions. We test the hypothesis using a patent citation network of 1821 patents from 78 inventions belonging to two different technological sectors. Based upon the findings, we suggest network robustness as a new metric for determining the technical value of inventions.
... Real networks, which are organized in a complex topological structure, show a large heterogeneity in the capacity and intensity of the connections (the weight of the links) [17]. Recently, many features of weighted networks have been studied, for example, the relationship between node degree and node strength [18], degree correlations and perturbations [19], node correlations [20], and dynamical properties of nodes and degrees [21]. The study of highly interconnected systems has become an important area of multidisciplinary research in network science, involving physics, mathematics, biology, and the social sciences, and recently the interest has shifted towards weighted networks. ...
Article
Full-text available
We investigated the strength of the interactions of the elements of the Estonian network of payments (link weight of payments and volume of payments) through a series of experiments. Specific statistical measures of this network, which combine the topology of the relations with the strength of links and nodes and their specific weights, were studied with the aim of looking beyond the topological architecture of the network and revealing aspects of its complex structure. Moreover, scale-free properties between the strengths and the degree values were found. We also identified clear patterns of structural changes in the network over the analysed period.
... The betweenness centrality [14,16,17] of a node i is the number of shortest paths between all node pairs that pass through node i. It is a measure of the importance of a node in the process of network information delivery. ...
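For unweighted graphs this quantity can be computed exactly with Brandes' algorithm; the pure-Python sketch below (our own illustrative code, not from the cited works) runs one BFS per source and accumulates shortest-path dependencies in reverse BFS order.

```python
from collections import deque

def betweenness(adj):
    """Exact betweenness for an undirected, unweighted graph {node: [neighbors]}."""
    bc = {v: 0.0 for v in adj}
    for s in adj:
        # BFS from s: shortest-path counts sigma and predecessor lists
        dist = {s: 0}
        sigma = {v: 0 for v in adj}; sigma[s] = 1
        preds = {v: [] for v in adj}
        order, q = [], deque([s])
        while q:
            u = q.popleft(); order.append(u)
            for w in adj[u]:
                if w not in dist:
                    dist[w] = dist[u] + 1
                    q.append(w)
                if dist[w] == dist[u] + 1:
                    sigma[w] += sigma[u]
                    preds[w].append(u)
        # back-propagate dependencies in reverse BFS order
        delta = {v: 0.0 for v in adj}
        for w in reversed(order):
            for u in preds[w]:
                delta[u] += sigma[u] / sigma[w] * (1 + delta[w])
            if w != s:
                bc[w] += delta[w]
    return {v: b / 2 for v, b in bc.items()}  # each unordered pair counted twice

# Path graph 0-1-2-3-4: the middle node lies on the most shortest paths.
path = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
print(betweenness(path))
```

On this 5-node path, node 2 scores 4.0, matching the four node pairs, (0,3), (0,4), (1,3), and (1,4), whose shortest path passes through it.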
Article
Full-text available
In this paper, we present a new method for extracting interest points from RGB images using complex network analysis. First, the RGB images are expressed as a multi-level complex network model, in which the nodes are the maximum-degree pixels of subgraphs and the links encode the similarity and distance between the maximum-degree pixels. Then three different algorithms are proposed to locate interest points based on three topological features of the high-level complex network model: degree, closeness, and betweenness centrality. To verify the effectiveness of our approach, we applied the algorithms to four different test images. Notably, the targets identified by the degree centrality algorithm are similar to those of the Harris and SIFT algorithms, with higher accuracy. The results show that our algorithms can identify interest points in images effectively.
... Another concept related to network symmetry is network attack tolerance [67,96-102]. Attack tolerance is the network's resilience to random or intentional deletion of nodes or interference with the connections between nodes (deletion of graph edges). ...
Article
Full-text available
Symmetry and symmetry breaking concepts from physics and biology are applied to the problem of cancer. Three categories of symmetry breaking in cancer are examined: combinatorial, geometric, and functional. Within these categories, symmetry breaking is examined for relevant cancer features, including epithelial-mesenchymal transition (EMT); tumor heterogeneity; tensegrity; fractal geometric and information structure; functional interaction networks; and network stabilizability and attack tolerance. The new cancer symmetry concepts are relevant to homeostasis loss in cancer and to its origin, spread, treatment, and resistance. Symmetry and symmetry breaking could provide a new way of thinking and a pathway to a solution of the cancer problem.
... For example, although the networks considered here have power-law or exponential degree distributions with changing q, they are essentially random by nature and lack low-degree nodes acting as "bridges" connecting highly connected parts of the networks. The correlation between TA and TB has also been discussed and compared numerically in [31] and [58] in terms of giant component size and diameter. ...
Article
Full-text available
Network measures derived from empirical observations are often poor estimators of the true structure of a system, as it is impossible to observe all components and all interactions in many real-world complex systems. Here, we study the attack robustness of complex networks with missing data caused by: 1) uniform random sampling and 2) nonuniform random sampling. By introducing the subgraph robustness problem, we develop an analytical framework to investigate robustness properties of the two types of subgraphs under random attacks, localized attacks, and targeted attacks. Interestingly, we find that the benchmark models, such as Erdős-Rényi graphs, random regular networks, and scale-free networks, possess distinct characteristic subgraph robustness features. We show that network robustness depends on several factors, including network topology, attack mode, sampling method, and the amount of missing data, generalizing some well-known robustness principles of complex networks. Our results offer insight into the structural effect of missing data in networks and highlight the significance of understanding different sampling processes and their consequences for attack robustness, which may be instrumental in designing robust systems.
... 10,11 Complex networks are often vulnerable under disruptions, e.g., those caused by natural disasters or intentional human attacks. 12,13 Therefore, research on network robustness and resilience has gained significant attention. 14,15 Most researchers focus on network robustness, aiming to simulate network percolation processes, with the result that networks often prove rather vulnerable. ...
Article
Full-text available
Robustness of complex networks has been studied for decades, with a particular focus on network attack. Research on network repair, on the other hand, has begun only very recently, given its even higher complexity and the absence of an effective evaluation metric. A recently proposed network repair strategy is self-healing, which aims to repair networks toward larger components at low cost using only local information. In this paper, we discuss the effectiveness and efficiency of self-healing, which frames network repair as a multi-objective optimization problem and makes its optimality difficult to measure. This leads us to a new network repair evaluation metric. Since the time complexity of its computation is very high, we devise a greedy ranking strategy. Evaluations on both real-world and random networks show the effectiveness of our new metric and repair strategy. Our study contributes to optimal network repair algorithms and provides a gold standard for future studies on network repair.
... The second property analyzed is normalized betweenness centrality. The betweenness centrality [35] of a sensor is used to examine its potential for controlling connections with other sensors, evaluated as the summed ratio of shortest paths passing through that sensor. ...
Article
Full-text available
Secure localization under different forms of attack has become an essential task in wireless sensor networks. Despite significant research efforts in detecting malicious nodes, the problem of localization attack type recognition has not yet been well addressed. Motivated by this concern, we propose a novel exchange-based attack classification algorithm, achieved by a distributed expectation maximization extractor integrated with the PECPR-MKSVM classifier. First, mixed distribution features based on probabilistic modeling are extracted using a distributed expectation maximization algorithm. After feature extraction, by introducing theory from support vector machines, an extensive contractive Peaceman-Rachford splitting method is derived to build a distributed classifier that diffuses the iterative calculation among neighboring sensors. To verify the efficiency of the distributed recognition scheme, four groups of experiments were carried out under various conditions. The average success rate of the proposed classification algorithm for external attacks is excellent, reaching about 93.9% in some cases. These results demonstrate that the proposed algorithm produces a much greater recognition rate and is more robust and efficient, even in the presence of an excessive malicious scenario.
Chapter
The problem of cancer is examined from the metaphysical standpoint of essence and ground. An essentialist definition of cancer is assumed that would be valid in all possible worlds in which cancer could logically exist. The grounds of cancer are then examined and elucidated. Two grounding cancer properties are identified and discussed: symmetry-breaking and computational intelligence. Each examination leads to concrete conclusions for novel therapeutic approaches and a more fundamental understanding of what cancer is at bottom. Other possible cancer grounding properties related to evolution, adaptability and stochastic features are identified for future work. This approach is novel and offers new solutions to the problem of cancer. Keywords: Cancer; Intelligence; Symmetry-breaking; Computation; Ground
Chapter
Complex networks are robust to random failures, but not always to targeted attacks. The resilience of complex networks to different targeted node attacks has been studied extensively in the literature. Many node attack strategies have been proposed and their efficiency compared. However, each of these proposals uses a different measure of efficiency, so it is not easy to compare them and choose the one most suitable for the system under examination. Here, we review the main results from the literature on centrality-based node attack strategies, focusing only on work on undirected, unweighted networks. We want to highlight the necessity of a more realistic measure of attack efficiency.
Article
Brain-like intelligence aims to simulate the structure and function of the biological brain as closely as possible, based on the latest findings in brain science. The biological brain has strong robustness under external attacks. In this study, two complex spiking neural networks (CSNNs) are constructed: a spiking neural network (SNN) with small-world topology and an SNN with scale-free topology, in which the nodes are Izhikevich neuron models and the edges are synaptic plasticity models including excitatory and inhibitory synapses. For targeted attack, the anti-injury function of the two CSNNs is comparatively analyzed, and on this basis the anti-injury mechanism of CSNNs is explored. The experimental results show that: (1) The small-world SNN (SWSNN) outperforms the scale-free SNN (SFSNN) in anti-injury ability under targeted attacks on high-degree nodes, while the two are similar under targeted attacks on intermediate- and low-degree nodes; the robustness of the topology is consistent with the anti-injury function of the CSNNs, which indicates that the anti-injury ability of both kinds of CSNNs is affected by topology. (2) The dynamic evolutions of neuron firing, synaptic weight, and topological characteristics in the information processing of a CSNN have a linkage effect under targeted attack; the dynamic evolution of synaptic weight is significantly related to the anti-injury ability of CSNNs, suggesting that synaptic plasticity is the intrinsic factor of the anti-injury function of CSNNs.
Article
In this study, adversarial graph bandit theory is used to rapidly select the optimal attack node in underwater acoustic sensor networks (UASNs) with unknown topology. To ensure the flexibility and elusiveness of underwater attacks, we propose a bandit-based hybrid attack mode that combines active jamming and passive eavesdropping. We also present a virtual expert-guided online learning algorithm to select the optimal node without prior topology information or complex calculation. The virtual expert mechanism is proposed to guide the algorithm's learning: the expert establishes a virtual topology configuration, which largely addresses the blind exploration and energy consumption of attackers. Using the acoustic broadcast characteristic, we also put forward an expert self-updating method to follow the changes of real networks, enabling the algorithm to adapt well to dynamic environments. Simulation results verify the strong adaptability and robustness of the proposed algorithm.
Article
Since the expansion of the scale of urban rail construction, a networked structure has become a significant characteristic of rail transit systems. For a further understanding of this networked structure, the evolution mechanism of the rail transit network is worth research and discussion. To select the most appropriate modeling space, four topological spaces (L-Space, P-Space, B-Space, and R-Space) are analyzed based on three indicators (degree distribution, clustering coefficient, and average path length); P-space is selected as the basic space for topology because of its high clustering coefficient and low average path length. The topological network obtained in P-space shows exponential degree distribution and local-world characteristics. After dissecting the evolution law of the degree distribution and other parameters and the connection mechanism of rail transit topology, an improved local-world evolving model of urban rail transit networks (URTNs) is developed. The model is verified by its application to six cities' rail transit networks (London, New York, Paris, Beijing, Shanghai, and Shenzhen). The results show little difference between the real networks and the evolved networks, with high consistency of their degree distributions and network indicators. This illustrates that the model can reflect the real characteristics of URTNs and can be used to generate a new network with a structure similar to the real network.
Chapter
In recent years, with the rapid development of the Internet of Things, the scale of networks has gradually increased and link capacity has grown ever higher. In this context, research on the reliability of complex networks has received more and more attention. Network reliability analysis and evaluation need to be implemented by a corresponding network simulation and comprehensive reliability test and analysis system, presented to the user intuitively in a variety of visual forms, so as to be applied to real network environments. We simulate network attack modes in a complex network environment and obtain reliability evaluation results under different attack strategies. Establishing and analyzing complex network simulation attacks is of great significance in evaluating the reliability of complex IoT topologies. In this paper, a complex network evaluation model is established, and three kinds of strategies are designed to simulate attacks on complex networks. Based on Django and a smart home network, the method is implemented in an appropriate visual form. Finally, the method is tested and analyzed. The results show that it can well represent the process of complex network simulation attacks and the changes in network reliability indicators.
Article
We study the survivability of a scale-free network under weighted-edge attacks with incomplete information. We consider a situation where attackers can detect only some edges of the network, and the information regarding the detected edges may be imprecise. Random and intentional attacks are the two extreme cases of this investigation. In this article, edge betweenness is adopted to describe the weight of an edge. Based on this, α is employed as the parameter describing the scope of edges that can be detected from the network, and β as the parameter depicting the accuracy of the detected edge weight information. Attack strategies different from both random and targeted attacks are designed for a scale-free network with incomplete information. Numerical simulations are performed and the following results obtained: (i) a larger α or β worsens both network connectivity and efficiency under attack; (ii) when α is small, β has a relatively small impact on the network connectivity ς, but as α increases, both α and β play important roles in it; (iii) β consistently plays an important role in network efficiency, regardless of the value of α. The results of this article are helpful for the future development of effective protection strategies in scale-free networks, as it is more convenient and realistic to protect network information than to adjust the network's structural topology.
Article
In Command and Control (C2) networks, measures of invulnerability mainly focus on structural characteristics of the network, where the operational mission has not been adequately considered. As a result, it becomes difficult to assess the invulnerability of C2 networks in a dynamic manner. In this paper, the operational entities and heterogeneous relationships among combat entities are analyzed, and an operational C2 network model is constructed based on the combat theory of OODA and the super network. Subsequently, the mission link is defined, which can be used to characterize the combat network. Finally, a new measure of invulnerability for C2 networks is proposed based on the efficiency and entropy of the mission link. In particular, this measure represents both the efficiency of information transmission and the robustness of the network structure. The simulation results demonstrate that the proposed invulnerability measure is highly sensitive and accurate. More specifically, the proposed measure can more accurately reveal the invulnerability of C2 networks and provides a theoretical basis for designing and optimizing their structure.
Article
Full-text available
Despite its increasing role in communication, the world wide web remains the least controlled medium: any individual or institution can create websites with an unrestricted number of documents and links. While great efforts are made to map and characterize the Internet's infrastructure, little is known about the topology of the web. Here we take a first step to fill this gap: we use local connectivity measurements to construct a topological model of the world wide web, allowing us to explore and characterize its large-scale properties.
Article
Full-text available
We analyze the betweenness centrality (BC) of nodes in large complex networks. In general, the BC increases with connectivity as a power law with an exponent η. We find that for trees or networks with a small loop density η = 2, while a larger density of loops leads to η < 2. For scale-free networks characterized by an exponent γ, which describes the decay of the connectivity distribution, the BC is also distributed according to a power law with a non-universal exponent δ. We show that this exponent δ must satisfy the exact bound δ ≥ (γ + 1)/2. If the scale-free network is a tree, the equality δ = (γ + 1)/2 holds.
Article
Full-text available
The concept of network efficiency, recently proposed to characterize the properties of small-world networks, is here used to study the effects of errors and attacks on scale-free networks. Two different kinds of scale-free networks, i.e., networks with a power-law degree distribution P(k), are considered: (1) scale-free networks with no local clustering, produced by the Barabási–Albert model, and (2) scale-free networks with high clustering properties, as in the model by Klemm and Eguíluz; their properties are compared to those of random graphs (exponential graphs). Using the global and the local efficiency as mathematical measures, we investigate the effects of errors and attacks on both the global and the local properties of the network. We show that global efficiency is a better measure than the characteristic path length to describe the response of complex networks to external factors. We find that, at variance with random graphs, scale-free networks display, both on a global and on a local scale, a high degree of error tolerance and an extreme vulnerability to attacks. In fact, the global and the local efficiency are unaffected by the failure of some randomly chosen nodes, though they are extremely sensitive to the removal of the few nodes which play a crucial role in maintaining the network's connectivity.
Conference Paper
Full-text available
Betweenness is a centrality measure based on shortest paths, widely used in complex network analysis. It is computationally expensive to determine betweenness exactly; currently the fastest known algorithm, by Brandes, requires O(nm) time for unweighted graphs and O(nm + n² log n) time for weighted graphs, where n is the number of vertices and m is the number of edges in the network. These are also the worst-case time bounds for computing the betweenness score of a single vertex. In this paper, we present a novel approximation algorithm for computing the betweenness centrality of a given vertex, for both weighted and unweighted graphs. Our approximation algorithm is based on an adaptive sampling technique that significantly reduces the number of single-source shortest path computations for vertices with high centrality. We conduct an extensive experimental study on real-world graph instances, and observe that our random sampling algorithm gives very good betweenness approximations for biological networks, road networks and web crawls.
Article
Full-text available
We study the statistical properties of a variety of diverse real-world networks. We present evidence of the occurrence of three classes of small-world networks: (a) scale-free networks, characterized by a vertex connectivity distribution that decays as a power law; (b) broad-scale networks, characterized by a connectivity distribution that has a power law regime followed by a sharp cutoff; and (c) single-scale networks, characterized by a connectivity distribution with a fast decaying tail. Moreover, we note for the classes of broad-scale and single-scale networks that there are constraints limiting the addition of new links. Our results suggest that the nature of such constraints may be the controlling factor for the emergence of different classes of networks.
Article
Full-text available
In a cell or microorganism, the processes that generate mass, energy, information transfer and cell-fate specification are seamlessly integrated through a complex network of cellular constituents and reactions. However, despite the key role of these networks in sustaining cellular functions, their large-scale structure is essentially unknown. Here we present a systematic comparative mathematical analysis of the metabolic networks of 43 organisms representing all three domains of life. We show that, despite significant variation in their individual constituents and pathways, these metabolic networks have the same topological scaling properties and show striking similarities to the inherent organization of complex non-biological systems. This may indicate that metabolic organization is not only identical for all living organisms, but also complies with the design principles of robust and error-tolerant scale-free networks, and may represent a common blueprint for the large-scale organization of interactions among all cellular constituents.
Article
Full-text available
A common property of many large networks, including the Internet, is that the connectivity of the various nodes follows a scale-free power-law distribution, P(k) = ck^(−α). We study the stability of such networks with respect to crashes, such as random removal of sites. Our approach, based on percolation theory, leads to a general condition for the critical fraction of nodes, p_c, that needs to be removed before the network disintegrates. We show analytically and numerically that for α ≤ 3 the transition never takes place, unless the network is finite. In the special case of the physical structure of the Internet (α ≈ 2.5), we find that it is impressively robust, with p_c > 0.99.
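The percolation condition behind this result can be evaluated directly from a degree sequence. A minimal sketch of the Molloy–Reed/Cohen criterion (function and variable names are ours):

```python
def critical_fraction(degrees):
    """Molloy-Reed/Cohen criterion: random node removal destroys the giant
    component once kappa = <k^2>/<k> falls to 2, giving a critical removal
    fraction p_c = 1 - 1/(kappa - 1) for the intact degree sequence."""
    n = len(degrees)
    k1 = sum(degrees) / n                   # <k>
    k2 = sum(d * d for d in degrees) / n    # <k^2>
    return 1 - 1 / (k2 / k1 - 1)

print(critical_fraction([3] * 100))             # 3-regular graph: p_c = 0.5
print(critical_fraction([1] * 90 + [50] * 10))  # heavy tail pushes p_c toward 1
```

A diverging second moment, as in scale-free networks with α ≤ 3, drives kappa and hence p_c toward 1, which is the paper's "transition never takes place" result.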
Article
Full-text available
We study the tolerance of random networks to intentional attack, whereby a fraction p of the most connected sites is removed. We focus on scale-free networks, having connectivity distribution P(k) ~ k^(−α), and use percolation theory to study analytically and numerically the critical fraction p_c needed for the disintegration of the network, as well as the size of the largest connected cluster. We find that even networks with α ≤ 3, known to be resilient to random removal of sites, are sensitive to intentional attack. We also argue that, near criticality, the average distance between sites in the spanning (largest) cluster scales with its mass, M, as √M, rather than as log_k M, as expected for random networks away from criticality.
Article
Full-text available
In this paper we present the first mathematical analysis of the protein interaction network found in the yeast S. cerevisiae. We show that (a) the identified protein network displays a characteristic scale-free topology that demonstrates striking similarity to the inherent organization of metabolic networks in particular, and to that of robust and error-tolerant networks in general; and (b) the likelihood that deletion of an individual gene product will prove lethal for the yeast cell clearly correlates with the number of interactions the protein has, meaning that highly connected proteins are more likely to prove essential than proteins with a low number of links to other proteins. These results suggest that a scale-free architecture is a generic property of cellular networks, attributable to universal self-organizing principles of robust and error-tolerant networks, and is likely to represent a generic topology for protein–protein interactions.
Article
Full-text available
We extend the standard scale-free network model to include a "triad formation step." We analyze the geometric properties of networks generated by this algorithm both analytically and by numerical calculations, and find that our model possesses the same characteristics as the standard scale-free networks such as the power-law degree distribution and the small average geodesic length, but with the high clustering at the same time. In our model, the clustering coefficient is also shown to be tunable simply by changing a control parameter---the average number of triad formation trials per time step.
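The growth rule this abstract describes can be sketched directly. The following is an illustrative pure-Python implementation in the spirit of the triad-formation model, not the authors' code; the parameter name `p_t` (the triad-trial probability) is ours.

```python
import random

def holme_kim(n, m, p_t, seed=0):
    """Scale-free growth with a triad-formation step: every new node makes
    m links; after each preferential-attachment link, the next one closes
    a triangle (links to a neighbour of the previous target) with
    probability p_t, which tunes the clustering coefficient."""
    rng = random.Random(seed)
    adj = {v: set() for v in range(m + 1)}
    targets = []                        # nodes repeated once per incident edge
    for v in range(m + 1):              # seed graph: a small (m+1)-clique
        for w in range(v + 1, m + 1):
            adj[v].add(w); adj[w].add(v)
            targets += [v, w]
    for v in range(m + 1, n):
        adj[v] = set()
        prev = None
        while len(adj[v]) < m:
            if prev is not None and rng.random() < p_t:
                cand = rng.choice(sorted(adj[prev]))  # triad-formation trial
            else:
                cand = rng.choice(targets)            # preferential attachment
            if cand != v and cand not in adj[v]:      # resample on a clash
                adj[v].add(cand); adj[cand].add(v)
                targets += [v, cand]
                prev = cand
    return adj

g = holme_kim(200, 3, 0.5, seed=1)
print(len(g), sum(len(s) for s in g.values()) // 2)  # 200 nodes, 6 + 3*196 = 594 edges
```

Setting `p_t = 0` recovers plain preferential attachment; raising it increases clustering while the degree distribution stays power-law, which is the model's point.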
Article
Full-text available
We study the response of complex networks subject to attacks on vertices and edges. Several existing complex network models as well as real-world networks of scientific collaborations and Internet traffic are numerically investigated, and the network performance is quantitatively measured by the average inverse geodesic length and the size of the largest connected subgraph. For each case of attacks on vertices and edges, four different attacking strategies are used: removals by the descending order of the degree and the betweenness centrality, calculated for either the initial network or the current network during the removal procedure. It is found that the removals by the recalculated degrees and betweenness centralities are often more harmful than the attack strategies based on the initial network, suggesting that the network structure changes as important vertices or edges are removed. Furthermore, the correlation between the betweenness centrality and the degree in complex networks is studied.
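The paper's most harmful strategy, removal by recalculated degree, is easy to reproduce in miniature. A minimal sketch (our own illustrative code, using largest-component size as the performance measure, as in the abstract):

```python
from collections import deque

def largest_component(adj):
    """Size of the largest connected component (BFS over components)."""
    seen, best = set(), 0
    for s in adj:
        if s in seen:
            continue
        comp, q = {s}, deque([s])
        while q:
            v = q.popleft()
            for w in adj[v]:
                if w not in comp:
                    comp.add(w)
                    q.append(w)
        seen |= comp
        best = max(best, len(comp))
    return best

def recalculated_degree_attack(adj):
    """Repeatedly remove the currently highest-degree node (degree is
    recomputed after every removal) and record the largest-component
    size after each step."""
    adj = {v: set(ws) for v, ws in adj.items()}
    sizes = []
    while adj:
        target = max(adj, key=lambda v: (len(adj[v]), -v))  # ties: lowest id
        for w in adj.pop(target):
            adj[w].discard(target)
        sizes.append(largest_component(adj) if adj else 0)
    return sizes

star = {0: [1, 2, 3, 4], 1: [0], 2: [0], 3: [0], 4: [0]}
print(recalculated_degree_attack(star))  # [1, 1, 1, 1, 0]: the hub falls first
```

The initial-network variants in the paper differ only in computing the removal order once, up front; swapping the degree key for a betweenness score gives the centrality-based strategies.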
Article
Full-text available
Many complex systems, such as communication networks, display a surprising degree of robustness: while key components regularly malfunction, local failures rarely lead to the loss of the global information-carrying ability of the network. The stability of these complex systems is often attributed to the redundant wiring of the functional web defined by the systems' components. In this paper we demonstrate that error tolerance is not shared by all redundant systems, but it is displayed only by a class of inhomogeneously wired networks, called scale-free networks. We find that scale-free networks, describing a number of systems, such as the World Wide Web, Internet, social networks or a cell, display an unexpected degree of robustness, the ability of their nodes to communicate being unaffected by even unrealistically high failure rates. However, error tolerance comes at a high price: these networks are extremely vulnerable to attacks, i.e. to the selection and removal of a few nodes that play the most important role in assuring the network's connectivity.
Conference Paper
Betweenness centrality is an important centrality measure widely used in social network analysis, route planning etc. However, even for mid-size networks, it is practically intractable to compute exact betweenness scores. In this paper, we propose a generic randomized framework for unbiased approximation of betweenness centrality. The proposed framework can be adapted with different sampling techniques and give diverse methods. We discuss the conditions a promising sampling technique should satisfy to minimize the approximation error and present a sampling method partially satisfying the conditions. We perform extensive experiments and show the high efficiency and accuracy of the proposed method.
Article
Coupled biological and chemical systems, neural networks, social interacting species, the Internet and the World Wide Web, are only a few examples of systems composed of a large number of highly interconnected dynamical units. The first approach to capture the global properties of such systems is to model them as graphs whose nodes represent the dynamical units, and whose links stand for the interactions between them. On the one hand, scientists have to cope with structural issues, such as characterizing the topology of a complex wiring architecture, revealing the unifying principles that are at the basis of real networks, and developing models to mimic the growth of a network and reproduce its structural properties. On the other hand, many relevant questions arise when studying complex networks' dynamics, such as learning how a large ensemble of dynamical systems that interact through a complex wiring topology can behave collectively. We review the major concepts and results recently achieved in the study of the structure and dynamics of complex networks, and summarize the relevant applications of these ideas in many different disciplines, ranging from nonlinear science to biology, from statistical mechanics to medicine and engineering.
Article
Betweenness is a measure of the centrality of a node in a network, and is normally calculated as the fraction of shortest paths between node pairs that pass through the node of interest. Betweenness is, in some sense, a measure of the influence a node has over the spread of information through the network. By counting only shortest paths, however, the conventional definition implicitly assumes that information spreads only along those shortest paths. Here, we propose a betweenness measure that relaxes this assumption, including contributions from essentially all paths between nodes, not just the shortest, although it still gives more weight to short paths. The measure is based on random walks, counting how often a node is traversed by a random walk between two other nodes. We show how our measure can be calculated using matrix methods, and give some examples of its application to particular networks.
Article
Networks of coupled dynamical systems have been used to model biological oscillators, Josephson junction arrays, excitable media, neural networks, spatial games, genetic control networks and many other self-organizing systems. Ordinarily, the connection topology is assumed to be either completely regular or completely random. But many biological, technological and social networks lie somewhere between these two extremes. Here we explore simple models of networks that can be tuned through this middle ground: regular networks 'rewired' to introduce increasing amounts of disorder. We find that these systems can be highly clustered, like regular lattices, yet have small characteristic path lengths, like random graphs. We call them 'small-world' networks, by analogy with the small-world phenomenon (popularly known as 'six degrees of separation'). The neural network of the worm Caenorhabditis elegans, the power grid of the western United States, and the collaboration graph of film actors are shown to be small-world networks. Models of dynamical systems with small-world coupling display enhanced signal-propagation speed, computational power, and synchronizability. In particular, infectious diseases spread more easily in small-world networks than in regular lattices.
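The rewiring construction and the two quantities it is judged by, clustering and characteristic path length, fit in a short self-contained sketch (an illustrative re-implementation, not the authors' code):

```python
import random
from collections import deque

def watts_strogatz(n, k, p, seed=0):
    """Ring lattice on n nodes (each tied to its k nearest neighbours),
    then each lattice edge is rewired to a random new endpoint with
    probability p -- the Watts-Strogatz construction."""
    rng = random.Random(seed)
    adj = {v: set() for v in range(n)}
    for v in range(n):
        for j in range(1, k // 2 + 1):
            adj[v].add((v + j) % n)
            adj[(v + j) % n].add(v)
    for v in range(n):
        for j in range(1, k // 2 + 1):
            w = (v + j) % n
            if w in adj[v] and rng.random() < p:
                new = rng.randrange(n)
                while new == v or new in adj[v]:
                    new = rng.randrange(n)
                adj[v].discard(w); adj[w].discard(v)
                adj[v].add(new); adj[new].add(v)
    return adj

def clustering(adj):
    """Average local clustering coefficient."""
    total = 0.0
    for v, nbrs in adj.items():
        nbrs = list(nbrs)
        d = len(nbrs)
        if d < 2:
            continue
        links = sum(1 for i in range(d) for j in range(i + 1, d)
                    if nbrs[j] in adj[nbrs[i]])
        total += 2.0 * links / (d * (d - 1))
    return total / len(adj)

def avg_path_length(adj):
    """Mean shortest-path length over reachable pairs (BFS per source)."""
    total = pairs = 0
    for s in adj:
        dist = {s: 0}
        q = deque([s])
        while q:
            v = q.popleft()
            for w in adj[v]:
                if w not in dist:
                    dist[w] = dist[v] + 1
                    q.append(w)
        total += sum(dist.values())
        pairs += len(dist) - 1
    return total / pairs

lattice = watts_strogatz(200, 4, 0.0)
rewired = watts_strogatz(200, 4, 0.1, seed=1)
print(clustering(lattice), avg_path_length(lattice))   # high C, long paths
print(clustering(rewired), avg_path_length(rewired))   # C stays high, paths shorten
```

Even a small rewiring probability collapses the path length while leaving the clustering nearly intact, which is exactly the small-world regime the paper identifies.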
Article
Recent work on the Internet, social networks, and the power grid has addressed the resilience of these networks to either random or targeted deletion of network nodes or links. Such deletions include, for example, the failure of Internet routers or power transmission lines. Percolation models on random graphs provide a simple representation of this process but have typically been limited to graphs with Poisson degree distribution at their vertices. Such graphs are quite unlike real-world networks, which often possess power-law or other highly skewed degree distributions. In this paper we study percolation on graphs with completely general degree distribution, giving exact solutions for a variety of cases, including site percolation, bond percolation, and models in which occupation probabilities depend on vertex degree. We discuss the application of our theory to the understanding of network resilience.
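For the simplest case covered by this generating-function theory, a Poisson degree distribution, the giant-component size can be computed by fixed-point iteration. A minimal sketch under that assumption (the function name is ours):

```python
import math

def giant_component_fraction(c, tol=1e-12):
    """Giant-component size for a graph with Poisson degree distribution
    of mean c: iterate u = G1(u) = exp(c(u - 1)) to a fixed point, then
    S = 1 - G0(u) = 1 - u, since G0 = G1 for the Poisson distribution."""
    u = 0.0
    while True:
        nxt = math.exp(c * (u - 1))
        if abs(nxt - u) < tol:
            return 1 - nxt
        u = nxt

print(giant_component_fraction(2.0))   # roughly 0.797 of nodes are in the giant component
print(giant_component_fraction(0.5))   # below c = 1 there is no giant component
```

For the skewed degree distributions the paper targets, G0 and G1 differ and are built from the empirical distribution, but the same self-consistency argument applies.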
Article
The study of networks pervades all of science, from neurobiology to statistical physics. The most basic issues are structural: how does one characterize the wiring diagram of a food web or the Internet or the metabolic network of the bacterium Escherichia coli? Are there any unifying principles underlying their topology? From the perspective of nonlinear dynamics, we would also like to understand how an enormous network of interacting dynamical systems-be they neurons, power stations or lasers-will behave collectively, given their individual dynamics and coupling architecture. Researchers are only now beginning to unravel the structure and dynamics of complex networks.
Article
Despite the apparent randomness of the Internet, we discover some surprisingly simple power-laws of the Internet topology. These power-laws hold for three snapshots of the Internet, between November 1997 and December 1998, despite a 45% growth of its size during that period. We show that our power-laws fit the real data very well resulting in correlation coefficients of 96% or higher. Our observations provide a novel perspective of the structure of the Internet. The power-laws describe concisely skewed distributions of graph properties such as the node outdegree. In addition, these power-laws can be used to estimate important parameters such as the average neighborhood size, and facilitate the design and the performance analysis of protocols. Furthermore, we can use them to generate and select realistic topologies for simulation purposes.
Article
We study the robustness of complex networks subject to edge removal. Several network models and removal strategies are simulated. Rather than the existence of the giant component, we use total connectedness as the criterion of breakdown. A simple traffic dynamics is introduced on the network topologies, and total connectedness is interpreted not only in the sense of topology but also in the sense of function. We define the topological robustness and the functional robustness, investigate their combined effect, and compare their relative importance to each other. The results of our study provide an alternative view of overall robustness and highlight efficient ways to improve the robustness of the network models.