# Martin G Everett

The University of Manchester · Department of Social Statistics

## About

- Publications: 106
- Reads: 75,692 (a 'read' is counted each time someone views a publication summary, such as the title, abstract, and list of authors, clicks on a figure, or views or downloads the full text)
- Citations: 18,716

## Publications


Burt (1992) proposed two principal measures of structural holes, effective size and constraint. However, the formulas describing the measures are somewhat opaque and have led to a certain amount of confusion. Borgatti (1997) showed that, for binary data, the effective size formula could be written very simply as degree (ego network size) minus aver...
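Borgatti's simplification is easy to state in code. The sketch below is illustrative (the dictionary-of-sets representation and toy data are invented, not from the paper): for binary data, effective size equals ego-network size minus the average number of ties each alter has to the other alters, ties to ego excluded.

```python
def effective_size(adj, ego):
    """Effective size for binary data (Borgatti's simplification):
    number of alters minus the average number of ties each alter
    has to other alters (ties to ego excluded)."""
    alters = adj[ego]
    n = len(alters)
    if n == 0:
        return 0.0
    # count each undirected tie among alters once
    t = sum(1 for a in alters for b in adj[a] if b in alters and a < b)
    return n - 2 * t / n

# toy ego network: ego tied to alters 1, 2, 3; alters 1 and 2 tied
adj = {
    "ego": {1, 2, 3},
    1: {"ego", 2},
    2: {"ego", 1},
    3: {"ego"},
}
print(effective_size(adj, "ego"))  # 3 - 2/3 ≈ 2.33
```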

Most networks examined so far involve connections between nodes all of the same type, known as one‐mode networks. This chapter examines partitioning and clustering in multimode network data. A number of techniques have been developed for dealing with non‐binary data or more precisely non‐network type data. The chapter first concentrates on two mode...

Two-mode social network data consisting of actors attending events are a common type of social network data. For these kinds of data it is also common to have additional information about the timing or sequence of the events. We call data of this type two-mode temporal data. We explore the idea that actors attending events gain information from the...

A variety of node-level centrality measures, including purely structural measures (such as degree and closeness centrality) and measures incorporating characteristics of actors (such as Blau's measure of heterogeneity) have been developed to measure a person's access to resources held by others. Each of these node-level measures can be placed o...
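For illustration, Blau's heterogeneity index mentioned here is one minus the sum of squared category proportions over an actor's contacts; a minimal sketch (example data invented):

```python
from collections import Counter

def blau_heterogeneity(attributes):
    """Blau's heterogeneity index: 1 - sum of squared category
    proportions. 0 means fully homogeneous; values approach 1
    as categories become many and evenly spread."""
    n = len(attributes)
    counts = Counter(attributes)
    return 1.0 - sum((c / n) ** 2 for c in counts.values())

# attributes of one actor's network neighbours (invented)
print(blau_heterogeneity(["a", "a", "b", "b"]))  # 1 - (0.25 + 0.25) = 0.5
print(blau_heterogeneity(["a", "a", "a"]))       # 0.0: homogeneous
```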

Research using techniques from social network analysis has expanded dramatically in recent years. The availability of network data and the recognition that social network techniques can provide an additional perspective have contributed to this expansion. Social network data are not always in a standard network form and, in many instances, consist...

Valente and Fujimoto (2010) proposed a measure of brokerage in networks based on Granovetter's classic work on the strength of weak ties. Their paper identified the need for finding node-based measures of brokerage that consider the entire network structure, not just a node's local environment. The measures they propose, aggregating the average cha...

A network has a core–periphery structure if it consists of two groups of actors, core actors who interact mainly with other core actors and peripheral actors who connect to the core but do not interact among themselves. We examine techniques designed to uncover core–periphery structures in networks. There are two distinct approaches, the first is t...

Background Research suggests that policy makers often use personal contacts to find information and advice. However, the main sources of information for public health policymakers are not known. This study aims to describe policy makers’ sources of information. Methods A questionnaire survey of public health policy makers across Greater Manchester...

Social network analysts have often collected data on negative relations such as dislike, avoidance, and conflict. Most often, the ties are analyzed in such a way that the fact that they are negative is of no consequence. For example, they have often been used in blockmodeling analyses where many different kinds of ties are used together and all tie...

Composers generally write music alone, and we commonly understand the great figures of classical music as singular geniuses. Even where composers’ social networks and friendships are of contextual interest, it is arguable that their association with other musicians arises because they choose to socialize with similar others. However, it is also pos...

Background Persistent health inequalities encourage researchers to identify new ways of understanding the policy process. Research shows that informal relationships are implicated in finding evidence and making decisions for public health policy. However, few studies use specialised methods such as network analysis to identify key relationships and...

There have been two distinct approaches to two-mode data. The first approach is to project the data to one-mode and then analyze the projected network using standard single-mode techniques, also called the conversion method. The second approach has been to extend methods and concepts to the two-mode case and analyze the network directly with the tw...
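The conversion method described here can be illustrated with a small sketch: projecting a binary actor-by-event incidence matrix B to a one-mode actor-by-actor co-attendance matrix (B·Bᵀ). The list-of-lists representation and toy data are invented for illustration.

```python
def project_rows(incidence):
    """One-mode projection of a two-mode (actor x event) binary matrix:
    entry (i, j) counts the events co-attended by actors i and j,
    i.e. the matrix product B * B-transpose."""
    n = len(incidence)
    m = len(incidence[0])
    proj = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            proj[i][j] = sum(incidence[i][k] * incidence[j][k] for k in range(m))
    return proj

# 3 actors x 2 events (invented)
B = [[1, 1],
     [1, 0],
     [0, 1]]
P = project_rows(B)
print(P)  # diagonal = events attended; off-diagonal = co-attendance counts
```

Note that the projection discards information: distinct two-mode patterns can map to the same one-mode matrix, which is one motivation for the second, direct approach.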

Background Persistent health inequalities encourage researchers to identify new ways of understanding the policy process. Informal relationships are implicated in finding evidence and making decisions for public health policy (PHP), but few studies use specialized methods to identify key actors in the policy process.
Methods We combined network an...

In a paper examining informal networks and organizational crisis, Krackhardt and Stern (1988) proposed a measure assessing the extent to which relations in a network were internal to a group as opposed to external. They called their measure the E–I index. The measure is now in wide use and is implemented in standard network packages such as UCINET...
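The E-I index itself is simple: (external ties − internal ties) / (external ties + internal ties). A minimal sketch, with invented toy data and an edge-list representation:

```python
def ei_index(edges, group):
    """Krackhardt-Stern E-I index: (external - internal) / (external + internal).
    Ranges from -1 (all ties internal to groups) to +1 (all ties external)."""
    internal = sum(1 for u, v in edges if group[u] == group[v])
    external = len(edges) - internal
    return (external - internal) / (external + internal)

# invented toy data: one internal tie (1-2), two external ties
edges = [(1, 2), (1, 3), (2, 4)]
group = {1: "A", 2: "A", 3: "B", 4: "B"}
print(ei_index(edges, group))  # (2 - 1) / 3 ≈ 0.33
```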

Krackhardt (1994) proposed four dimensions to describe and measure the amount of hierarchy in networks of informal organizations. We examine these conditions, suggest some relaxations and prove that they are both necessary and sufficient to guarantee an arborescence (or out-tree). In addition we suggest situations some of which are outside of infor...

Public health policy-making activities are currently split between local authority and NHS organisations. Despite an increasing body of research on evidence-based policy (EBP), few studies explore the process of policy-making. Little is known about how policies are made in a local context, or how (scientific) evidence is used. Previous research has...

Centrality measures are based upon the structural position an actor has within the network. Induced centrality measures, sometimes called vitality measures, take graph invariants as an overall measure and derive vertex-level measures by deleting individual nodes or edges and examining the overall change. By taking the sum of standard centrality measures as...

We prove a number of results on betweenness and closeness centrality and centralization. In particular, we prove the much-used normalization expression for closeness centrality first given by Freeman (1979).
Freeman, L. C. 1979. Centrality in social networks: conceptual clarification. Social Networks, 1: 215–239.

Following the foot and mouth disease epidemic in Great Britain (GB) in 2001, livestock movement bans were replaced with mandatory periods of standstill for livestock moving between premises. It was anticipated that these movement restrictions would limit each individual's contact networks, the extent of livestock movements and thus the spread of fu...

The concept of centrality is often invoked in social network analysis, and diverse indices have been proposed to measure it. This paper develops a unified framework for the measurement of centrality. All measures of centrality assess a node's involvement in the walk structure of a network. Measures vary along four key dimensions: type of nodal invo...

We give an overview of some strategies for mapping unstructured meshes onto processor grids. Sample results show that the mapping can make a considerable difference to the communication overhead in the parallel solution time, particularly as the number of processors increases.

In this paper, we look at the betweenness centrality of ego in an ego network. We discuss the issue of normalization and develop an efficient and simple algorithm for calculating the betweenness score. We then examine the relationship between the ego betweenness and the betweenness of the actor in the whole network. Whereas, we can show that there...
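A simple direct computation of ego betweenness from the ego network can be sketched as follows (an illustration of the idea, not the authors' algorithm: every pair of non-adjacent alters is joined only by length-2 paths inside the ego network, and ego receives credit inversely proportional to the number of such shortest paths). Data are invented.

```python
def ego_betweenness(adj, ego):
    """Betweenness of ego within its own ego network. For each pair of
    non-adjacent alters, ego lies on exactly one of their shortest
    (length-2) paths, so it earns 1/g where g counts those paths."""
    alters = sorted(adj[ego])
    members = set(alters) | {ego}
    score = 0.0
    for idx, i in enumerate(alters):
        for j in alters[idx + 1:]:
            if j in adj[i]:
                continue  # directly tied: ego lies on no geodesic
            # shortest paths = common neighbours inside the ego network
            g = len(adj[i] & adj[j] & members)
            score += 1.0 / g
    return score

# invented toy data: star around ego plus one alter-alter tie
adj = {
    "ego": {1, 2, 3},
    1: {"ego", 2},
    2: {"ego", 1},
    3: {"ego"},
}
print(ego_betweenness(adj, "ego"))  # pairs (1,3) and (2,3) each give 1 -> 2.0
```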

We consider the multilevel paradigm and its potential to aid the solution of combinatorial optimisation problems. The multilevel paradigm is a simple one, which involves recursive coarsening to create a hierarchy of approximations to the original problem. An initial solution is found and then iteratively refined at each level, coarsest to finest. A...

A parallel method for the dynamic partitioning of unstructured meshes is described. The method introduces a new iterative optimisation technique known as relative gain optimisation which both balances the workload and attempts to minimise the interprocessor communications overhead. Experiments on a series of adaptively refined meshes indicate that...

A parallel method for dynamic partitioning of unstructured meshes is described. The method employs a new unified iterative optimisation technique which both balances the workload and attempts to minimise the interprocessor communications overhead. Experiments on a series of adaptively refined meshes indicate that the algorithm provides partitions o...

A new method is described for optimising graph partitions which arise in mapping unstructured mesh calculations to parallel computers. The method employs a combination of iterative techniques to both evenly balance the workload and minimise the number and volume of interprocessor communications. It is designed to work efficiently in parallel as wel...

A parallel method for dynamic partitioning of unstructured meshes is described.

Preface [Special Issue containing a selection of papers presented at the International Symposium on Combinatorial Optimisation (CO2000) held at the University of Greenwich, London, from 12-14 July 2000].

Models and Methods in Social Network Analysis, first published in 2005, presents the most important developments in quantitative models and methods for analyzing social network data that have appeared during the 1990s. Intended as a complement to Wasserman and Faust's Social Network Analysis: Methods and Applications, it is a collection of articles...

We present a graph theoretic model of analysing food web structure called regular equivalence. Regular equivalence is a method for partitioning the species in a food web into "isotrophic classes" that play the same structural roles, even if they are not directly consuming the same prey or if they do not share the same predators. We contrast regular...

https://sites.google.com/site/ucinetsoftware/

Social network analysts have tried to capture the idea of social role explicitly by proposing a framework that precisely gives conditions under which grouped actors are playing equivalent roles. They term these methods positional analysis techniques. The most general definition is regular equivalence which captures the idea that equivalent actors a...

A common but informal notion in social network analysis and other fields is the concept of a core/periphery structure. The intuitive conception entails a dense, cohesive core and a sparse, unconnected periphery. This paper seeks to formalize the intuitive notion of a core/periphery structure and suggests algorithms for detecting this structure, alo...
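One way to make the intuition operational is to score how well a candidate core set matches the ideal image matrix. The sketch below uses a simple match fraction rather than the paper's correlation-based fit, treats core-periphery dyads as "don't care", and uses invented data.

```python
from itertools import combinations

def cp_fit(adj_matrix, core):
    """Fraction of dyads matching the ideal core/periphery image:
    core-core dyads should be ties, periphery-periphery dyads
    non-ties; core-periphery dyads are ignored ("don't care")."""
    n = len(adj_matrix)
    match = total = 0
    for i, j in combinations(range(n), 2):
        in_core = (i in core) + (j in core)
        if in_core == 1:
            continue  # mixed dyad: don't care
        total += 1
        ideal = 1 if in_core == 2 else 0
        if adj_matrix[i][j] == ideal:
            match += 1
    return match / total

# invented network: dense triangle 0-1-2 with pendant node 3
A = [[0, 1, 1, 0],
     [1, 0, 1, 0],
     [1, 1, 0, 1],
     [0, 0, 1, 0]]
print(cp_fit(A, {0, 1, 2}))  # ideal pattern matched exactly -> 1.0
```

A search over candidate core sets (exhaustive for small n, heuristic otherwise) would then pick the core maximising this fit.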

Network analysts have developed a number of techniques for identifying cohesive subgroups in networks. In general, however, no consideration is given to actors that do not belong to a given group. In this paper, we explore ways of identifying actors that are not members of a given cohesive subgroup, but who are sufficiently well tied to the group t...

Given a relation α (a binary sociogram) and an a priori equivalence relation π, both on the same set of individuals, it is interesting to look for the largest equivalence πo that is contained in π and is regular with respect to α. The equivalence relation πo is called the regular interior of π with respect to α. The computation of πo involves the lef...

This paper extends the standard network centrality measures of degree, closeness and betweenness to apply to groups and classes as well as individuals. The group centrality measures will enable researchers to answer such questions as ‘how central is the engineering department in the informal influence network of this company?’ or ‘among middle mana...
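Group degree centrality, for example, extends degree in the obvious way: the number (or, normalized, the proportion) of non-group actors tied to at least one group member. A minimal sketch with invented data:

```python
def group_degree(adj, group):
    """Normalized group degree centrality: proportion of actors outside
    the group with at least one tie to some member of the group."""
    members = set(group)
    outside = set(adj) - members
    reached = {v for v in outside if adj[v] & members}
    return len(reached) / len(outside)

# invented network: a small path 2-1-3-4-5
adj = {
    1: {2, 3},
    2: {1},
    3: {1, 4},
    4: {3, 5},
    5: {4},
}
print(group_degree(adj, {1, 3}))  # group reaches 2 and 4 but not 5 -> 2/3
```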

In this paper we discuss the partition optimisation problem and its bearing on the graph partitioning problem.

There are a large number of techniques that try to determine areas within a network in which individuals are more closely linked to each other than to outsiders. However, once these cohesive subgraphs have been identified, researchers are often left with a long list of overlapping subgroups and have no means of assessing the structure or importance of...

The use of unstructured mesh codes on parallel machines can be one of the most efficient ways to solve large computational-fluid-dynamics (CFD) and computational-mechanics (CM) problems. An important consideration, however, is the problem of distributing the mesh across the memory of the machine at run time so that the computational load is evenly...

A parallel method for the dynamic partitioning of unstructured meshes is described. The method introduces a new iterative optimization technique known as relative gain optimization which both balances the workload and attempts to minimize the interprocessor communications overhead. Experiments on a series of adaptively refined meshes indicate that...

Network analysis is distinguished from traditional social science by the dyadic nature of the standard data set. Whereas in traditional social science we study monadic attributes of individuals, in network analysis we study dyadic attributes of pairs of individuals. These dyadic attributes (e.g. social relations) may be represented in matrix form b...

A parallel method for dynamic partitioning of unstructured meshes is described. The method employs a new iterative optimisation technique which both balances the workload and attempts to minimise the interprocessor communications overhead. Experiments on a series of adaptively refined meshes indicate that the algorithm provides partitions of an e...

A coloration is an exact regular coloration if whenever two vertices are colored the same they have identically colored neighborhoods. For example, if one of the two vertices that are colored the same is connected to three yellow vertices, two white and one red, then the other vertex is as well. Exact regular colorations have been discussed informally...
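The defining condition can be checked mechanically: same-colored vertices must have identical multisets of neighbor colors. A small sketch (toy graph and colorings invented):

```python
from collections import Counter

def is_exact_regular(adj, colour):
    """Exact regular coloration check: any two vertices sharing a colour
    must have identical multisets of neighbour colours."""
    profile = {v: Counter(colour[u] for u in adj[v]) for v in adj}
    return all(profile[u] == profile[v]
               for u in adj for v in adj if colour[u] == colour[v])

# path 1-2-3: the endpoints share colour "e", the middle is "m"
adj = {1: {2}, 2: {1, 3}, 3: {2}}
print(is_exact_regular(adj, {1: "e", 2: "m", 3: "e"}))  # True
print(is_exact_regular(adj, {1: "x", 2: "x", 3: "x"}))  # False: degrees differ
```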

We give an overview of some strategies for mapping unstructured meshes onto processor grids. Sample results show that the mapping can make a considerable difference to the communication overhead in the parallel solution time, particularly as the number of processors increases. 1 Introduction: The use of unstructured mesh codes on parallel machines...

In this paper, we give a brief overview of the progress made on a mesh decomposition algorithm based on the recursive clustering method, together with its application to the parallelisation of an unstructured mesh CFD code.

We give an overview of some strategies for mapping unstructured meshes onto processor grids. Sample results show that the mapping can make a considerable difference to the communication overhead in the parallel solution time, particularly as the number of processors increases.

In this paper we discuss the mesh partitioning problem in the light of the coming generation of massively parallel machines and the resulting implications for such algorithms.

The requirement for a very accurate dependence analysis to underpin software tools to aid the generation of efficient parallel implementations of scalar code is argued. The current status of dependence analysis is shown to be inadequate for the generation of efficient parallel code, causing too many conservative assumptions to be made. This paper s...

In principle, unstructured mesh computational fluid dynamics (CFD) codes can be parallelized using a mesh decomposition approach similar to structured mesh codes. However, for unstructured codes the mesh structure is problem dependent and algorithms for automatically decomposing the mesh onto the processors are required. An algorithm based upon a r...

A new method is described for optimizing graph partitions that arise in mapping unstructured mesh calculations to parallel computers. The method employs a combination of iterative techniques to evenly balance the workload and minimize the number and volume of interprocessor communications. When combined with a fast direct-partitioning technique (...

A new method is described for solving the graph-partitioning problem which arises in mapping unstructured mesh calculations to parallel computers. The method, encapsulated in a software tool, JOSTLE, employs a combination of techniques including the Greedy algorithm to give an initial partition together with some powerful optimisation heuristics. A...

The theory of regular equivalence has advanced over the last 15 years on a number of different fronts. Notation and terminology have developed often making it difficult to obtain a coherent view of the area as a whole. This paper attempts to provide a framework in which to develop and explore the general mathematical theory of regular equivalence a...

A recent study (Hummon and Carley 1992) has indicated that one of the main research paths in the discipline of social networks is the study of roles and positions. Several key positional concepts have been elucidated, including regular colorings (White and Reitz 1983), automorphic colorings (Everett 1985), and structural colorings (Lorrain and Whit...

In this paper we present two algorithms for computing the extent of regular equivalence among pairs of nodes in a network. The first algorithm, REGE, is well known, but has not previously been described in the literature. The second algorithm, CATREGE, is new. Whereas REGE is applicable to quantitative data, CATREGE is used for categorical data. Fo...

The use of regular graph colouring as an equivalent simple definition for regular equivalence is extended from graphs to digraphs and networks. In addition new concepts of regular equivalence for edges and hypergraphs are presented using the new terminology.

In this paper the need for a numerical algorithm that can automatically select, not only the step size and order, but also the most suitable numerical Ordinary Differential Equation (ODE) integrator from a prescribed set, is demonstrated. Such an integrator based upon explicit and implicit Runge-Kutta methods is described along with all the require...

In a previous paper (Wade et al.) a type-insensitive Runge-Kutta based code, SARK, was developed for the integration of a system of ODEs. The algorithm automatically determined which numerical method to use for the current integration step from a suite of internal explicit and implicit Runge-Kutta methods. Results produced showed that the switching a...

The principal goal of studying experimental exchange networks is to understand the relationship between power and network position. In this paper we provide a formal definition of the appropriate notion of position, and explore some of the consequences of assuming that power is a function of position. It is shown that, in highly structured graphs,...

Blockmodels are used to collapse redundant elements in a system in order to clarify the patterns of relationships among the elements. Traditional blockmodels define redundancy in terms of structural equivalence. This choice serves many analytic purposes very well, but is inadequate for others. In particular, role systems would be better modeled by...

There are a number of tools which are necessary to enable computational modelling software to be used to greatest effect. This paper considers two very different problems which are however closely related. The first problem is automatic generation in three dimensions. A method which employs the idea of recursive subdivision of the original data spa...

Kosaka (1989) used group theory to reformulate the structure of IKI. Unfortunately the paper contains a misconception of how the group concept can be applied. In this note we correct that error.

The role colouring of a graph is an assignment of colours to the vertices which obeys the rule that two vertices are coloured the same only if their neighbourhoods have the same colour set. We investigate the set of role colourings for a graph, proving that it forms a lattice. We also show that this lattice can be trivial and this can only occur if...
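The rule is easy to verify for a given colouring: compare neighbourhood colour *sets* (ignoring multiplicities, unlike an exact regular coloration) across same-coloured vertices. A sketch with an invented path graph:

```python
def is_role_colouring(adj, colour):
    """Role-colouring check: vertices coloured the same must see the
    same set of colours in their neighbourhoods (multiplicities
    are ignored)."""
    seen = {v: frozenset(colour[u] for u in adj[v]) for v in adj}
    by_colour = {}
    for v in adj:
        by_colour.setdefault(colour[v], set()).add(seen[v])
    return all(len(s) == 1 for s in by_colour.values())

# invented path 1-2-3-4 coloured end/mid/mid/end
adj = {1: {2}, 2: {1, 3}, 3: {2, 4}, 4: {3}}
colour = {1: "e", 2: "m", 3: "m", 4: "e"}
print(is_role_colouring(adj, colour))  # True: ends see {m}, mids see {e, m}
print(is_role_colouring(adj, {1: "e", 2: "e", 3: "m", 4: "m"}))  # False
```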

Seidman (1983a) has suggested that the engineering concept of LS sets provides a good formalization of the intuitive network notion of a cohesive subset. Some desirable features that LS sets exhibit are that they are difficult to disconnect by removing edges, they are relatively dense within and isolated without, they have limited diameter, and ind...

We show that a juncture homomorphism does not imply that the associated semigroups of a network and its image are isomorphic.

In trying to assess the validity of formal definitions of positional analysis it is necessary to have certain standard test data. The mathematical literature contains a large collection of important graphs specifically constructed as counter-examples to conjectures. We show that one of these graphs is useful in understanding the workings of certain...

Structural equivalence (Lorrain and White, 1971) and automorphic equivalence (Everett, 1985) are generalized to define neighborhood‐ and ego‐centered equivalences. It is shown that local versions of these equivalences can then be formulated quite naturally. In addition to these natural localizations, a generalized procedure capable of localizing an...

Recent work by Borgatti and Everett (1989) has shown that the collection of regular equivalences described by White and Reitz (1983) forms a lattice. In this paper, we present a procedure called iterated roles for tracing systematic paths through the lattice. At the heart of iterated roles is the proof that the regular equivalence of a regular equi...

In this paper, we explore the structure of the set of all regular equivalences (White and Reitz 1983), proving that it forms a lattice, and suggest a general approach to computing certain elements of the lattice. The resulting algorithm represents a useful complement to the White and Reitz algorithm, which can only find the maximal regular equivale...

This paper expands the concept of “block”, as in “blockmodeling”, by relating it to the “blocks” of permutation groups. Crucial to this development is the idea of graph automorphism, which captures the essence of “regular equivalence” in a way that allows the flexibility of the group block concept. Blocks, unlike regular or structural equivalence c...

The orbits of a graph, digraph or network provide an effective definition for role equivalence since they are a natural generalization of the principle of substitutability of structural equivalence. Calculation of the orbits is a computationally difficult task, but in this paper we present a fast and efficient algorithm which finds the orbits of a l...

A set of points S of a graph is convex if any geodesic joining two points of S lies entirely within S. The convex hull of a set T of points is the smallest convex set that contains T. The hull number (h) of a graph is the cardinality of the smallest set of points whose convex hull is the entire graph. Characterisations are given for graphs with par...
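The definitions translate directly into a brute-force computation for small graphs: repeatedly close a seed set under geodesic intervals until nothing new is added. A sketch (the breadth-first-search approach and toy cycle are invented for illustration):

```python
from itertools import combinations

def geodesic_interval(adj, u, v):
    """All vertices lying on some shortest path between u and v:
    those w with dist(u, w) + dist(w, v) == dist(u, v)."""
    def dists(s):
        d, frontier = {s: 0}, [s]
        while frontier:  # breadth-first search from s
            nxt = []
            for x in frontier:
                for y in adj[x]:
                    if y not in d:
                        d[y] = d[x] + 1
                        nxt.append(y)
            frontier = nxt
        return d
    du, dv = dists(u), dists(v)
    inf = float("inf")
    return {w for w in adj if du.get(w, inf) + dv.get(w, inf) == du[v]}

def convex_hull(adj, seed):
    """Smallest convex set containing seed: close the set under
    geodesic intervals until a fixed point is reached."""
    hull = set(seed)
    while True:
        extra = set()
        for u, v in combinations(hull, 2):
            extra |= geodesic_interval(adj, u, v)
        if extra <= hull:
            return hull
        hull |= extra

# 4-cycle 1-2-3-4-1: the hull of opposite corners {1, 3} is the whole cycle
adj = {1: {2, 4}, 2: {1, 3}, 3: {2, 4}, 4: {1, 3}}
print(sorted(convex_hull(adj, {1, 3})))  # [1, 2, 3, 4]
```

The hull number could then be found by testing seed sets of increasing size until one whose hull covers the whole graph is found; this is only feasible for small graphs.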

A new concept of role similarity is presented; this definition is a generalisation of structural equivalence but is stricter than the idea of regular equivalence of White and Reitz (1983). This new definition forms a standard part of graph theory literature and as such has been well researched. Consequently a complexity measure first proposed by Mo...

Everett (1982a) has proposed a graph theoretic technique for analysing social networks. It is not usually possible to apply this method to data which contain a large number of interlocking cycles. We demonstrate a technique which overcomes this problem by referring to the data analysed by MacRae (1960).

In this paper we show how the algorithm EBLOC can be extended to deal with valued data directly. The technique is then applied to Sampson's (1969) monastery data and comparisons are made with the original EBLOC algorithm.

Everett (1982) has proposed a graph theoretic blocking procedure for social networks. In this paper we give a comprehensive description of the implementation of this procedure with special reference to a FORTRAN program contained in the Appendix.

In this paper we examine some of the mathematical properties of Everett's graph theoretic blocking procedures. In particular we discuss how k‐plexes are blocked by Everett's algorithm. We also give a mathematical definition and algorithm for finding a class of actors identified as important in Everett's paper.

Everett and Nieminen have extended the Boyle homomorphism from undirected to directed graphs. Some sociological considerations implemented in conjunction with the homomorphism, indicate the structural importance of small cycles. These cycles are then used to construct an algorithm, which will produce a blocking of a social network. The algorithm is...

Boyle has given a condition for defining a homomorphism in terms of minimal paths for undirected graphs. The purpose of such homomorphisms is to provide a simpler graph which will reflect the structure of the more complex graph, and thereby enable the researcher to make observations which may have been shrouded by a preponderance of nodes and edges...