Complexity and organization–environment relations: Revisiting Ashby's law of requisite variety

Max Boisot and Bill McKelvey
It is a commonplace of organization theory
that organized systems must adapt to their
environment in order to survive (Lawrence
and Lorsch, 1967; Aldrich, 1979). The cyber-
netician W. Ross Ashby (1956) is perhaps
best known for his Law of Requisite Variety,
which framed the internal order generated by
a system as its response to impinging envi-
ronmental forces (Ashby, 1962). In this
chapter, we recast Ashby’s law as the Law of
Requisite Complexity (McKelvey and Boisot,
2009). The latter holds that, to be effica-
ciously adaptive, the internal complexity of a
system must match the external complexity it
confronts. Current thinking holds that organizations can
invest in adaptation in two ways: (1) simplify
the complexity of incoming stimuli so as to
economize on the resources that need to be
expended in responding; (2) invest more
resources in the response than they judge to
be strictly necessary so as to ensure some
degree of adaptation. The risks associated
with the first approach are those of oversim-
plification – i.e. unfamiliar stimuli merely
get assimilated to familiar ones and hence get
mis-classified. The risks associated with the
second are that resources get depleted by
unnecessarily complex responses before
adaptation occurs. To explore the trade-offs a
system faces between stimulus simplification
and response complexification, we draw on
complexity theories to develop a conceptual
framework, the Ashby Space, that can help
researchers and practitioners to frame the
challenges of adaptation in resource-efficient
ways. We first briefly review key aspects of
general systems theories, early organization
theories, and complexity theories. We then
draw on Ashby’s Law to create the Ashby
Space and illustrate its use by applying it to
the 2007 liquidity crisis.
General systems theory
Our point of departure is a living cybernetic
system capable of responding to its environ-
ment in adaptive ways, defined as the class
of system behaviours that contribute to the
maintenance of system identity in the face
of external perturbations (Churchman and
Ackoff, 1950). Cybernetics was defined by
Wiener (1948) as the science of control and
communication in the animal and the machine.
All living and most mechanical systems are
sustained by the presence of positive and
negative feedback loops; the first amplify-
ing and the second dampening information-
bearing signals of relevance to them. The
study of negative feedback in general systems
theory (GST) showed how systems acted to
preserve themselves under changing external
conditions. The distinction between the sys-
tem’s interior and its exterior is essential to
the preservation of a system’s identity and
continued survival under conditions of envi-
ronmental change. Through the mechanism
of homeostasis (Ashby, 1956), a system is
able to maintain an ‘internal’ equilibrium in
the face of ‘external’ perturbations. Yet sys-
tems are also capable of generating change
autonomously by amplifying feedback instead
of merely adapting to external contingencies
by dampening it – an idea that took root in
GST with Maruyama’s (1963) classic paper
on deviation-amplifying positive feedback.
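The contrast between dampening and amplifying loops can be sketched numerically. The gain values below are illustrative assumptions, not parameters drawn from the GST literature:

```python
# Toy contrast between negative (dampening) and positive (amplifying)
# feedback: each step multiplies the current deviation from equilibrium
# by a fixed gain. The gains are illustrative assumptions.

def run_loop(gain: float, shock: float = 1.0, steps: int = 20) -> float:
    """Return the deviation left after `steps` rounds of feedback."""
    deviation = shock
    for _ in range(steps):
        deviation *= gain
    return deviation

dampened = run_loop(gain=0.5)    # negative feedback: deviation decays to ~0
amplified = run_loop(gain=1.5)   # positive feedback: deviation compounds

print(f"dampened={dampened:.2e}  amplified={amplified:.1f}")
```

With a gain below one the perturbation is absorbed (homeostasis); above one the same mechanism drives Maruyama's deviation amplification.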
Organizations in environments
The cybernetic systems discussed by Wiener
and others (Buckley, 1968) exhibited minimal
complexity. They were designed to respond
to a limited range of external contingencies,
and to do so primarily through negative feed-
back processes. Human organizations, by
contrast, are capable of dealing with a mas-
sive range of external contingencies, far
exceeding those that an individual human
being, let alone a simple cybernetic machine,
can handle. Yet for most of the twentieth
century human organizations were conceived
of as simple machines, tightly controllable
by their creators or owners (Taylor, 1911;
Fayol, 1916; Koontz and O’Donnell, 1964)
and hence predictable in their behaviour.
Etzioni (1961) analyses an organization’s
capacity to secure compliance in carrying out
complex tasks through the exercise of power
expressed via hierarchical authority relations,
suggesting that, in the human case, this capac-
ity is what distinguishes internal from external
organization – a distinction that was later taken
up by the markets and hierarchies framework
(Coase, 1937; Williamson, 1975). With inter-
nal organization, the exercise of power allows
multiple negative feedback loops to be brought
under some central control in order to achieve
stability and a unitary agency.
The passage from a mechanistic to a more
organic conception of human organization
(Burns and Stalker, 1961) had taken place by
the early 1960s, partly in response to the
discovery that human organization was nei-
ther as controllable nor as predictable as had
been assumed (Trist and Bamforth, 1951;
McGregor, 1960). The systemic processes
that demarcated a lower-entropy1 internal
organization from a higher-entropy external
environment were not all under managerial
control. Nevertheless, through evolution, and
in contrast to a purely mechanical system,
an organic system could learn to maintain
a distinction between internal and external
environment, preserving a boundary and
exercising some measure of control over
what crossed the boundary (Miller, 1978).
Homeostasis could thus be maintained inside
the boundary across a wider range of environ-
mental changes than in the case of a purely
mechanical system. An intelligent organic
system could then take this adaptive capacity
one step further by generating representations
of both its internal and external environment
(March and Simon, 1958). These could be
manipulated so as to allow it to anticipate
and respond to the future states of both.
The organic conception of organizations
emerged alongside the new GST being for-
mulated in biology, itself aspiring to the
status of ‘a general theory of organization’
(Bertalanffy, 1968: 34; Kast and Rosenzweig,
1973). A cybernetic system could now be
viewed as a special case of a general system,
one that was equilibrium-seeking. A subset
of these – autopoietic systems – exhibited the
property of self-organization (Maturana and
Varela, 1980), exploiting the dampening and
stabilizing effects of negative feedback
to achieve autopoietic closure. The interior
of any autopoietic system will always be
characterized by a lower level of entropy than
that of its environment. Indeed, for many biolo-
gists, this entropy differential actually defines
organization (Brooks and Wiley, 1988; Weber
et al., 1988).
A number of scholars then began to study
the way that the structures and behaviours of
organized human systems adapt to changes in
the environment (Woodward, 1958; Lawrence
and Lorsch, 1967; Thompson, 1967). An
environment experienced as complex pro-
vokes a matching process of differentiation
and integration in such structures and behav-
iours; one experienced as simple, less so. In
these contingent responses of an organized
system to the characteristics of its environ-
ment, we have, in effect, a first social science
application of Ashby’s Law of Requisite
Variety (1956): an adaptive system survives to
the extent that the variety it generates matches
that of the environment it finds itself in. In
what could then be seen as a further applica-
tion of Ashby’s law, Perrow (1972) framed
the issue of organizational complexity in
terms of the tasks that human organization
has to perform, characterizing task complex-
ity by its resistance to both routinization and
understanding. For Perrow, complexity had
both an objective and a subjective side, that is,
it can be inter-subjectively ascertained to be a
property of the environment itself (objective
complexity) or it can describe an individual’s
experience irrespective of the objective prop-
erties of the environment (s)he encounters
(subjective complexity).
Complex adaptive systems
The foregoing view assumes that organiza-
tions are objects in an environment that can
be treated as a residual category – i.e. it com-
prises everything that the organization is not.
Yet we, either as external observers or as
members of organizations, are the ones who
decide where to place boundaries around
‘the’ organization and hence who define what
we will treat as residual. We then see the envi-
ronment as having higher levels of entropy
because we ignore the degree to which it is
itself organized and capable of exerting force
on organizations. The emergence of far-from-
equilibrium thermodynamics (Prigogine,
1955; Prigogine and Stengers, 1984), and of
the complex adaptive systems (CAS) perspec-
tive in the 1990s, however, challenged this
stability-seeking, ‘object-oriented’ view of
organization.
The first phase in the development of the
CAS perspective can be traced back to the
physicist Erwin Schrödinger, who, in a small
book called What is Life? (1944), had sug-
gested that life self-organizes by sucking in
low entropy from its environment and spit-
ting out high entropy back into it. Prigogine
and his co-workers in Europe, building on
Bénard’s (1901) study of emergent structures
in fluids, then further postulated that new
order and, by implication, organization
emerged from a speeding up of such entropy
production (Swenson, 1989). Prigogine
labelled the resulting organized entities ‘dis-
sipative structures’. In a teapot, for example,
the ‘rolling boil’ familiar to chefs describes a
shift from conduction – homogeneous mole-
cules dissipating heat by vibrating faster in
place – to convection – molecules circulating
around the pot. The shift speeds up heat
transfer and in so doing more efficiently
reduces an imposed energy differential. This
phase transition, which occurs at the so-
called ‘1st critical value’ of imposed energy –
McKelvey (2001) calls this an adaptive
tension – defines an ‘edge of order’ (Haken,
1977; Mainzer, 1994/2007). Living ‘dissipa-
tive’ systems become increasingly efficient
and exploitative of their environment, indeed,
in some cases so much so that at the ‘edge of
order’, many lose their capacity to adapt and
die (Miller, 1990).
A second phase, more focused on living
systems, was initiated by Anderson (1972),
Gell-Mann (1988), Holland (1988, 1995) and
Arthur (1994) at the Santa Fe Institute in
New Mexico. These scholars explored the
behaviour of heterogeneous agents interact-
ing at the so-called ‘edge of chaos’, a state
that emerges at a ‘2nd critical value’ of adap-
tive tension. At this value, a second phase
transition occurs from the order that appeared
at the 1st critical value to chaos. Between
the ‘edges’ of order and chaos lies a region
of emergent complexity, or what Kauffman
(1993) terms the ‘melting’ zone. It is a zone
in which adaptive capability is at its maxi-
mum. Bak (1996) argued that to survive,
entities need to maintain themselves near the
edge of chaos, i.e. in the melting zone, in a
state in which the entity achieves and then maintains
an efficaciously adaptive state under changing
environmental (or even internal) conditions.
A process of self-organization is initiated when
heterogeneous agents in search of improved
fitness interconnect under conditions of exog-
enously or endogenously imposed adaptive
tension. New order is an emergent outcome
of this process.
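The two critical values have a standard one-line illustration in the logistic map, which passes from a fixed point (order) through periodic cycles to chaos as its control parameter rises. This is offered as an analogy, not as a model taken from the chapter; the parameter values are illustrative:

```python
# The logistic map x -> r*x*(1-x) shows phase transitions at critical
# parameter values: a fixed point (order), then periodic cycles, then
# chaos. A standard analogy for the 1st/2nd critical values.

def attractor(r: float, x0: float = 0.3, burn: int = 1000, keep: int = 64):
    """Iterate the map, discard the transient, return the distinct
    long-run values (rounded to 6 decimals)."""
    x = x0
    for _ in range(burn):
        x = r * x * (1 - x)
    seen = []
    for _ in range(keep):
        x = r * x * (1 - x)
        seen.append(round(x, 6))
    return sorted(set(seen))

print(len(attractor(2.8)))   # settles on one value: equilibrium-seeking order
print(len(attractor(3.2)))   # two values: a cycle past the first transition
print(len(attractor(3.9)))   # many distinct values: chaos past the second
```

Between the periodic and the fully chaotic parameter ranges lies the region that complexity writers liken to Kauffman's ‘melting’ zone.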
A third phase, driven by the new discipline
of econophysics, is now underway, focusing
on the outcomes of self-organization and
emergent new order. According to Thietart
and Forgues (this volume), Prietula (this
volume) and Tracy (this volume), emergent
phenomena appear in the nonlinear, intra-
and inter-level causal processes of multi-
level hierarchies. Nonlinearities are a source
of butterfly effects and scalability, extend-
ing across multiple hierarchical levels within
organisms and other organized entities.
Butterfly effects, which are tiny initiating
events (i.e. Holland’s ‘lever points’ (1995: 5))
that can produce extreme outcomes such as
hurricanes, stock market crashes, giant firms,
etc., can be expressed in power law form
(Zipf, 1949; Newman, 2005). Scalability
(Brock, 2000) and scale-free causes (West and
Deering, 1995; Andriani and McKelvey, 2009)
are best understood by considering a cauli-
flower. First cut off a ‘floret’ and then cut
a smaller floret from the first; keep cutting
successively smaller florets in this way. Each
will be smaller than the former, but each will
exhibit the same shape, structure, and gene-
sis. Scalability reproduces the same ‘fractal’
structure at different scales (Mandelbrot,
1982); which is to say that scale-free causes
generate the same dynamic, effect, or charac-
teristic at multiple levels of a system.
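Scale-free causation can be made concrete with the defining property of a power law: rescaling the variable multiplies the tail probability by a constant that is independent of the scale at which you look. The exponent below is an arbitrary assumption:

```python
import math

# Scale-free behaviour: for a power-law tail P(X > x) = x**(-alpha),
# zooming the scale by a factor c multiplies the tail by c**(-alpha)
# whatever x is -- the 'cauliflower' property. alpha is an assumed exponent.

alpha = 2.0
survival = lambda x: x ** -alpha          # Pareto-type tail

ratios = [survival(3 * x) / survival(x) for x in (1.0, 10.0, 100.0, 1000.0)]
print(ratios)    # the same ratio at every scale: 3**-2 = 1/9

# An exponential tail, by contrast, is scale-dependent:
exp_tail = lambda x: math.exp(-x)
exp_ratios = [exp_tail(3 * x) / exp_tail(x) for x in (1.0, 10.0)]
print(exp_ratios)
```

The constant ratio is the numerical analogue of the cauliflower: each ‘floret’ reproduces the whole, whereas the exponential tail thins out faster the further one zooms.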
In what follows we take organizing to be
an emergent far-from-equilibrium phenome-
non that neither entails nor precludes the
existence of ‘organizations’ as stable objects.
The latter occupy one end of a continuum
along which a range of organizational phe-
nomena can be located. Order-creation via
the amplification of positive feedback at one
level of an organization becomes as impor-
tant as equilibrium-seeking via the damping
effects of negative feedback at another. When
working in tandem, both contribute to the
‘organizing’ process; hence both can be
adaptive. We now explore this point further
by means of the Ashby Space.
Ashby’s law of requisite variety
Ross Ashby, one of the founders of GST, was
interested in the range or variety of situations
that an animal or a machine could respond
and adapt to. His Law of Requisite Variety
states that ‘only variety can destroy variety’
(Ashby, 1956: 207): a system survives to the
extent that the range of responses it is able to
marshal – as it attempts to adapt to imposing
tensions – successfully matches the range
of situations – threats and opportunities –
confronting it. In the case of a living system,
the response might be wholly behavioural
and often outside a system’s cognitive con-
trol – as in the case of a hormonal response
or a reflex. Alternatively the response might
be a blend of behaviour and cognition that is
contingent on the system classifying a stimu-
lus as foreshadowing, say, the presence of a
foe and requiring a fight-or-flight decision. It
will then respond to representations of its
environment that are constructed out of such
classification activity rather than to its envi-
ronment directly (Plotkin, 1993). Gell-Mann
(2002: 16–17; see also Maguire, this volume)
sees representations as effectively complex
‘schemas’ – structured descriptions of an
objective external world which incorporate
neither too few nor too many degrees of free-
dom. What advantage do schemas confer?
If it is not to waste its energy responding
to every will-o’-the-wisp, a system must
build schemas in ways that distinguish
meaningful information (stimuli conveying
‘important’ real-world regularities) from
noise (meaningless stimuli). In other words,
it must distinguish between what Gell-Mann
has labelled ‘effective’ and ‘crude’ complexity
(Gell-Mann, 1994). Note that what consti-
tutes information or noise for a system is
partly a function of the system’s own expec-
tations and judgments about what is impor-
tant (Gell-Mann, 2002) – as well as of its
motivations – and hence, of its models of the
world and its intents (Dennett, 1987). Valid and
timely representations (schemas) economize
on an organism’s scarce energy resources (Ball,
2004; Vermeij, 2004). This can even be seen
in how we use language. Zipf (1949) showed
how the frequency of word use inversely cor-
relates with word length. The resulting power
law distribution established a Principle of
Least Effort as defined in Table 16.1.
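The least-effort logic can be read as an optimal-coding result: any efficient code assigns shorter codes to more frequent symbols, which is what a minimal Huffman construction shows. The word-frequency table below is a toy assumption, not data from the chapter:

```python
import heapq

# Huffman coding as a 'principle of least effort' in miniature: more
# frequent symbols receive shorter codes. The frequency table is a toy
# assumption.

def huffman_code_lengths(freqs: dict) -> dict:
    """Return {symbol: code length in bits} for a Huffman code."""
    heap = [(f, i, {s: 0}) for i, (s, f) in enumerate(sorted(freqs.items()))]
    heapq.heapify(heap)
    counter = len(heap)                      # tie-breaker for heap ordering
    while len(heap) > 1:
        f1, _, d1 = heapq.heappop(heap)      # two least frequent subtrees
        f2, _, d2 = heapq.heappop(heap)
        merged = {s: depth + 1 for s, depth in {**d1, **d2}.items()}
        heapq.heappush(heap, (f1 + f2, counter, merged))
        counter += 1
    return heap[0][2]

freqs = {'the': 500, 'of': 300, 'system': 60, 'homeostasis': 5}
lengths = huffman_code_lengths(freqs)
print(lengths)   # frequent words get short codes, rare words long ones
```

The monotone relation between frequency and code length mirrors Zipf's observation that heavily used words are short.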
The Ashby Space
We illustrate the functioning of Ashby’s law
with a simple diagram we call the Ashby
Space (Figure 16.1). On the vertical axis we
place the real-world stimuli that impinge on
an organism. These range in variety from low
to high. A low-variety stimulus might be an
image of the moon; a high-variety stimulus
might be the trajectory of an insect in a
swarm. On the horizontal axis, we place the
variety of a system’s responses to the stimuli.
These also range from low to high. A low-
variety response to the moon-as-stimulus
would simply be to stare at it, meditate, and
otherwise do nothing. Here, it is the absence
of a response that is adaptive. A high-variety
response to the insect swarm, by contrast,
might be to chase after each individual insect
flying past. This could prove exhausting and
time consuming. The first type of response
Figure 16.1 The Ashby Space (variety of stimuli vs variety of responses, with the requisite variety diagonal)
saves on scarce resources of energy and time;
the second wastes them. The diagonal in the
diagram indicates the set of points at which
variety can be considered ‘requisite’, that is,
where the variety of a system’s response
matches that of incoming stimuli in an adap-
tive way – it facilitates survival whether or not
it does so with an efficient use of resources.
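The matching requirement on the diagonal has a simple quantitative reading from Ashby: a regulator commanding R distinct responses against D distinct disturbances cannot reduce outcome variety below ceil(D/R). A minimal sketch, with assumed counts:

```python
import math

# Ashby's law in miniature: against D distinct disturbances, a regulator
# with R distinct responses cannot confine outcomes to fewer than
# ceil(D / R) states. The D and R values below are illustrative.

def min_outcome_variety(d: int, r: int) -> int:
    """Best achievable outcome variety for d disturbances, r responses."""
    return math.ceil(d / r)

for d, r in [(12, 12), (12, 4), (12, 1)]:
    print(f"D={d:2d}, R={r:2d} -> residual outcome variety >= "
          f"{min_outcome_variety(d, r)}")
```

Only when response variety matches disturbance variety (R = D) can outcomes be held to a single state, which is the point the diagonal expresses.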
Ashby stressed the need to reduce the flow
of some forms of variety from the external
environment to certain essential processes in
a living system. This was the role of regula-
tion, and, as Ashby pointed out, the amount
of regulation that can be achieved is bounded
by the amount of information that can be
transmitted and processed by a system
(Ashby, 1956). The variety that the system
then has to respond to depends in part on its
internal schema development and transmis-
sion capacities and in part on the operation of
tuneable filters, controlled by the system’s
cognitive apparatus, and used by the system
to separate out regularities from noise (Clark,
1997) – i.e. Gell-Mann’s effective complexity
from its crude complexity. The more intelli-
gent a system, the higher will be the cognitive
component in its response relative to the purely
behavioural one. Birds mostly act according
to genetically derived behavioural instincts;
monkeys produce both behavioural and
cognitive responses; humans exhibit higher-
level cognitive skills. There is, thus, a trade-
off between the behavioural and the cognitive
resources that a living system has to marshal
to be adaptive.
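The ‘tuneable filter’ idea can be sketched as a simple moving average that passes a slow regularity and suppresses fast noise, with the window width playing the role of the tuning. The signal, noise level, and seed below are illustrative assumptions:

```python
import math
import random

# A moving average as a minimal 'tuneable filter': it extracts a slow
# regularity from noisy stimuli. Window width is the tuning knob.
# Signal shape, noise level, and seed are illustrative assumptions.

random.seed(42)
n = 400
signal = [math.sin(2 * math.pi * t / 100) for t in range(n)]   # regularity
stimulus = [s + random.gauss(0, 0.5) for s in signal]          # plus noise

def moving_average(xs, window):
    half = window // 2
    out = []
    for i in range(len(xs)):
        chunk = xs[max(0, i - half):i + half + 1]
        out.append(sum(chunk) / len(chunk))
    return out

filtered = moving_average(stimulus, window=15)

mse = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)
print(f"raw MSE vs regularity:      {mse(stimulus, signal):.3f}")
print(f"filtered MSE vs regularity: {mse(filtered, signal):.3f}")
```

Filtering recovers Gell-Mann's effective complexity (the sinusoidal regularity) from the crude complexity of the raw stimulus stream.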
The matching of stimulus and response
variety on the diagonal can only be considered
functionally adaptive, however, if it occurs
inside the region of the schematic diagram
labelled OAB in Figure 16.2, which describes
a response budget available to a living system
defined in terms of energetic, temporal and
spatial resources. The curve AB constitutes
the system’s adaptive frontier, i.e. the region
in which it reaches the limit of the budget it
can draw on for the purposes of adaptation.
To the right of this region, the mix of cogni-
tive and behavioural variety required to
respond to incoming stimuli is too high for
adaptive purposes, causing the system to
spend too much of its resource budget and,
thus, eventually leading to its physical disin-
tegration. Above this region, the resources
consumed by the data processing required to
register incoming stimuli, to interpret them,
and to formulate adaptive responses also exceed
the system’s resource budget, eventually
leading to errors and to adaptive failure – in
the language of decision theory, the sys-
tem’s rationality is ‘bounded’ (Simon, 1947).
Figure 16.2 The adaptive frontier (variety of stimuli vs variety of responses)
Cognitive and physical disintegration, how-
ever, are not mutually exclusive alternatives:
the first will sooner or later lead to the second
and vice versa. And even when it is operating
within its resource budget, at any point above
the diagonal of Figure 16.1, the system is
still under-adapting – cognitively or behav-
iourally – relative to what is actually required.
Likewise, at any point below the diagonal it
is using up its budget wastefully or ineffec-
tively relative to what is required (Thaler,
1992). The challenge for an adaptive system,
then, is to locate itself at some point on the
diagonal in Figure 16.1 while remaining within
the budget area OAB in Figure 16.2.
The shape of the resource budget, sche-
matically represented by the curve OAB,
varies with the intelligence of the system.
Figure 16.3 illustrates the point by compar-
ing the resource budget of a human being
with that of a hummingbird. Given its larger
brain size, a human being can readily apply
its resource budget to the data processing and
transmission tasks that convert high-variety
stimuli into low-variety ones, or vice versa.
It does this by interpreting the stimulus,
distinguishing which part of the variety
associated with it is information bearing
and which part is noise. In doing so, it can
use its resource budget to move either down
or up the vertical dimension of the Ashby
Space. Hummingbirds, by contrast, are better
off deploying their ‘flatter’ resource budgets
towards the right in Figure 16.3, i.e. towards
more energetic responses. But human beings
go further. As indicated in Figure 16.4, their
capacity for social collaboration and for cre-
ating technological artefacts extends their
resource budget along both the vertical and
the horizontal axis of the diagram, thus sig-
nificantly increasing the level of environ-
mental variety that they can adapt to. We no
longer just walk, we can fly at several times
the speed of sound. And the stimuli that we
process and respond to no longer originate in
our immediate environment; CNN collects
them from around the globe. The human case
thus calls for a more dynamic formulation of
Ashby’s law: The rate at which a human
system’s adaptation budget increases variety –
i.e. at which the adaptive frontier expands –
must at least match the rate at which
environmental variety increases.
What are the different response strategies
available to intelligent agents in the face of
variety? Consider an agent located at point Q
in Figure 16.5 corresponding to some prior
background activity shown as level X along
the horizontal axis. The agent now registers a
high-variety stimulus at point Y along the
Figure 16.3 The adaptive frontier of hummingbirds and humans
vertical axis. It could respond to the variety
associated with point Y directly in a ‘mind-
less’ behaviourist fashion either by waiting to
see what happens, or by generating responses
that move it horizontally to the right by trial
and error until it hits the diagonal at C – i.e.
one of the responses proves to be adaptive.
No cognitively-driven simplification of the
stimulus is involved here; its response – a
mixture of cognition and behaviour – is thus
costly in terms of resources consumed.
In adopting this headless chicken response,
however, the agent might well move outside
its budget area OAB in Figure 16.2 thus
depleting its resource budget. When the
sheer variety of the stimuli allows neither
prediction nor anticipation – the first speci-
fies with precision some future event whereas
the second can only orient to general classes
of events – the agent would then do better to
adopt the wait-and-see option and let nature
show its hand. Alternatively, if the agent
Figure 16.4 The socio-technical expansion of the adaptive frontier
Figure 16.5 An agent at point Q in the Ashby Space
believes that the variety of stimuli conceals
some structure, it could attempt to respond in
a purely cognitive fashion by moving verti-
cally down the diagram until it approaches
the horizontal axis at point R. In this case, the
agent treats all incoming stimuli either as
familiar regularities or as noise and thus not
in need of any new response. This is the strat-
egy of agents who have ‘seen it all before’
and – possibly overconfidently – feel no need
to actually do anything different. Call this a
routinizing response. But since any down-
ward movement calls for an interpretation and
classification of incoming stimuli, whether
this strategy is adaptive or not will depend
on how well the resulting schema matches
the real-world variety-reducing regularities
confronting the agent – i.e. how effectively
complex they are.
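The trade-off between the two strategies can be put in back-of-envelope form. Every cost parameter below is hypothetical, chosen only to make the comparison concrete:

```python
# A toy cost comparison of the two strategies in the Ashby Space.
# Trial and error pays a per-trial behavioural cost until one of V
# candidate responses happens to fit; routinizing pays a one-off
# cognitive classification cost plus a single cheap response, and a
# penalty when its schema misclassifies. All parameters are hypothetical.

def trial_and_error_cost(variety: int, cost_per_trial: float) -> float:
    expected_trials = (variety + 1) / 2      # uniform guessing over V options
    return expected_trials * cost_per_trial

def routinizing_cost(classify_cost: float, response_cost: float,
                     p_schema_valid: float) -> float:
    penalty = 100.0                          # cost of a misclassified stimulus
    return classify_cost + response_cost + (1 - p_schema_valid) * penalty

V = 50
print(f"headless chicken:          {trial_and_error_cost(V, 2.0):.1f}")
print(f"routinizing (good schema): {routinizing_cost(5.0, 2.0, 0.95):.1f}")
print(f"routinizing (poor schema): {routinizing_cost(5.0, 2.0, 0.30):.1f}")
```

Routinizing wins only when the schema is valid often enough, which is the text's point: the downward move is adaptive exactly to the extent that the schema matches real-world regularities.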
Intelligent adaptive agents are best off
locating on the diagonal in the Ashby Space,
somewhere between O and a point before
which the diagonal of Figure 16.1 would
intersect the budget line AB of Figure 16.2.
That is, an intelligent agent first needs to
interpret the stimuli impinging upon it. This
requires a cognitive move either up or down
the diagram’s vertical scale that extracts
information about relevant regularities from
noisy incoming stimuli. The agent then
needs to develop a relevant schema and
respond with some action to regularities so
extracted – a behavioural move horizontally
across the diagram towards the right that is
only adaptive if it stops when it meets the
diagonal and does so before exhausting its
budget. A cognitive move up the Ashby
Space effectively expands the range and
variety of stimuli that an agent will need to
process before responding; as a result, as
Gell-Mann would put it, its schemas will
become more complex. Such an upward
move delivers exploratory learning (Holland,
1975; March, 1991). A cognitive move down
the Ashby Space, by contrast, draws on prior
learning to reduce both the range and variety
of stimuli and simplify the schemas required –
it delivers exploitative learning (Holland,
1975; March, 1991). Clearly, the further down
towards O an intelligent agent can move
before having to turn right and respond with
a physical (behavioural) action, the more
easily it can secure a quiet life for itself by
achieving adaptation within its resource
budget. Conversely, the further up the vertical
scale towards A the rightward move occurs,
the more turbulent life becomes for the agent
and the more resources it has to expend in
order to adapt.
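March's exploration/exploitation contrast can be sketched as a two-armed bandit in which an epsilon parameter sets how often the agent moves ‘up’ to sample novelty rather than ‘down’ to reuse its best current schema. The payoffs, noise level, and seed are assumptions:

```python
import random

# Exploration vs exploitation as a two-armed bandit: with probability
# epsilon the agent explores (an 'upward', schema-expanding move);
# otherwise it exploits its best current schema. Payoffs, noise, and
# seed are illustrative assumptions.

random.seed(7)
true_payoff = {'familiar': 0.5, 'novel': 0.8}   # unknown to the agent

def run(epsilon: float, steps: int = 2000) -> float:
    estimates = {arm: 0.0 for arm in true_payoff}
    counts = {arm: 0 for arm in true_payoff}
    total = 0.0
    for _ in range(steps):
        if random.random() < epsilon:
            arm = random.choice(list(true_payoff))        # explore
        else:
            arm = max(estimates, key=estimates.get)       # exploit
        reward = random.gauss(true_payoff[arm], 0.1)
        counts[arm] += 1
        estimates[arm] += (reward - estimates[arm]) / counts[arm]
        total += reward
    return total / steps

pure = run(epsilon=0.0)    # locks onto the first schema that pays off
mixed = run(epsilon=0.1)   # occasional upward moves discover the better arm
print(f"pure exploitation: {pure:.3f}   mixed strategy: {mixed:.3f}")
```

Pure exploitation secures the quiet life near O but forgoes the higher payoff that some upward, exploratory movement would reveal.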
The trajectory of any living system (i.e.
agent) through the Ashby Space reflects its
‘intelligence’ – its capacity to discern mean-
ingful regularities, develop adaptive schemas,
and generate effectively complex responses.
Given the limited number of stimuli that a
hummingbird’s brain can ‘make sense’ of,
for example, any trade-off that the bird is
required to make between its energy and
data-processing resources favours drawing
predominantly on its energy resources. The
variety of stimuli that a human being can
respond to adaptively, by contrast, is much
greater so that the trade-off favours draw-
ing predominantly on its data-processing
resources. A living system’s trajectory through
the space thus also tells us something about
its physiology. Not only are there physiologi-
cal limits as to what may count as a stimulus,
and hence as data, for a given type of system –
a frog, for instance, can only detect and proc-
ess peripheral movement (Lettvin et al., 1959)
and a bat’s movements are guided by sound,
not sight – but there are also cognitive limits
on the system’s capacity to process the data
contained in the stimulus. It thus confronts
a problem of bounded rationality (Simon,
1986). Above the budget line the variety of
stimuli may be such that a system cannot even
register them. Yet, as indicated by Figure 16.4,
for many living systems and especially for
human beings, the budget area OAB is con-
stantly being expanded outward from the
origin by means of artefacts (Clark, 1997),
cultural transmission (Gregory, 1981; Boyd and
Richerson, 1985) and organized collective
action (Corning, 2003). These simultaneously
increase the variety of interpretive schemas
available to a system on the vertical axis and
that of the responses available to it on the
horizontal one – its effective complexity
and thus its adaptive capacity.
Complexity in the Ashby Space –
three ontological regimes
Computational theory teaches us that prob-
lems whose size grows much faster than
their inputs may require what effectively
amounts to an infinite amount of data
processing for their solution (Chaitin, 1974;
Sipser, 1997). This will happen when the
inputs – which here we take to be stimuli –
cannot be made sense of. From the computa-
tional perspective, an intelligent agent
grappling with such vast problems will then
experience input stimuli as being unfathom-
ably complex. No regularities or structure
can be extracted from them and no sense
can, therefore, be made of them. Even prob-
lems whose size only grows moderately
faster than their inputs will be experienced
as very complex by an intelligent agent. Only
problems whose size is in some linear rela-
tionship with their inputs will come across
as ordered. If we now take variety to be the
phenomenological manifestation of com-
plexity at work and further assume that
problem-input size correlates with stimulus
variety for an intelligent agent such as a
human being (Grünwald et al., 2005), we can
map the different input sizes of various threats
and opportunities to which an agent has to
adapt onto the vertical axis of Figure 16.1 to
give us three distinct ontological regimes:
the Chaotic, the Complex, and the Ordered.
We show these in Figure 16.6.
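The computational reading of the three regimes can be sketched by asking how solution effort multiplies when the input doubles. The growth functions below are illustrative stand-ins, not formulas from the chapter:

```python
# Illustrative growth profiles for the three regimes: effort linear in
# input size (Ordered), polynomial (Complex), exponential (Chaotic).
# The specific functions are stand-ins chosen for the sketch.

profiles = {
    'Ordered': lambda n: 5 * n,     # linear: tractable, predictable
    'Complex': lambda n: n ** 3,    # polynomial: hard but tractable
    'Chaotic': lambda n: 2 ** n,    # exponential: effectively intractable
}

def doubling_ratio(f, n: int = 20) -> float:
    """How much the effort multiplies when the input size doubles."""
    return f(2 * n) / f(n)

for regime, f in profiles.items():
    print(f"{regime:8s}: doubling the input multiplies effort by "
          f"{doubling_ratio(f):,.0f}")
```

A constant small ratio marks the ordered regime; a bounded polynomial ratio the complex regime; a ratio that itself explodes with input size marks the effectively unfathomable, chaotic regime.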
Stimuli appearing in the chaotic regime at
the top of the diagram are hard to extract
useful information from and may be judged
computationally intractable, not just because
of the size problem but because they are
also experienced as chaotic. Unless luck
intervenes, an intelligent agent drawing on
conventional representations and unaware
of chaos dynamics can typically make no
sense of such stimuli within an adaptive
time frame – i.e. before depleting its energy
budget. Here, phenomena cannot even be
anticipated, let alone predicted. As sug-
gested earlier, an intelligent agent must then
either wait for nature to show its hand in
order to respond or it must proceed by trial
and error. How it will experience the adaptive
tension that it confronts under either option
will be a function of the resources available
Figure 16.6 Ashby’s law in three regimes (the Chaotic, Complex and Ordered regimes ranged down the variety-of-stimuli axis)
to it since a lack of resource can itself be a
source of tension.
Stimuli appearing in the ordered regime at
the bottom of the diagram, by contrast, are
mostly linear in nature and are experienced
as relatively unproblematic by an intelligent
agent – the resulting linear regularities and
noise are the stuff of everyday experience and
in the human case, the products of ‘normal
science’ (Kuhn, 1962).
Mixing two regularities
In his discussion of the processes that under-
pin the three regimes, Gell-Mann (2002;
Maguire, this volume) distinguishes between
regularities produced by two fundamentally
different generative processes (Bhaskar, 1975):
Type 1. Reductionist Regularities: The causal
processes that are well captured through reduc-
tionist normal science, which are predictable and
easily represented by equations; the focus of
classical physics and neoclassical economics (Gell-
Mann, 2002: 19). These characterize the Ordered
Regime. They may be confidently schematized to
yield predictions that then become the basis of
prescriptive solutions.
Type 2. Scale-free Regularities: Outcomes result-
ing from an accumulation of random tiny initiating
events amplified by positive feedback effects that
generate unpredictable, seldom repeated nonlinear
– and possibly extreme – outcomes that have last-
ing effects; what Gell-Mann calls frozen accidents
(2002: 20). Scale-free regularities are at best prob-
lematic and beyond the reach of the explanatory
traditions of normal science.
Stimuli appearing in the complex regime
of Figure 16.6 are experienced as a blend of
Gell-Mann’s two types of regularities – a
partly law-like and partly unpredictable mix of
tiny initiating events (TIEs), frozen accidents,
and power-law phenomena bathed in noise.
Schema development in this regime is chal-
lenging to be sure, but computationally tracta-
ble once methods for separating out the two
kinds of regularities from noise are available.
The more phenomena intelligent agents
can classify unproblematically as ordered,
the more they can economize on scarce
data processing and energetic resources,
holding these in reserve for more challenging
phenomena – i.e. in responding, they will
attempt to minimize the distance that they have
to travel up and to the right in Figure 16.1.
Human beings have a historically validated
interest in steering phenomena downward in
the figure towards the ordered regime if they
possibly can, in order to economize on the
resources needed to respond – this is the
origin of their preference for simple mechan-
ical representations identified in the opening
section and, of course, of Gell-Mann’s reduc-
tionist regularities. But they can overdo it. If
too many of their ‘interpreted’ experiences
end up in the ordered regime – i.e. if they all 'make sense' and can be taken for granted – human beings lose their sense of the essentially contingent nature of things and either maladapt or fossilize. When human organizations overdo it, they encounter Miller's (1990)
Icarus Paradox, and unwittingly end up
placing themselves in situations that turn out
to be beyond their capacity to adapt to – e.g.
they become so good at being efficient they
lose their capacity to change.
Clearly, the first step in schema develop-
ment with respect to some impinging real-
world phenomenon is to identify the ontology
appropriate for dealing with it. We outline
three possibilities in Figure 16.7. If, for
example, an agent interprets a phenomenon as
being ordered, it will pursue the cognitively-
routinizing response. This puts the agent
on the least-cost trajectory of moving down
the Q-to-R path in Figure 16.5 so as to stay
within its budget area OAB – i.e. the data-
information-schema-development process
underlying the regularities is well understood.
If, by contrast, the agent views the phenom-
enon as chaotic, it will either do nothing and
wait or pursue the largely behavioural head-
less chicken response of moving from Q to C
in Figure 16.5 – i.e. it could quite possibly
move outside its budget area. On this trajec-
tory the agent, knowing nothing of scalabil-
ity, power laws, and scale-free theories,
cannot make sense of anything. Latent regu-
larities completely escape it, leading it to
respond mindlessly. It may then expend so
much undirected energy that it ends up disin-
tegrating outside its budget area.
If an agent takes the phenomenon to be
complex – i.e. neither so ordered that it can
mobilize a least-cost response, nor so chaotic
that it can mobilize no meaningful schema at
all – it is on a scalability trajectory, one defined by butterfly events, frozen accidents, and nonlinearities, as well as by many other attributes characterizing the Complex
Regime. Here an adaptive response is feasi-
ble but more expensive than in the Ordered
Regime since schema development combines
both law-like and scalable TIEs. However,
the agent can now more successfully move
up the diagonal and still remain within its
budget frontier.
Which ontology is adaptive for an agent
may depend on how it experiences the level of
adaptive tension that it confronts. Increasing
tension often increases the level and strength
of connectivity between hitherto unconnected
phenomena, thus transforming what would
ordinarily appear to be reductionist regu-
larities into scale-free ones. TIEs will then
propagate more rapidly and easily through a
system, getting amplified in the process to
produce magnified, nonlinear, and possibly
extreme, outcomes. To illustrate: imagine a
fishing net lying loosely crumpled up in a
pile. Cut the net between any two nodes and
the rest of the net will remain undisturbed
and the effects of the cut will remain strictly
local. Now place the net under tension by
stretching it taut. If the net is taut enough,
then a single cut could initiate a tear that
would instantaneously spread from one end
of the net to the other. A similar dynamic
underlies the power blackouts that occasion-
ally afflict the New England power grid when
the utilities, by temporarily shutting down
one overloaded station, trigger a cascade of
further shutdowns throughout the North East
US. Given tension plus connectivity, then,
what starts off as a TIE can rapidly propagate
throughout any network, growing in severity
as it does so, with an extreme outcome the
result. An adaptive strategy in the Complexity
Regime of the Ashby Space thus needs a
data-processing epistemology appropriate to
the ontology underpinning the scale-free
regularities that it is called upon to deal with.
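The fishing-net dynamic lends itself to a minimal simulation. The sketch below is our own illustration, not a model from the chapter: nodes on a ring each carry a load (the level of adaptive tension) and shed it onto their neighbours when they fail, so the same tiny initiating shock stays local in a slack system but cascades through a taut one.

```python
def cascade_size(n_nodes, base_load, shock=0.5, capacity=1.0):
    """Propagate an initial shock through a ring of loaded nodes.

    Every node carries `base_load`; a node whose load exceeds
    `capacity` fails and sheds its load equally onto its still-intact
    neighbours. Returns the number of failed nodes.
    """
    load = [base_load] * n_nodes
    failed = [False] * n_nodes
    load[0] += shock                       # the tiny initiating event (TIE)
    frontier = [0]
    while frontier:
        nxt = []
        for i in frontier:
            if failed[i] or load[i] <= capacity:
                continue                   # shock absorbed locally
            failed[i] = True
            for j in ((i - 1) % n_nodes, (i + 1) % n_nodes):
                if not failed[j]:
                    load[j] += load[i] / 2
                    nxt.append(j)
            load[i] = 0.0
        frontier = nxt
    return sum(failed)

print(cascade_size(100, base_load=0.3))  # slack net: 0 failures, shock absorbed
print(cascade_size(100, base_load=0.9))  # taut net: all 100 nodes fail
```

The only difference between the two runs is the background tension; the connectivity is identical, which is precisely the point of the fishing-net illustration.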
Anticipating scalability – the TIEs
that bind
The focus on negative feedback and equilib-
rium that has characterized the ‘object’ view
of organization and much economic think-
ing delivers predictability, control and the
maintenance of organizational identity – i.e.
survival – at a low cost. After all, equilibrium
spells stability and stability, in turn, maintains
Figure 16.7 Three responses in three regimes (the Routinizer in the ordered regime, the Strategist in the complex regime, and the Behaviourist in the chaotic regime, plotted against variety of stimuli and variety of response)
identity and facilitates prediction and con-
trol. Positive feedback, by contrast, favours
emergent self-organizing outcomes that
might be anticipated but cannot be predicted.
New order suddenly appears, often at the
expense of the old order – a complexity interpretation of Schumpeter's (1934) creative destruction – but no one can tell where or when it will happen. The adaptive challenge is to anticipate it and to recognize and reinforce or negate it – i.e. to manage it
when it appears. This, however, turns out to
be less a question of how to anticipate the
downstream processes of emergent self-
organization than of how to anticipate the
upstream scalability dynamics that drive these.
Recall that two key elements giving rise to
self-organization are adaptive tension and con-
nectivity. Positive feedback between ele-
ments connected under tension is one source
of scalability that may push some TIEs to scale
up – possibly to deliver extreme outcomes – but there are others. In Table 16.1 we list six
that Andriani and McKelvey (2009) suggest
readily apply to organizations. For example:
Hierarchical modularity: Drug and toy companies having products produced in the Chinese hinterland discovered that too much local (modular) autonomy – fostered by culture, language, distance, time zones, cheating on product standards, and attempts to cut production costs – coupled with the long-distance costs of exerting more hierarchical monitoring (i.e. rising connection costs), led to poisonous products. They paid a high price for modularity bordering on anarchy. Walmart, for its part, abandoned some large merger attempts in foreign countries because the connection costs of trying to get firms in foreign cultures to behave like US Walmart stores proved too expensive, even unworkable. Hence Simon's (1962) call for near decomposability – but not anarchy – and Gell-Mann's (1994) effective complexity: just the right number of connections.
Combination theory: It is like the ‘perfect storm’:
A container ship is loaded top-heavy; a severe
storm hits; the engine stalls for some unknown
reason; the ship can’t be steered ‘into the storm’;
consequently it capsizes. If any deviation occurs
by itself, nothing happens. But all three together
produce the extreme event.
Least effort: For Zipf and his analyses of language,
it was all about efficiency – I don’t want to use
words you don’t know; you don’t want to learn
words I am not going to use. Over the past dec-
ades even unabridged dictionaries have shrunk in
number of words – go to your library and check it
out! Dahui et al. (2005) show that Zipf’s Law of
least effort applies only to changing language;
Ishikawa (2006) and Podobnik et al. (2006) show
that it only applies to industries and economies
in transition as opposed to static ones. But
further analysis of Zanini’s (2008) industries
(Drayton, 2010) shows that the power-law line
of market capitalization is straightest in the most
mature industries, insurance and machinery;
see Figure 16.8. This appears opposite to what
Dahui et al., Ishikawa, and Podobnik et al., find.
It suggests that in free-market-based economies,
market capitalization (i.e. stock-market prices)
trends towards maximum ‘least-effort’ efficiency
as traders buy and sell on information based on
‘fundamentals’ (i.e. valid information about the
true value of the well understood mature firms);
this, then, leads to the improved power-law fit.
Preferential attachment: With the ‘hub and spoke’
airport design, the more flights arriving at an
airport, the higher the incentive for other flights
to depart from there; the more flights departing
from there, the more incentive for more flights to
land there – the air transport equivalent of ‘the
rich get richer’.
Spontaneous order creation: In Wikipedia, for
example, one person writes a controversial entry.
Others join in to expand, correct, add references,
etc. Controversy, instability, and constant revising
of what some other person writes emerge. The
Wiki ‘hierarchy’, which has also emerged over
the years, begins to exert a stronger ‘review’ role,
hoping for abduction to the best explanation and
stability as well.
Self-organized criticality: Unlike the firms frozen in states of efficiency producing obsolete products – described in Danny Miller’s Icarus Paradox book – effective firms have to keep
changing their product lines to keep up with
changing technologies and customer tastes.
Perhaps we see this most obviously in hamburger
stands around the world; they are pretty good at
adapting to changing local tastes and to what
competing hamburger chains are offering.
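Several of these mechanisms are simple enough to simulate directly. The sketch below is a bare-bones preferential-attachment growth model in the spirit of Barabási (2002); the network size and random seed are illustrative assumptions. It shows the 'rich get richer' dynamic: most nodes end up with a single link while a few hubs accumulate a disproportionate share.

```python
import random

def preferential_attachment(n_nodes, seed=42):
    """Grow a network in which each arriving node links to one existing
    node chosen with probability proportional to its current degree."""
    random.seed(seed)
    degree = [1, 1]        # two founding nodes joined by one link
    targets = [0, 1]       # node indices, repeated once per unit of degree
    for new in range(2, n_nodes):
        hub = random.choice(targets)    # degree-proportional sampling
        degree[hub] += 1
        degree.append(1)
        targets.extend([hub, new])
    return degree

deg = preferential_attachment(10_000)
print(max(deg))                    # a dominant hub, orders of magnitude above
print(sorted(deg)[len(deg) // 2])  # the median node, which has a single link
```

The degree-proportional sampling is implemented by keeping each node in the `targets` list once per link it holds, so a uniform draw from that list is automatically biased toward the hubs.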
Given the complex interactions involved,
one cannot predict scalable outcomes.
Nevertheless, an understanding of adaptive
tension and connectivity allows one to ration-
ally anticipate and adapt to the dynamics of
scalability. Spotting meaningful TIEs then
becomes easier since one knows what to look
for. The greater the familiarity of scholars and
practitioners with scalability dynamics, the
earlier they are likely to spot and respond
adaptively to meaningful TIEs. This will
allow them to competently engage with the
Figure 16.8 Zanini’s industry market capitalizations in power-law form (rank over market capitalization, top 30, 2006; left to right, the plots are of software, chemicals, machinery, biotech R&D, and insurance)
Table 16.1 A sample of scale-free theories of nature*
1 Hierarchical modularity: As number of employees, n, in a firm increases, connectivity could increase by up to
n(n–1)/2, producing an imbalance between the gains from more employees vs. the cost of maintaining connectivity;
consequently organizations form modular designs so as to reduce the cost of connectivity; Simon argued that adaptive
advantage goes to ‘nearly decomposable’ subsystems (Simon, 1962).
2 Combination theory: The interactive combination of multiple exponential or lognormal (or other skew) distributions or
increased complexity of components (subtasks, processes) results in a power law distribution (West and Deering, 1995;
Newman, 2005).
3 Least effort: Word frequency is a function of ease of usage by both speaker and listener; this gives rise to Zipf’s
(power) Law; the efficiency of least effort is now found to apply to changing language as well as firms and economies
in transition (Zipf, 1949; Dahui et al., 2005; Ishikawa, 2006; Podobnik et al., 2006).
4 Preferential attachment: Given newly arriving agents into a system, larger nodes with an enhanced propensity to
attract agents will become disproportionately even larger (Barabási, 2002).
5 Spontaneous order creation: Heterogeneous agents seeking out other agents to copy/learn from so as to improve
fitness generate networks; given positive feedback, some networks become groups, some groups become larger
groups and hierarchies (Holland, 1995; Kauffman, 1993).
6 Self-organized criticality: Under constant tension of some kind (gravity, ecological balance), some systems reach
a critical state where they maintain stasis by preservative behaviours – such as Bak’s small to large sandpile
avalanches – which vary in size of effect according to a power law (Bak, 1996).
*We list six out of fifteen scale-free theories discussed by Andriani and McKelvey (2009).
Complexity Regime in the Ashby Space
instead of escaping prematurely either into
the Chaotic or the Ordered Regime.
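The arithmetic behind hierarchical modularity (row 1 of Table 16.1) is worth making concrete. A minimal sketch follows; the module size and the single-bridge design are illustrative assumptions, not Simon's own scheme. Fully connecting 1,000 employees implies nearly half a million potential links, whereas a nearly decomposable design of 20 fully connected modules of 50 cuts this by an order of magnitude.

```python
def full_connectivity(n):
    """Potential pairwise links among n fully connected employees."""
    return n * (n - 1) // 2

def modular_connectivity(n, module_size):
    """Links when n employees are split into fully connected modules,
    with a single bridging link between adjacent modules (a crude
    'nearly decomposable' design)."""
    n_modules = n // module_size
    within = n_modules * full_connectivity(module_size)
    between = n_modules - 1
    return within + between

print(full_connectivity(1000))         # 499,500 potential links
print(modular_connectivity(1000, 50))  # 24,519 links across 20 modules
```

The quadratic growth of n(n–1)/2 is what drives organizations toward modular designs; the open question, as the Chinese-supplier and Walmart examples suggest, is how much bridging connectivity to retain.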
Wiener’s 1948 book on cybernetics was
about control in animals and machines.
Bertalanffy’s 1968 book on general systems
theory also framed systems in terms of top-
down control processes: as in thermostats,
negative feedback loops keep systems
targeted on the objectives of their designers.
Extending these authors’ insights to cover
human organizations, Thompson (1967) saw
top management bureaucracies as top-down
control devices that created machine-like
working conditions for lower-level employ-
ees. Yet in the same period some organiza-
tional theorists (Burns and Stalker, 1961)
discovered a bottom-up process of autono-
mous, organic changes emerging from below
in organizations that allows them to respond
flexibly and adaptively to changing environ-
mental conditions (Lawrence and Lorsch,
1967). In sum, in the 1960s we see organiza-
tion theory adopting the basic tenets of Ashby’s
Law, holding that efficacious adaptation
occurs only when internal variety/complexity
matches external variety/complexity. The
Ashby Space invites organizational practition-
ers and scholars to now go one step further
and to incorporate the insights of complexity
theory with those of Ashby. It offers them a set
of regimes – the chaotic, the complex and the
ordered – that can help them to adapt intelli-
gently and economically to the ever wider set
of contingencies that confront them in a com-
plex and globalizing world, one in which TIEs
can rapidly scale up to produce extreme out-
comes. But what are the limits of adaptation?
Is there, for example, any limit to the expan-
sion by human beings of their data-processing
and schema-building resources – i.e. to the
vertical expansion of the budget area OAB of
Figure 16.2? A brief look at the 2007 liquidity
crisis illustrates the issues involved.
An example
By August 2007, some 8,000 smaller US banks (Guerrera, 2009) had accepted minimalist risk/reward positions by staying away from subprime mortgages and teaser loans, and by insisting that mortgage borrowers show proof
of income and good credit. Such caution kept
them firmly ensconced in the Ordered Regime
of the Ashby Space. Some 12 major banks and
over 100 other smaller banks, however, had
adopted a risk/reward profile that increased the
level of adaptive tension confronting them
and tipped them over into the Complexity
Regime of the Space. Their financial engi-
neering models, derivatives, credit default
swaps, securitized loan packages, etc., gave
rise to risky loans amounting to some $50
trillion worldwide (Cooper, 2008; Morris,
2008; Foster and Magdoff, 2009). While these loans had appeared solid before the bursting of the US and other housing bubbles (e.g. in the UK and Spain, among others), they became increasingly toxic over the course of the year. Yet, while many of these high-risk
banks went bankrupt, the few that remained –
Goldman Sachs, Morgan Stanley, Citigroup,
Bank of America, and Wells Fargo – were
able to exploit the Federal Reserve bailouts by
engaging in merger and acquisition activity to
emerge far stronger and larger than they had
been. Here we see both positive and negative
scalability dynamics at work, triggered by
some early TIEs – the invention of derivatives
in 1973 and of mortgage-backed securities
c. 1985 (McKelvey and Yalamova, 2011;
Yalamova and McKelvey, 2011b).
As indicated by Figure 16.6, the Complexity
Regime of the Ashby Space is sandwiched
between order and chaos. The tipping point
between the Ordered and Complex Regimes
is often crossed by risk-induced tension – i.e.
fear, greed, ambition, risk-taking, etc. – that
leads to a phase transition. On the one hand,
the 8,000 conservative small banks minimized
their risks and remained in the Ordered
Regime below the 1st critical value. They
applied most of the conventional tools of
risk management to achieve reductionist
regularities. Given low levels of adaptive tension,
they could pursue replicable and reliable rou-
tines and achieve levels of predictability that
kept their response budgets under control.
On the other hand, in response to strong
demands for wealth-creation and for large
bonuses by both owners and senior employ-
ees, large banks pursued high-risk strategies
that significantly increased the levels of
adaptive tension they were exposed to. For
them, fear, greed, ambition, and risk-taking
increased tension to the point that a phase
transition occurred. They thus found them-
selves in the Complexity Regime but getting
ever closer to the 2nd critical value at the
edge of chaos – i.e. the Chaotic Regime – as a positive feedback cycle (greed → risk-taking → more greed → more risk-taking, and so on) got amplified (Minsky, 1976, 1982; McKelvey and Yalamova, 2011).
Recent evidence from econophysics shows
that stock-market traders cross a tipping
point – indicated by what is termed the Hurst exponent – between efficient-market behaviour (Fama, 1970) and the herding behaviour (Brunnermeier, 2001; Hirshleifer and
Teoh, 2003) that causes the power-law distri-
bution of stock-market price volatilities
(Alvarez-Ramirez et al., 2008; Yalamova and
McKelvey, 2011a, 2011b). Herding behav-
iour results in the positive feedback and other
scale-free dynamics, that, as Minsky (1982,
1986) and Yalamova and McKelvey (2011a,
2011b) argue, set off bubble build-ups. As
greed and risk-taking push market tensions to
the edge of chaos, they subsequently produce
a market crash.
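The Hurst exponent cited above can be estimated with classical rescaled-range (R/S) analysis. The sketch below is a bare-bones version of that method, not the econophysics machinery used in the cited studies: for uncorrelated (efficient-market-like) returns the estimate sits near 0.5, while persistent, herding-like series push it higher.

```python
import math
import random

def hurst_rs(series, min_chunk=8):
    """Estimate the Hurst exponent via rescaled-range (R/S) analysis:
    the slope of log(R/S) against log(window size)."""
    n = len(series)
    log_sizes, log_rs = [], []
    size = min_chunk
    while size <= n // 2:
        ratios = []
        for start in range(0, n - size + 1, size):
            chunk = series[start:start + size]
            mean = sum(chunk) / size
            cum, lo, hi, var = 0.0, 0.0, 0.0, 0.0
            for x in chunk:
                cum += x - mean              # cumulative deviation from mean
                lo, hi = min(lo, cum), max(hi, cum)
                var += (x - mean) ** 2
            std = math.sqrt(var / size)
            if std > 0:
                ratios.append((hi - lo) / std)   # range rescaled by std dev
        log_sizes.append(math.log(size))
        log_rs.append(math.log(sum(ratios) / len(ratios)))
        size *= 2
    # least-squares slope of log(R/S) on log(size)
    k = len(log_sizes)
    mx, my = sum(log_sizes) / k, sum(log_rs) / k
    num = sum((a - mx) * (b - my) for a, b in zip(log_sizes, log_rs))
    den = sum((a - mx) ** 2 for a in log_sizes)
    return num / den

random.seed(0)
returns = [random.gauss(0.0, 1.0) for _ in range(4096)]
print(round(hurst_rs(returns), 2))  # close to 0.5 for uncorrelated returns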
In the Complexity region of the Ashby
Space we can expect to see increased levels
of tension-induced connectivity and herding
as traders and banks copy what appear to be
the best trading rules/strategies at the time,
given the absence of accurate information
about fundamental values of firms. But even-
tually the variety of stimuli confronting traders
and banks overpowers the seeming value of
rule-based herding responses so that panicked
reactions set in. We then see the collapse of
herding-based, price-volatility-induced power
laws as traders, approaching the edge of chaos and the collapse of markets (Grech and Pamula, 2008), begin to jump ship. The
headless chicken response now goes into full
swing, and the adaptive resource budget gets
squandered as the crash progresses. In the
2007 liquidity crisis, the failure of mortgage-
backed loans quickly set up the conditions
that gave rise to the ~$50 trillion’s worth of
toxic loans worldwide (Marshall, 2009).
In the Complexity Regime of the Ashby
Space, power-law thinking trumps the
Gaussian thinking and normal distributions
on which most risk management models
depend. Power law distributions show how
TIEs can get amplified to generate extreme
events. In this region, all that can be hoped for
is anticipation, not prediction. Why, then,
given the dangers, would managers and entre-
preneurs ever want to operate in this space?
Because, in this space, in contrast to the linear
and hence calculable risk/returns associated
with the Ordered Regime, TIEs can offer
positive payoffs that may also be power law-distributed – i.e. being nonlinear, the payoffs can be very large indeed. It is the relentless
quest for extreme positive payoffs, forced on
managers by corporate owners and talented
employees, that keeps pushing them to the
Edge of Chaos (McKelvey, 2001, 2008).
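The gap between Gaussian and power-law thinking is easiest to see in the tails. A minimal comparison follows (the Pareto exponent of 2 is an illustrative assumption): a ten-sigma event is, for practical purposes, impossible under a Gaussian, yet retains a probability of order one per cent under this power law.

```python
import math

def gaussian_tail(k):
    """P(X > k) for a standard normal variate."""
    return 0.5 * math.erfc(k / math.sqrt(2))

def pareto_tail(k, alpha=2.0, x_min=1.0):
    """P(X > k) for a Pareto (power-law) variate with exponent alpha."""
    return (x_min / k) ** alpha

for k in (3, 5, 10):
    print(k, gaussian_tail(k), pareto_tail(k))
# At k = 10 the Gaussian tail is below 1e-23 while the Pareto tail is 0.01:
# extreme outcomes that Gaussian risk models rule out remain live possibilities.
```

This asymmetry cuts both ways: the same fat tail that makes extreme losses thinkable also makes extreme positive payoffs worth pursuing, which is why the Complexity Regime attracts as well as repels.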
Scholars and practitioners who have some
appreciation of the scalability dynamics in the
Complex Regime of the Ashby Space stand a
better chance of securing the payoffs availa-
ble in this region while avoiding the dangers.
By integrating Ashby’s perspective on the
nature of efficacious adaptation with our grow-
ing understanding of the complexity phe-
nomenon, the Ashby Space offers scholars
and practitioners a conceptual framework for
thinking through some of the more pressing
problems that confront a globalizing world.
What, for example, are the challenges of
adapting to nonlinear changes in the climate?
Or of adapting to the emergence of asymmet-
ric threats? What are the scalable opportu-
nities that we can associate with the spread
of the Internet or of mobile telephony? The
above challenges will not be successfully
addressed in the ordered regime of the Ashby
Space. We must learn to wander out into the
Complex Regime and explore what it has to
offer us without necessarily falling into the
Chaotic one. To succeed we need a more
nuanced yet theoretically robust view of how
organized systems partition their environ-
ment in their attempts to adapt to it within
the resource envelope available to them.
Current treatments of the human organiza-
tion/environment interface are often too
descriptive and too under-theorized to yield
the insights needed. Much of the necessary
thinking is today coming out of theoretical
biology where the use of the terms ‘organiza-
tion’ and ‘environment’ extends well beyond
their application in management and the
social sciences. The Ashby Space offers a
conceptual bridge between these different
disciplines. Future research – theoretical and empirical – should exploit the potential synergies on offer.
Notes
1 Entropy measures a system’s degree of disor-
ganization, taking it to be the amount of uncertainty
still remaining in the system once its observable,
uncertainty-reducing regularities are accounted for.
2 Terms shown in SMALL CAPITALS are further defined
in Table 16.1, with examples later in the chapter.
3 The term, butterfly effects dates back to the
title of E.N. Lorenz’s paper of (1972): ‘Predictability:
Does the flap of a butterfly’s wings in Brazil set off a
tornado in Texas?’ Paper presented at the 1972 meet-
ing of the American Association for the Advancement
of Science. Washington, DC.
4 Fractals are defined as shapes that can be sub-
divided into parts, each of which is (at least approxi-
mately) a reduced-size copy of the whole (Mandelbrot,
1982). The same mathematical equation – or adap-
tive causal dynamic in biology or for firms – creates
similar causal dynamics at each level of a fractal
structure. See Andriani and McKelvey (this volume)
for further discussion of fractals and scalability.
5 In what follows we do not distinguish between
the variety that exists within a given stimulus or
response vs. that which occurs across stimuli and
responses. The distinction is one that the organism
itself must make through acts of interpretation. See
6 A phylogenetic application of this argument
would allow us to map the vertical and horizontal
dimensions of the Ashby Space respectively onto
Salthe’s (1985) and Eldredge’s (1985) ecological and
genealogical hierarchies, yielding an evolutionary per-
spective on adaptation. See Brooks and Wiley (1988).
7 Although the horizontal axis could also be so
partitioned, for ease of exposition we refrain from
doing so.
8 Here, we are using the term ‘chaotic’ in its eve-
ryday sense. This is broader than its mathematical
sense à la chaos theory (Guastello, 1995) since it
mixes deterministic and stochastic processes.
References
Aldrich, H.E. (1979) Organizations and Environments.
Englewood Cliffs, NJ: Prentice-Hall.
Alvarez-Ramirez, J., Alvarez, J., Rodriguez, E., and
Fernandez-Anaya, G. (2008) Time-varying Hurst
Exponent for US Stock Markets. Physica A, 387:
Anderson, P.W. (1972) More is different. Science, 177:
Andriani, P. and McKelvey, B. (2009) From Gaussian to
Paretian thinking: Causes and implications of power
laws in organizations. Organization Science, 20(6).
Arthur, W.B. (1994) Increasing Returns and Path
Dependence in the Economy. Ann Arbor, MI:
University of Michigan Press.
Ashby, R.W. (1956) An Introduction to Cybernetics.
London: Methuen.
Ashby, W.R. (1962) Principles of the self-organizing
system. In: H. von Foerster and G.W. Zopf (eds)
Principles of Self-Organization. New York: Pergamon,
pp. 255–278.
Bak, P. (1996) How Nature Works. New York:
Ball, P. (2004) Critical Mass. London: Arrow Books.
Barabási, A.-L. (2002) Linked. Cambridge, MA: Perseus.
Bénard, H. (1901) Les tourbillons cellulaires dans une
nappe liquide transportant de la chaleur par convec-
tion en régime permanent. Annales de Chimie et de
Physique, 23: 62–144.
Bertalanffy, L. von (1968) General System Theory.
New York: Braziller.
Bhaskar, R. (1975) A Realist Theory of Science. London:
Leeds Books. [2nd edn published by Verso (London),
Boyd, R. and Richerson, P.J. (1985) Culture and the
Evolutionary Process. Chicago: The University of
Chicago Press.
Brock, W.A. (2000) Some Santa Fe scenery. In:
D. Colander (ed.) The Complexity Vision and the
Teaching of Economics. Cheltenham, UK: Edward
Elgar, pp. 29–49.
Brooks, D.R. and Wiley, E.O. (1988) Evolution as
Entropy. Chicago: University of Chicago Press.
Brunnermeier, M.K. (2001) Asset Pricing under
Asymmetric Information: Bubbles, Crashes, Technical
Analysis, and Herding. Oxford, UK: Oxford University
Buckley, W. (ed.) (1968) Modern Systems Research for
the Behavioral Scientist. Chicago: Aldine.
Burns, T. and Stalker, G.M. (1961) The Management of
Innovation, London: Tavistock.
Chaitin, G.J. (1974) Information-theoretic computa-
tional complexity. IEEE Transactions on Information
Theory, 20: 10–15.
Churchman, C.W. and Ackoff, R.L. (1950) Purposive
behavior and cybernetics. Social Forces, 29: 32–39.
Clark, A. (1997) Being There. Cambridge, MA: MIT
Coase, R. (1937) The nature of the firm. Economica,
4: 386–405.
Cooper, G. (2008) The Origin of Financial Crises.
New York: Vintage Books.
Corning, P. (2003) Nature’s Magic. Cambridge, MA:
Cambridge University Press.
Dahui, W., Menghui, L., and Zengru, D. (2005) True
reason for Zipf’s law in language. Physica A, 358:
Dennett, D.C. (1987) The Intentional Stance. Cambridge,
Mass: The MIT Press.
Drayton, C. (2010) Translating Zanini’s power curves
into power laws. Research project at the
UCLA Anderson School of Management, Los
Angeles, CA.
Eldredge, N. (1985) Unfinished Synthesis: Biological
Hierarchies and Modern Evolutionary Thought.
New York: Oxford University Press.
Etzioni, A. (1961) A Comparative Analysis of Complex
Organizations. New York: Free Press.
Fama, E.F. (1970) Efficient capital markets: a review
of theory and empirical work. Journal of Finance, 25: 383–417.
Fayol, H. (1916) Administration industrielle et
générale. Bulletin de la Société de l’Industrie
Minérale, No. 10.
Foster, J.B. and Magdoff, F. (2009) The Great Financial
Crisis. New York: Monthly Review Press.
Gell-Mann, M. (1988) The concept of the Institute. In:
D. Pines (ed.) Emerging Synthesis in Science.
Reading, MA: Addison-Wesley, pp. 1–15.
Gell-Mann, M. (1994) The Quark and the Jaguar.
New York: Freeman.
Gell-Mann, M. (2002) What is complexity? In:
A.Q. Curzio and M. Fortis (eds) Complexity and
Industrial Clusters. Heidelberg, Germany: Physica-
Verlag, pp. 13–24.
Grech, D. and Pamula, G. (2008) The local Hurst expo-
nent of the financial time series in the vicinity of
crashes on the Polish stock exchange market.
Physica A, 387: 4299–4308.
Gregory, R.L. (1981) Mind in Science. Middlesex, UK:
Penguin Books.
Grünwald, P., Myung, I., and Pitt, M. (2005) Advances
in Minimum Description Length. Cambridge, MA:
MIT Press.
Guastello, S.J. (1995) Chaos, Catastrophe, and Human
Affairs. Mahwah, NJ: Lawrence Erlbaum.
Guerrera, F. (2009) Old bank axioms gain new
currency. Financial Times (Tuesday, July 7), p. 12.
Haken, H. (1977) Synergetics, An Introduction. Berlin:
Hayek, F.A. (1967) Studies in Philosophy, Politics and
Economics. Chicago, IL: University of Chicago Press.
Hirshleifer, D. and Teoh, S.H. (2003) Herd behaviour and cascading in capital markets: A review and synthesis.
European Financial Management, 9: 25–66.
Holland, J.H. (1975) Adaptation in Natural and Artificial
Systems. Cambridge, MA: MIT Press.
Holland, J.H. (1988) The global economy as an adaptive
system. In: P.W. Anderson, K.J. Arrow, and D. Pines
(eds) The Economy as an Evolving Complex System.
Reading, MA: Addison-Wesley, pp. 117–124.
Holland, J.H. (1995) Hidden Order. Cambridge, MA:
Perseus Books.
Ishikawa, A. (2006) Pareto index induced from the
scale of companies. Physica A, 363: 367–376.
Kast, F.E. and Rosenzweig, J.E. (1973) Contingency
Views of Organization and Management. Chicago:
Science Research Associates.
Kauffman, S.A. (1993) The Origins of Order. New York:
Oxford University Press.
Koontz, H. and O’Donnell, C.J. (1964) Principles of
Management. New York: McGraw-Hill.
Kuhn, T.S. (1962) The Structure of Scientific Revolutions.
Chicago, IL: University of Chicago Press.
Lawrence, P.R and Lorsch, J.W. (1967) Organization
and Environment. Cambridge, MA: Harvard Business
School Press.
Lettvin, J.Y., Maturana, H.R., McCulloch, W.S., and
Pitts, W.H. (1959) What the frog’s eye tells the
frog’s brain. Proceedings of the Institute of Radio
Engineering, 47: 1940–1951.
Lorenz, E.N. (1972) Predictability: Does the flap of a
butterfly’s wings in Brazil set off a tornado in Texas?
Paper presented at the 1972 meeting of the American
Association for the Advancement of Science.
Washington, DC.
Mainzer, K. (1994) Thinking in Complexity. New York:
Springer-Verlag. [5th edn published in 2007.]
Mandelbrot, B.B. (1982) The Fractal Geometry of
Nature. New York: Freeman.
March, J.G. (1991) Exploration and exploitation in organ-
izational learning. Organization Science, 2: 71–87.
March, J.G. and Simon, H.A. (1958) Organizations.
New York: Wiley.
Marshall, A.G. (2009) Entering the greatest depression
in history: More bubbles waiting to burst. Global
Research (August 16): 1–11. http://www.globalresearch.
Maruyama, M. (1963) The second cybernetics. American
Scientist, 51: 164–179.
Maturana, H.R. and Varela, F.H. (1980) Autopoiesis
and Cognition. Dordrecht, The Netherlands: Reidel.
McGregor, D. (1960) The Human Side of Enterprise.
New York: McGraw-Hill.
McKelvey, B. (2001) Energizing order-creating net-
works of distributed intelligence. International
Journal of Innovation Management, 5: 181–212.
McKelvey, B. (2008) Emergent strategy via complexity
leadership. In: M. Uhl-Bien and R. Marion (eds)
Complexity and Leadership: Part I. Charlotte, NC:
Information Age Publishing, pp. 225–268.
McKelvey, B. and Boisot, M. (2009) Redefining strate-
gic foresight: ‘Fast’ and ‘far’ sight via complexity
science. In: L.A. Costanzo and R.B. MacKay (eds)
Handbook of Research on Strategy and Foresight.
Cheltenham, UK: Elgar, pp. 15–47.
McKelvey, B. and Yalamova, R. (2011) The build-up to
the 2007 liquidity crisis: An example of scalability
dynamics in action. In: G. Sundström and E. Hollnagel
(eds) Governance and Control of Financial Systems:
A Resilience Engineering Perspective, Ch. 7.
Miller, D. (1990) The Icarus Paradox. New York:
Miller, J.G. (1978) Living Systems. New York:
Minsky, H.P. (1982) Can ‘It’ Happen Again? Armonk,
NY: M.E. Sharpe, Inc.
Minsky, H.P. (1986) Stabilizing an Unstable Economy.
New Haven, CT: Yale University Press. [2nd edn
published by McGraw-Hill, 2008.]
Morris, C.R. (2008) The Two Trillion Dollar Meltdown
(revised and updated). New York: PublicAffairs.
Newman, M.E.J. (2005) Power laws, Pareto distribu-
tions and Zipf’s law. Contemporary Physics, 46:
Perrow, C. (1972) Complex Organizations: A Critical
Essay. Glennview, Il: Scott, Foresman.
Plotkin, H. (1993) Darwin Machines and the Nature of
Knowledge. Cambridge, MA: Harvard University
Podobnik, B., Fu, D., Jagric, T., Grosse, I., and Stanley, H.E.
(2006) Fractionally integrated process for
transition economics. Physica A, 362: 465–470.
Prigogine, I. (1955) An Introduction to Thermo-
dynamics of Irreversible Processes. Springfield, IL:
Prigogine, I. and Stengers, I. (1984) Order Out of Chaos.
New York: Bantam.
Salthe, S. (1985) Evolving Hierarchical Systems: Their
Structure and Representation. New York: Columbia
University Press.
Schrödinger, E. (1944) What is Life? Cambridge, UK:
Cambridge University Press.
Schumpeter, J. (1934) The Theory of Economic
Development. Boston, MA: Harvard University Press.
Simon, H.A. (1947) Administrative Behavior. New York: Macmillan.
Simon, H.A. (1962) The architecture of complexity.
Proceedings of the American Philosophical Society,
106: 467–482.
Simon, H.A. (1986) Rationality in psychology and
economics. Journal of Business, 59(4): S209–S224.
Sipser, M. (1997) Introduction to the Theory of
Computation. Boston, MA: PWS Publishing Co.
Swenson, R. (1989) Emergent attractors and the law of maximum entropy production. Systems Research, 6: 187–197.
Taylor, F.W. (1911) The Principles of Scientific
Management. New York: Harper.
Thaler, R. (1992) The Winner’s Curse. Princeton, NJ:
Princeton University Press.
Thompson, J.D. (1967) Organizations in Action.
New York: McGraw Hill.
Trist, E.L. and Bamforth, K.W. (1951) Some social and
psychological consequences of the longwall method
of coal getting. Human Relations, 4: 3–38.
Vermeij, G. (2004) Nature: An Economic History.
Princeton, NJ: Princeton University Press.
Weber, B.H., Depew, D.J., and Smith, J.D. (eds) (1988)
Entropy, Information, and Evolution. Cambridge,
MA: MIT Press.
West, B.J. and Deering, B. (1995) The Lure of Modern
Science. Singapore: World Scientific.
Wiener, N. (1948) Cybernetics. Cambridge, MA: MIT Press.
Williamson, O.E. (1975) Markets and Hierarchies. New York: Free Press.
Woodward, J. (1958) Management and Technology.
London: HMSO.
Yalamova, R. and McKelvey, B. (2011a) Explaining
what leads up to stock market crashes: A phase
transition model and scalability dynamics. Journal of
Behavioral Finance.
Yalamova, R. and McKelvey, B. (2011b) Using power laws and the Hurst coefficient to identify stock market trading bubbles. In: G. Sundström and E. Hollnagel (eds) Governance and Control of Financial Systems: A Resilience Engineering Perspective. Farnham, UK: Ashgate, Ch. 9.
Zanini, M. (2008) Using ‘power curves’ to assess industry dynamics. The McKinsey Quarterly.
Zipf, G.K. (1949) Human Behavior and the Principle of
Least Effort. New York: Hafner.