VIABILITY AS A BASIS FOR PERFORMANCE MEASUREMENT
Patrick Hoverstadt and Ian Kendrick
Fractal Consulting, UK
Steve Morlidge
Unilever, UK
Abstract
Conventional approaches to Performance Measurement have been based on a
mechanistic ‘target-plan-variance’ model that was introduced into mainstream
management practice in the 1950s. It has been subject to criticism from within
both the academic and practitioner community over the last 50 years but has
proved remarkably resistant to change. Systems ideas, particularly those
emanating from the field of Cybernetics, have not been successfully applied in
this field because, it is argued, the concepts have been misunderstood and
falsely blamed for the perceived failings of conventional practice. Some ways
in which Cybernetic models could be used to help design performance
management systems are discussed and a case made for more research and
experimentation in this field.
Historical Context
The genesis of modern conventional performance management practices can be traced back
to the work of Donaldson Brown and Alfred Sloan at General Motors in the 1920s and
1930s. They did not take root in the majority of commercial enterprises until the 1950s,
however, following the growth of MBA programmes in the United States supported by the
codification of knowledge in management textbooks and the missionary work of
Management Consultants.
Whilst the approach has been refined over the years, the basic model, often described as
budgetary control, has remained remarkably consistent. Targets are expressed as absolute
numbers and are fixed as part of an annual process that involves a negotiation around a
detailed plan that is usually expressed in financial terms. Control is then exercised by analysing
variances between actual and plan, and action is taken to bring performance back into line
with the plan. This process usually follows the lines of the organisational hierarchy: targets and
plans agreed at one level then form the basis for the process at the next level down and so
forth. Typically the resultant plans and targets are perceived as commitments; commitments
often underpinned by financial incentives and/or informal systems of rewards and
punishments.
In recent years there has been a recognition that financial measures cannot fully deal with the
complexity of the management process. Non-financial measures, such as those generated by
the application of the Balanced Scorecard, form an increasingly important part of the mix,
particularly in service industries and the public sector. The basic paradigm, however, remains
essentially the same.
Right from the very early days of its adoption as standard management practice the model has
been subject to criticism. In 1952 Chris Argyris (Argyris, 1952) conducted work for the US
Controllership Foundation which sought to understand why the application of budgetary
control often led to demotivated employees and dysfunctional behaviour, and concluded that
participation in the budget-setting process was the solution.
This sense of dissatisfaction remains. Arguably, amongst practitioners it has grown. Recent
surveys (Answerthink, 2003) still point to behavioural problems associated with budgeting
but also reveal problems with the bureaucracy and cost of these practices and the inflexible
nature of the process in the face of market turbulence.
There have been numerous initiatives over the decades to reform budgeting, such as PPBS,
ZBB, ABB and so on, but most of these operate from within the current target-plan-variance
model. A notable exception in recent years is the Beyond Budgeting Model (Hope and Fraser,
2003) which advocates a less mechanistic set of processes sitting within a devolved model of
management.
Within the academic community the root cause of the problem is perceived to be the ‘poverty
of management control philosophy’. In an influential paper published in 1978 under this title
(Hofstede, 1978), Geert Hofstede characterised this paradigm as cybernetic in nature, since it
relied upon the specification of quantified goals, and the operation of a negative feedback
loop to correct deviation from these goals. Drawing on Boulding's Hierarchy of Systems
(Boulding, 1956), he argued that this model can only be applied successfully to ‘machine-like’
systems and is therefore an inappropriate model to apply to complex social systems.
Partly as a result of the impact of this paper, and the fact that academics operating within the
mainstream paradigm began associating themselves with ‘cybernetic philosophy’
(Maciariello, 1984), it became deeply unfashionable to apply systems ideas to the study of
Management Control Systems. In academia, work in this field has come to be dominated by
paradigms drawn from other academic domains such as sociology (e.g. Contingency Theory).
Early Developments in Cybernetics and Systems
The criticism launched by Hofstede was fair and accurate insofar as it correctly identified that
the conventional paradigm is consistent with the approach applied to the control of simple
machines. It is also fair to say that very early cybernetic work focussed on understanding the
science of controlling physical systems – a science now firmly established as control
engineering.
Where it missed the mark, however, was in overlooking that the very early cyberneticians had
themselves recognised that the same basic concepts, such as feedback, need to be applied in
very different ways to explain how control operates in the biological and social worlds.
The seminal piece of work was that of W Ross Ashby, who in 1956 published ‘An
Introduction to Cybernetics’ (Ashby, 1956). One of his most notable contributions is the
‘Law of Requisite Variety’, a concept which underpins all subsequent work involving the
application of cybernetic concepts to the control of complex systems.
In simple terms the Law of Requisite Variety states that ‘only variety can absorb variety’.
This expresses the idea that there is a logically derived relationship between the complexity
of the environment, the flexibility of a regulator and the specificity of the desired output
states of any system.
Among its consequences, this law provides proof that simple mechanical systems cannot
successfully control complex systems operating within a dynamic environment. Furthermore,
it can be used to assert that the only way complex organisations can successfully be
controlled is by exploiting the capacity of a system for self-organisation and self-regulation,
since, by definition, a command-and-control approach cannot have ‘requisite variety’. It can
also be argued that if one attempts to use an inflexible mechanical model in these
circumstances, or if goals are too tightly defined, then we should expect to see exactly those
phenomena we have come to associate with budgeting – ‘gaming’ behaviour, bureaucracy
and so on.
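The quantitative force of the law can be shown with a small numerical sketch. The Python
fragment below is our illustration rather than anything in Ashby's text; the varieties chosen
(six disturbances, three regulator responses) are arbitrary, and the outcome table is a
hypothetical example satisfying Ashby's condition that, for any fixed regulator move, distinct
disturbances produce distinct outcomes. It enumerates every possible regulation policy and
confirms that the best attainable outcome variety can never fall below the disturbance variety
divided by the regulator variety.

    from itertools import product
    from math import ceil

    D, R = 6, 3   # variety of the disturbances / variety of the regulator (illustrative values)

    # Game table: for a fixed regulator move r, distinct disturbances d give
    # distinct outcomes - the condition under which Ashby's law applies.
    def outcome(d, r):
        return (d + r) % D

    # Search every possible regulation policy (a choice of response for each
    # disturbance) for the one that holds outcomes to the fewest distinct values.
    best = min(
        len({outcome(d, policy[d]) for d in range(D)})
        for policy in product(range(R), repeat=D)
    )

    print("minimum attainable outcome variety:", best)
    print("bound given by the Law of Requisite Variety:", ceil(D / R))
    assert best >= ceil(D / R)   # only variety in the regulator can absorb variety in the disturbances

However large the disturbance variety grows, only an increase in the regulator's variety can
reduce the variety of the outcomes – which is the sense in which ‘only variety can absorb
variety’.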
So, had Hofstede understood more about the science of cybernetics, arguably he would have
seen that it was less the source of the problem than the source of the solution.
Cybernetics and The Control of Complex Probabilistic Systems
Over the last 30 years there has been a considerable amount of work done attempting to apply
the insights of Ashby and others to control in the biological and social world. Most notable is
the work of Stafford Beer (Beer, 1981) who created the Viable System Model, based on a
model of the autonomic nervous system and how it successfully regulates activity in the human
body. In more recent times the work of the biologists Humberto Maturana and Francisco
Varela (Maturana and Varela, 1998) has been applied to the understanding of organisations,
specifically how they relate to the environment and to the maintenance of form and identity.
What does a performance management model based on a systems paradigm look like and
how does this differ from the mechanistic paradigm? What implications does this have for
performance measurement?
A Systems Paradigm
In broad terms a conventional approach to performance management can be seen as
comprising three basic steps:
Decide
The decision as to ‘what to do’ is typically seen as an output of the strategy process. Usually
this starts with some form of an environmental scan and business audit and ends with the
formulation of a set of broad action plans and the definition of performance aspirations. It is
carried out periodically and is conceived of as an intellectual process that is the prerogative of
‘top management’.
Plan
Planning is the process of turning the strategy into a set of aligned actions, anchored in time
and usually associated with a set of quantified targets supported by detailed plans, often
financial in nature. This is typically an annual process, subject to negotiation between ‘top’
and ‘middle’ management, and factors in recent performance and the perceived requirements
of key stakeholders such as the investment community.
Execute
The implementation of plans is the responsibility of management for which they are held
accountable. This accountability is exercised through the use of measurement procedures that
identify deviation from plan and which are often supported by incentive systems anchored
around the achievement of targets.
The underlying assumption on which this paradigm is built is that success flows in a linear
and quasi-deterministic fashion from disciplined execution of a well-conceived strategy that
is subsequently manifest in exceptional financial returns.
In contrast, a systemic paradigm is based on the idea that success takes the form of viability –
long-term survival – and that this is achieved by the cultivation and maintenance of healthy
relationships – a correct balance – between the organisation (system) and its environment and
between its constituent parts (subsystems).
The mechanisms that support this approach can be conceived of as what Ashby called
‘homeostats’; richly interconnected ‘machines’ using a mixture of positive and negative
feedback and feedforward mechanisms to regulate the relationships between the following
entities:
The organisation with its environment
Maturana and Varela conceived this as taking the form of ‘structural coupling’ whereby an
organisation co-evolves with its environment (which often takes the form of other
organisations) to build and maintain a set of mutually beneficial relationships. In this scheme
competition is an occasional manifestation of the process of change in these arrangements
rather than the driving force in their evolution. In Beer’s Viable System Model he
distinguishes between relationships in the ‘inside and now’ and the ‘outside and then’. The
former exists between the operational elements of the organisation (Systems 1) and their
environments and is typically characterised as a process of regulation. Those that reside in the
‘outside and then’ can be viewed as underpinning the process of adaptation and are manifest
in the relationship between System 4 and the potential future environment of the organisation.
The organisation with itself
In Beer’s model internal balance (homeostasis) is maintained through three sets of homeostats.
The first set manages the relationships between the operational elements and that part of the
organisation responsible for maintaining the cohesion of the enterprise in the ‘inside and
now’ – System 3. It comprises System 2, whose purpose is anti-oscillatory; System 3*,
which monitors the internal state of the operations; and a ‘command channel’ along which
resources are allocated and interventions made.
The second set strikes a balance between the ‘inside and now’ and the ‘outside and then’. It sits
between System 3 and System 4.
The final set regulates this relationship and supplies ‘organisational closure’ by defining the
ultimate purpose and identity of the organisation. Maturana and Varela defined ‘life’ as
possessing ‘autopoiesis’ – a set of processes devoted to the task of ‘self-maintenance’ and
‘self-creation’ – and Beer came to describe the role of System 5 in these terms.
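The homeostat idea that underlies these mechanisms can be made concrete with a minimal
simulation. The Python sketch below is our illustration of Ashby's ‘ultrastable’ arrangement,
not a model of Beer's Systems 2–5; the dynamics, limits and parameter ranges are invented for
the example. An ordinary negative feedback loop holds an essential variable within limits,
and a second-order loop reconfigures the regulator at random whenever those limits are
breached.

    import random

    LIMIT = 1.0      # bounds on the essential variable (illustrative)
    STEPS = 500

    x = 0.0                               # essential variable to be held near zero
    gain = random.uniform(-2.0, 2.0)      # current setting of the feedback loop
    reconfigurations = 0

    for t in range(STEPS):
        disturbance = random.uniform(-0.4, 0.4)
        x = (1.0 - gain) * x + disturbance    # negative feedback when 0 < gain < 2

        # Second-order (ultrastable) loop: if the essential variable leaves its
        # limits, abandon the current setting and try another at random.
        if abs(x) > LIMIT:
            gain = random.uniform(-2.0, 2.0)
            x = max(min(x, LIMIT), -LIMIT)
            reconfigurations += 1

    print(f"settled gain {gain:.2f} after {reconfigurations} reconfigurations; final state {x:.2f}")

Roughly half of the candidate settings are destabilising, so the unit typically wanders through
a few reconfigurations before finding one that keeps the essential variable inside its limits –
regulation achieved by selection rather than by prior specification.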
Unlike the conventional paradigm, where causality is linear and relationships are based on the
exercise of hierarchical power, the systemic model is founded on circular logic and feedback
loops which come together to build complex patterns of interdependency. The systemic
model explicitly acknowledges the constraints imposed by Ashby’s Law, which places a limit
on the extent to which an organisation can be successfully managed by ‘command and
control’. Thus the primary mode of the systemic model is self-organisation and self-control,
which in Beer’s model is manifested in its ‘recursive’ nature, whereby each ‘viable system’
contains, and is contained within, a self-similar ‘viable system’, creating a fractal
organisational architecture.
Viability and Performance Measurement
In Beer’s model the task of performance measurement is to monitor, in ‘real time’, those
metrics which are relevant to the viability of the organisation. The process of monitoring
takes account of three measures: actuality, capability and potentiality. Rather than setting
arbitrary targets which then need to be hit, processes are set up to detect signs of incipient
instability - a failure to regulate or adapt effectively - by tracking ratios between these three
measures. In this way appropriate corrective or opportunistic action can be taken as soon as
possible.
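As an illustrative sketch only, the three measures can be combined into ratio indices and
monitored for sustained deterioration rather than compared against a fixed target. The index
names below follow the usual reading of Beer (actuality over capability, capability over
potentiality, actuality over potentiality); the Python code, the trend test and all the numbers
are our own assumptions rather than anything prescribed in the model.

    from dataclasses import dataclass

    @dataclass
    class ViabilityMeasures:
        actuality: float      # what is being achieved now, with current resources
        capability: float     # what could be achieved now, with current resources
        potentiality: float   # what could be achieved by developing resources

        @property
        def productivity(self) -> float:   # actuality relative to capability
            return self.actuality / self.capability

        @property
        def latency(self) -> float:        # capability relative to potentiality
            return self.capability / self.potentiality

        @property
        def performance(self) -> float:    # actuality relative to potentiality
            return self.productivity * self.latency

    def incipient_instability(history, window=4, drop=0.05):
        """Flag a sustained fall in an index; thresholds are illustrative only."""
        if len(history) < window:
            return False
        recent = history[-window:]
        falling = all(later < earlier for earlier, later in zip(recent, recent[1:]))
        return falling and (recent[0] - recent[-1]) > drop

    # Invented weekly readings for one metric of one operational unit.
    weeks = [ViabilityMeasures(a, 100, 130) for a in (80, 79, 74, 68, 61)]
    productivity_series = [w.productivity for w in weeks]
    print(productivity_series, incipient_instability(productivity_series))

The same filter applies unchanged to any metric expressed as one of these ratios, which is
what gives the scheme the economy of expression and computation noted below.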
The use of three simple ratios in this way potentially has the following benefits:
Economy of expression – a simple ‘language’
Comparability – between financial and non-financial measures, across organisational units
and between organisations
Goals – a reduced need for fixed, arbitrary and time-bound targets
Economy of computation – a single ‘filter’ able to deal with different types of metric
Whilst ‘actuality’ is relatively easy to measure, in principle there is a significant challenge in
establishing ways of measuring ‘potentiality’ and ‘capability’, which in the eyes of some
makes Beer’s scheme unworkable.
On the other hand, under a conventional system, targets serve a similar purpose and the lack
of a scientifically robust way to set them has not proved to be an insurmountable problem. In
this context the benefits of adopting a ‘viability based’ approach to performance
measurement make it worthy of serious consideration.
Conclusion
Conventional performance management practice is based on a linear deterministic model that
is inappropriate to the control of complex social systems operating in turbulent and
unpredictable environments. Contrary to received wisdom in parts of the academic
community, Cybernetics potentially provides the intellectual building blocks for a new
paradigm, based on a fundamental appreciation of what is required to manage the
interdependencies between an organisation and its environment and between its constituent
parts. It holds the promise of developing a set of practices that is not only more efficient and
effective as a control philosophy, but also more sensitive to the human need for
self-determination in the workplace.
References
Argyris, C. (1952) ‘The Impact of Budgets on People’, The Controllership Foundation
Answerthink (2003) ‘Quo Vadis Budgets’, The Hackett Group, Frankfurt
Ashby, W.R. (1956) ‘An Introduction to Cybernetics’, London, Chapman & Hall
Beer, S. (1981) ‘Brain of the Firm’, Chichester, John Wiley
Boulding, K.E. (1956) ‘General Systems Theory – The Skeleton of Science’, Management
Science, 2 (3): 197-208
Hofstede, G. (1978) ‘The Poverty of Management Control Philosophy’, Academy of
Management Review, 3 (3): 450-461
Hope, J. and Fraser, R. (2003) ‘Beyond Budgeting: How Managers Can Break Free From the
Performance Trap’, Boston, Harvard Business School Press
Hoverstadt, P. (2004) ‘The Structure of Performance Measurement in Organisations –
Implications and Application of Systems Models’, PMA
Hoverstadt, P. (2006) ‘Measuring the Performance of Management’, PMA
Maciariello, J. (1984) ‘Management Control Systems’, London, Prentice Hall
Maturana, H. and Varela, F. (1998) ‘The Tree of Knowledge’, Boston, Shambhala