Ross Ashby’s general theory of adaptive systems
Stuart A. Umpleby
Department of Management
The George Washington University
Washington, DC 20052 USA
October 19, 2008
Prepared for a special issue of the International Journal of General Systems
Based on a presentation at the W. Ross Ashby Centenary Conference
University of Illinois in Urbana-Champaign, March 4-6, 2004
umpleby@gwu.edu
Abstract
In the 1950s and 1960s Ross Ashby created a general theory of adaptive systems. His work
is well-known among cyberneticians and systems scientists, but not in other fields. This is
somewhat surprising, because his theories are more general versions of the theories in many
fields. The philosophy of science claims that more general theories are preferred because a
small number of propositions can explain many phenomena. Why, then, are Ashby’s theories
not widely known and praised? Do scientists really strive for more general, parsimonious
theories? This paper reviews the content of Ashby’s theories, discusses what they reveal
about how scientists work, and suggests what their role might be in the academic community
in the future.
Keywords: Cybernetics; Complexity; Adaptation; Self-Organization; Requisite Variety
1. Two kinds of contributions to science
There are two ways in which more general theories can be constructed. The first type of
more general theory results when a new dimension is added to an existing theory (Krajewski
1977). The new theory is more general because it can explain a larger number of phenomena.
For example, in physics relativity theory added the consideration that the relative velocity of
two objects would affect mass, length, and time. The gas laws added the diameter of
molecules, which previously had been treated as point masses. In cybernetics Heinz von
Foerster’s work added “amount of attention paid to the observer” to the traditional
philosophy of science (Umpleby 2005).
The second type of more general theory is a more abstractly worded theory. The theories of
Ross Ashby are examples. However, these theories still require knowledge from more
specialized fields in order to operationalize them and put them to use. For example, Ashby
spoke about the need for requisite variety in a regulator. Operationalizing this theory in
computer science requires specifying the speed or memory capacity of a computer. In game
theory variety is expressed in possible moves. An example of requisite variety in
management is the need to match production capacity to customer demand. I shall now
review Ashby’s method and theories.
2. Ashby's method
Ashby used state determined systems to describe the processes of interest to him – regulation,
adaptation, self-organization, etc. He used state determined systems not because he thought
the world was deterministic. (Some of my students have jumped to this conclusion.) Rather,
he wanted to communicate clearly about topics that had previously been dealt with vaguely.
Also, he wanted to deal with nominal, ordinal, and interval variables as well as cardinal
variables, since control and communication often do not lend themselves to the cardinal
variables that are possible in fields such as physics and economics. Furthermore, he wanted
to create a general theory that would encompass systems defined on both animate and
inanimate objects. As Ashby put it,
Cybernetics treats not things but ways of behaving. It does not ask “what is this thing?”
but “what does it do?”… It is thus essentially functional and behaviouristic.
Cybernetics deals with all forms of behavior in so far as they are regular, or determinate,
or reproducible. The materiality is irrelevant… The truths of cybernetics are not
conditional on their being derived from some other branch of science. Cybernetics has
its own foundations (Ashby 1956, 1).
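Ashby's notion of a state determined system can be made concrete in a few lines of code. The sketch below (the function and state names are my own illustration, not Ashby's notation) treats such a system as a fixed transition table and iterates it until it reaches an equilibrium, that is, a state that maps to itself.

```python
def run_to_equilibrium(transition, state, max_steps=1000):
    """Iterate a state-determined system until it reaches an equilibrium,
    i.e. a state that maps to itself."""
    for _ in range(max_steps):
        next_state = transition[state]
        if next_state == state:
            return state
        state = next_state
    raise RuntimeError("no equilibrium reached within max_steps")

# A toy "machine" on four states: every trajectory runs down to 'd' and stays there.
machine = {"a": "b", "b": "c", "c": "d", "d": "d"}
print(run_to_equilibrium(machine, "a"))   # -> d
```

Nothing in the sketch depends on what the states "are made of"; as the quotation above stresses, only the way of behaving matters.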
Ashby was particularly talented at creating examples to illustrate his theoretical points. For
example, he illustrates learning as movement toward equilibrium by describing how a kitten
finds a comfortable position near a fire or learns to catch mice (Ashby 1960). As one
example of a sequence of events, he put a flow chart on the door to his office with steps
including “knock,” “enter,” etc. (Conant 1981, 363). His example of “The Dynamics of
Personality” described a recurring sequence of events in the lives of a husband and wife
(Conant 1981, 365). His example, “A Brief History of Amasia,” illustrated legal, cultural,
and strategic rules in a multi-nation system somewhat like Europe at the start of World War I.
The events that unfolded were determined by the rules within the system (Conant 1981, 367-9).
As I read Ashby’s books I imagined my own examples in fields of interest to me. However,
some of my students have wanted examples in their fields of interest to be already in the text.
Hesitancy to exercise imagination may be an obstacle to appreciating the relevance and
importance of Ashby’s work.
Ashby was concerned not with simple phenomena or with unorganized complexity (e.g.,
molecules of gas in a container) but rather with organized complexity, including brains,
organisms, and societies. His approach to studying organized complexity was unusual.
Rather than building a more complex structure by assembling components, Ashby chose to
look for constraints or interaction rules which reduce the maximum possible variety to the
variety actually observed. Laws, whether scientific or parliamentary, are examples of
constraints, which reduce variety from what can be imagined to what is observed.
3. Level of theorizing
Ashby’s level of theorizing was unusual. His interdisciplinary theories are more general or
abstract than the theories in most disciplines. Consequently, it can be said that his theories lie
at a level of abstraction between the theories in disciplines such as biology, psychology, and
economics and more general fields such as philosophy or mathematics.
However, theories at a more general level are neither sufficient nor necessary. A more
generally worded theory is not sufficient because “domain-specific knowledge,” which is
obtained from discipline-based theories, is still needed in order to apply the theory in
practice.
Also, a more generally worded theory is often perceived as not being necessary. That is, if a
scientist is interested only in one field, a theory worded in more general terms may be seen as
contributing nothing essential to his or her field. In discipline-based universities only a few
people are genuinely interested in more than one field. So, few people feel a need for more
general theories. Furthermore, a factor limiting the growth of cybernetics in the United States
is that Americans look for meaning through examples or applications. Europeans are more
likely to search for meaning in more general conceptualizations (Umpleby 2005). Ashby’s
theories are very helpful to scientists who are interested in knowing how the theories in two
or more fields are similar. In this way his theories aid the transfer of ideas from one field to
another. This is why his theories have been of great interest to systems scientists and
cyberneticians.
Because they are so general, Ashby’s theories are parsimonious: they explain a large
number of phenomena using few statements. Although
Ashby’s theories have been criticized for being so general they are tautological (Berlinski
1976), an alternative view is that his theories are axiomatic or definitional. It is remarkable
that Ashby was able to formulate theories that work for so many domains. Discipline-based
theories do not. One can take the formal structure and operationalize it in many fields.
Ashby's general theories then become a tool for developing more specific, operationalizable
theories in specific disciplines.
4. Ashby’s epistemology
One interesting feature of Ashby’s work is that it is compatible with second order cybernetics
(the idea that the observer should be included within the domain of science) even though
Ashby never directly addressed the issue of the observer or second order cybernetics. Indeed,
Heinz von Foerster created the phrase “second order cybernetics” in 1974 after Ashby’s death
in 1972. To understand Ashby’s epistemology, it is important to be familiar with the terms
he used and his definitions. What is observed, he called the “machine.” For Ashby, “the
system” is an internal conception of “the machine.” A “system” is a set of variables selected
by an observer. Ashby does not directly discuss the role of the observer in science or the
observer as a participant in a social system. But because he defines a system as a set of
variables selected by an observer, his work is quite compatible with second order cybernetics.
5. Regulation
As a person concerned with the successful functioning of brains, Ashby was concerned with
the general phenomenon of regulation. Ashby divides all possible outcomes into the goal
subset and the non-goal subset. The task of a regulator is to act in the presence of
disturbances so that all outcomes lie within the goal subset. In accord with the general nature
of his theories, systems that we recognize as regulators can be potentially defined on
organisms, organizations, nations, or any other objects of interest.
There are various types of regulators. An error-controlled regulator can be very simple, for
example a thermostat. A cause-controlled regulator requires a model of how the machine
will react to a disturbance. One consequence of Ashby’s view of regulation is the Conant and
Ashby theorem, “every good regulator of a system must be a model of that system” (Conant
and Ashby 1970). Von Foerster once said that Ashby told him this was the idea he was
looking for when he began his explorations in cybernetics.
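The thermostat example can be sketched in code. In the toy model below (the coefficients and names are illustrative assumptions, not from Ashby), an error-controlled regulator acts on the observed error so that, despite disturbances, the outcome stays within the goal subset.

```python
import random

def thermostat_step(temp, setpoint, disturbance):
    """Error-controlled regulation: act on the observed error, not on its cause."""
    error = temp - setpoint
    correction = -0.5 * error            # corrective action proportional to the error
    return temp + correction + disturbance

random.seed(0)
setpoint = 20.0
temp = 15.0                              # start outside the goal subset

def in_goal_subset(t):
    return abs(t - setpoint) <= 1.0      # the outcomes we count as success

for _ in range(50):
    temp = thermostat_step(temp, setpoint, random.uniform(-0.3, 0.3))

print(in_goal_subset(temp))              # -> True
```

A cause-controlled regulator would instead need a model predicting the effect of each disturbance before it acted, which is where the Conant and Ashby theorem enters.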
6. Learning
For Ashby learning involved the adoption of a pattern of behavior that is compatible with
survival. He distinguished learning from genetic change. Genes determine behavior directly,
and genetically controlled behavior changes slowly. Learning, on the other hand, is an
indirect method of regulation. In organisms that are capable of learning, genes do not
determine behavior directly. They merely create a versatile brain that is able to acquire a
pattern of behavior within the lifetime of the organism. As examples, Ashby noted that the
genes of a wasp tell it how to catch its prey, but a kitten learns how to catch mice by
pursuing them. Hence, in more advanced organisms the genes delegate part of their control
over the organism to the environment. Ashby’s Automatic Self-Strategizer is both a blind
automaton going to a steady state, at which it sticks, and a player that “learns” from its
environment until it always wins (Conant 1981, 373-6).
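The logic of such a device can be sketched as blind trial and error: a behavior is retained while it succeeds and replaced at random when it fails. The game below is my own illustration, not Ashby's actual design.

```python
import random

random.seed(1)

WINNING_MOVE = 2                  # unknown to the automaton; only win/lose feedback is given

def play(move):
    return move == WINNING_MOVE   # True means the automaton won the round

strategy = random.choice(range(4))
trials = 0
while not play(strategy):         # losing triggers a blind change of behavior
    strategy = random.choice(range(4))
    trials += 1

# Winning is a steady state: the behavior that succeeds is the one that persists.
print(strategy)                   # -> 2
```

The automaton is at once a blind mechanism running to a steady state and a player that, seen from outside, has "learned" to win.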
7. Adaptation
As a psychiatrist and director of a psychiatric hospital, Ashby was primarily interested in the
problem of adaptation. In his theory of adaptation two feedback loops are required for a
machine to be considered adaptive (Ashby 1960). The first feedback loop operates frequently
and makes small corrections. The second feedback loop operates infrequently and changes
the structure of the system, when the “essential variables” go outside the bounds required for
survival. As an example, Ashby proposed an autopilot. The usual autopilot simply maintains
the stability of an aircraft. But what if a mechanic miswires the autopilot? This could cause
the plane to crash. An “ultrastable” autopilot, on the other hand, would detect that essential
variables had gone outside their limits and would begin to rewire itself until stability
returned, or the plane crashed, depending on which occurred first.
The first feedback loop enables an organism or organization to learn a pattern of behavior
that is appropriate for a particular environment. The second feedback loop enables the
organism to perceive that the environment has changed and that learning a new pattern of
behavior is required. Ashby’s double loop theory of adaptation influenced Chris Argyris
(1982) who wrote about “double loop learning” and Gregory Bateson (1972) who coined the
term “deutero learning.”
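A minimal sketch of the two loops, with parameters of my own choosing: the inner loop repeatedly applies the current "wiring," and the outer loop fires only when the essential variable leaves its bounds, making blind structural changes until stability returns.

```python
import random

random.seed(2)

LIMIT = 10.0                                  # bounds on the essential variable

def settle(gain, x=1.0, steps=50):
    """Inner loop: frequent small steps with the current 'wiring' (gain).
    Returns True if the essential variable x stayed within its limits."""
    for _ in range(steps):
        x = gain * x
        if abs(x) > LIMIT:
            return False                      # essential variable out of bounds
    return True

gain = random.choice([-1.5, -0.5, 0.5, 1.5])  # possibly a 'miswired' gain
rewirings = 0
while not settle(gain):                       # outer loop: fires only on failure
    gain = random.choice([-1.5, -0.5, 0.5, 1.5])   # blind structural change
    rewirings += 1

# Only a stable wiring (|gain| < 1) survives the outer loop.
print(abs(gain) < 1.0)                        # -> True
```

Like the ultrastable autopilot, the system keeps rewiring itself at random until the essential variable stays within its limits.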
The effectiveness of the double loop conceptualization is illustrated by the great success of
quality improvement methods within the field of management. Probably no set of
management ideas in recent years has had a greater impact on the relative success of firms
and the relative competitiveness of nations. This success is indicated by the international
acceptance of the ISO 9000 standard as a minimum international model of management and
the creation of quality improvement awards in Japan, the U.S., Europe, and Russia to identify
the best companies to emulate. The basic idea of quality improvement is that an organization
can be thought of as a collection of processes. The people who work IN each process should
also work ON the process, in order to improve it. That is, their day-to-day work involves
working IN the process (the first, frequent feedback loop). And about once a week they meet
as a quality improvement team to consider suggestions and to design experiments on how to
improve the process itself. This is the second, less frequent feedback loop that leads to
structural changes in the process. Hence, process improvement methods, which have been so
influential in business, are an illustration of Ashby’s theory of adaptation.
8. Intelligence
Ashby defined “intelligence” as appropriate selection. He asked, “can a mechanical chess
player outplay its designer?” His answer was that a machine could outplay its designer if it
were able to learn from its environment (Conant
1981). Furthermore, intelligence can be amplified through a hierarchical arrangement of
regulators. The lower level regulators perform specific regulatory tasks many times. The
higher level regulators decide what rules the lower level regulators should use. A
bureaucracy is an example. Gregory Bateson said that cybernetics is a replacement for small
boys, since in earlier days small boys were given the tasks of putting another log on the fire,
turning over an hour glass, etc. Such simple regulatory tasks are now usually performed by
machines, which are designed using ideas from cybernetics.
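The hierarchical arrangement of regulators can be sketched as follows; the two-level setup and its rules are my own illustration. The higher level decides only which rule to use, while the lower level applies the chosen rule many times.

```python
# Low-level regulators: simple fixed rules, each applied many times.
rules = {
    "warm_up":  lambda t: t + 1.0,
    "cool_off": lambda t: t - 1.0,
}

def high_level(t):
    """The higher-level regulator decides only WHICH rule the lower level uses."""
    return "warm_up" if t < 20.0 else "cool_off"

t = 12.0
for _ in range(100):
    # Many low-level regulatory acts, each steered by one high-level decision.
    t = rules[high_level(t)](t)

print(t)   # t has been driven to, and held near, the 20.0 target
```

One high-level selection governs many low-level acts, which is the sense in which the hierarchy amplifies regulation.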
9. The law of requisite variety
The law of requisite variety is sometimes called Ashby’s law. It is probably his most widely
known contribution to science. One can explain the law either as a relationship between
information and selection or as a relationship between a regulator and the system being
regulated. In terms of a relationship between information and selection, the law of requisite
variety says that the amount of selection that can be performed is limited by the amount of
information available. Once one has exhausted the information available, no further rational
grounds for selection exist. For example, universities routinely require applicants to submit
not only their grades in earlier schooling but also their scores on standardized tests.
Recommendations are also required. If such information is not provided, no rational grounds
for selection exist.
In terms of the relationship between a regulator and the system being regulated, the law of
requisite variety says that the variety in a regulator must be equal to or greater than the
variety in the system being regulated. For example, when buying a computer, one first
estimates the size of the task – the data storage space and speed required – and then buys a
computer with at least that capacity. A smaller computer would not be adequate. As a
second example, when a manager supervises employees, the manager must pay attention to
only some of the behavior of the employees; otherwise the manager will not be able to
control the variety the employees can generate. “Management by exception” refers to the
practice in which a manager trains subordinates to handle various tasks. When they
encounter a task they have not been trained for, they ask the manager. The result is that each
employee interacts with the manager only occasionally; and the manager is able to supervise
several subordinates.
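The regulator/disturbance form of the law can be checked by brute force on a small game of the kind Ashby used, in which the outcome is determined jointly by a disturbance and the regulator's response. The particular outcome table below is my own illustration: with D disturbances and R responses, the outcome cannot be confined to fewer than D/R values (rounded up), so only a regulator with requisite variety can hold the outcome to a single goal value.

```python
from itertools import combinations

def min_outcome_variety(D, R):
    """Smallest goal set a variety-R regulator can confine outcomes to, when a
    variety-D disturbance acts and the outcome table is outcome = (d + r) % D."""
    for k in range(1, D + 1):
        for goal in combinations(range(D), k):
            # Feasible if, for every disturbance, some response lands in the goal set.
            if all(any((d + r) % D in goal for r in range(R)) for d in range(D)):
                return k
    return D

# Outcome variety cannot fall below D / R: doubling the regulator's variety
# halves the variety that reaches the outcome.
for R in (1, 2, 3, 6):
    print(R, min_outcome_variety(6, R))
```

Only when R equals D (here, six responses against six disturbances) can the regulator pin the outcome to a single value: only variety can destroy variety.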
The law of requisite variety has some important implications. When confronted with a
complex situation, there are only two choices – increase the variety in the regulator, usually
by hiring staff, or reduce the variety in the system being regulated. The second strategy is
possible because the observer defines “the system.”
In an earlier article (Umpleby 1990) I described four strategies of regulation: 1) one-to-one
regulation of variety, for example football or war; 2) one-to-one regulation of disturbances,
for example crime control in a city (2/1,000); 3) changing the rules of a game, for example
government regulation of industry (1/640,000); 4) changing the game, for example the
global models produced by the Club of Rome in the 1970s (12/4 billion). The global models
focused on population, resources, and environment rather than the ideological competition of
the Cold War (Meadows, et al., 1974). As the subject of attention moves from the concrete to
the conceptual the impact of decisions increases. By choosing a more conceptual strategy,
rather than a more direct and immediate strategy, it becomes possible to regulate a very large
system, such as the global economy. In the example above the difference in regulatory
capability between any two steps is a factor of about one thousand. However, the same
strategies can be used in managing a household or managing an organization. The law of
requisite variety says that variety must be controlled, if successful regulation is to be
achieved, but variety need not be controlled directly. If one is clever in creating
conceptualizations and organizational structures, the amount of variety that can be controlled
can be very large.
10. Self-organizing systems
In the 1950s the concept of self-organization was of interest due to a debate over whether one
should program machines that would behave in an intelligent manner or design machines that
would learn from their environments and hence organize themselves. At a 1956 conference
at Dartmouth College, people in the field of artificial intelligence chose the first
strategy. Cyberneticians chose to continue studying neurophysiology in order to better
understand learning and human cognition.
Three conferences on self-organization were held around 1960. At the time a self-organizing
system was thought to interact with and be organized by its environment. However, Ashby
formulated a different conception of self-organization: “every isolated, determinate, dynamic
system obeying unchanging laws will develop organisms that are adapted to their
environments” (Ashby 1962). He explained the idea as follows: Imagine a system. It has
unstable states and stable, equilibrial states. Over time it will go toward the stable, equilibrial
states. As it does so, it selects, thereby organizing itself. Such a system is open to energy (it
is dynamic) but closed to information (the interaction rules among the elements of the system
do not change). At about the same time Heinz von Foerster, with his example of the
magnetic cubes in a box, explained how such a system could generate more complex entities
(von Foerster 1962).
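Ashby's conception can be simulated directly: fix a determinate next-state mapping (the rules never change, so the system is closed to information) and watch the variety of occupied states shrink as trajectories run down to the stable, equilibrial states. The mapping below is randomly generated but fixed; it is my own illustration.

```python
import random

random.seed(3)

N = 100
# Fixed, unchanging interaction rules: a determinate next-state mapping.
transition = [random.randrange(N) for _ in range(N)]

states = set(range(N))          # begin with maximal variety: every state occupied
before = len(states)
for _ in range(2 * N):          # let the isolated system run under its own laws
    states = {transition[s] for s in states}
after = len(states)

# Variety has shrunk: trajectories have fallen into the equilibrial states.
print(before, after)
```

The system "selects" its stable states merely by running, which is the sense in which an isolated, determinate system organizes itself.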
Interest in self-organizing systems reemerged in the 1980s and 1990s as a result of interest in
cellular automata, fractals, and chaos theory. Although there clearly were new techniques
available for computer simulation, it is surprising that so little reference was made to the
basic theoretical work done in the 1960s (Asaro 2007). Ashby’s definition of self-
organization is different from the earlier definition. The earlier definition of self-organization
is what one finds in the literature on complexity where it is possible to speak of self-
organizing, adaptive systems (Waldrop 1992). In Ashby’s definitions an adaptive system is
open to information, but a self-organizing system is closed to information (the interaction
rules do not change during the period of observation).
The principle of self-organization is an example of Ashby’s talent for formulating general
principles. His principle of self-organization is a more general version of Adam Smith’s
theory that industrial firms will compete to bring to market products desired by customers,
Charles Darwin’s theory of natural selection among organisms and species, Karl Popper’s
theory of scientific progress by means of conjectures and refutations, and B.F. Skinner’s
theory that behavior modification can be achieved through rewards and punishments. In each
case variation is subjected to selection in a competitive environment.
Furthermore, the principle of self-organization leads to a general design rule – to manipulate
any object, expose it to an environment, such that the interaction rules between the object and
its environment change the object in the desired direction. This type of regulation relies not
on changing the object directly but rather on changing the environment of the object. For
example, to make steel from iron, put the iron in a blast furnace; to educate a child, send it to
school; to regulate behavior of individuals, administer rewards and punishments; to control
corporate behavior, pass laws and create regulatory agencies.
11. The future of Ashby’s legacy
Ross Ashby left a legacy of elegant theories of regulation, learning, adaptation, and self-
organization. He created a new level of theorizing about systems that process information and
perform selections. These theories have influenced many fields – computer science, robotics,
management, psychology, biology, sociology, political science, and the philosophy of
science. As a transdisciplinary field cybernetics serves as a catalyst for further developments
in many fields. That is the role that cybernetics and general systems theory have played until
now. However, when we think about the impact that these theories may have in the future, at
least two possibilities come to mind.
Just as physics provides a theory of matter and energy which is used in the various fields of
engineering, cybernetics may one day be seen as providing a theory of form and pattern for
the various fields of the social sciences, library science, computer science and design
disciplines such as architecture and public policy.
Also, more general theories hold great promise for Institutes of Advanced Study, which are
becoming common on university campuses as ways of fostering interdisciplinary
communication. Indeed, John Warfield has suggested that such institutes should offer their
own degrees and that systems science and cybernetics should be the core curriculum. He
proposes that the modern university should be thought of as consisting of three colleges. The
Heritage College would consist of those fields that teach what we have learned in the past –
the sciences, the humanities, and the arts. The Professional College would consist of the
applied fields – engineering, law, medicine, business, and agriculture. The Horizons College
would be concerned with the future and with design. It would integrate the knowledge of the
other two colleges and bring people together to work on problems that do not yield to
disciplinary analyses and solutions (Warfield 1996).
Despite the fact that more general theories are more valuable because they explain more
phenomena with fewer statements, Ashby’s theories have not received as much attention as
they deserve. The reason no doubt lies in the traditions in universities that enforce narrow
specialization. However, as knowledge grows and an integrated understanding is needed to
cope with the problems of a global society, probably increased attention will be paid to more
general theories. When that day comes, Ashby’s work will receive renewed attention and
acclaim.
12. Acknowledgement
This article benefited from helpful comments by George Klir and Peter Asaro, for which the
author is grateful.
13. References
Argyris, C., 1982. Reasoning, Learning and Action: Individual and Organizational.
San Francisco: Jossey-Bass.
Asaro, P., 2007. Heinz von Foerster and the Bio-Computing Movements of the 1960s. In:
A. Müller and K.H. Müller, eds. An Unfinished Revolution? Heinz von Foerster and
the Biological Computer Laboratory | BCL 1958-1976. Vienna: Edition Echoraum.
Ashby, W.R., 1956. An Introduction to Cybernetics. London: Chapman and Hall.
Ashby, W.R., 1960. Design for a Brain: The Origin of Adaptive Behavior. London:
Chapman and Hall.
Ashby, W.R., 1962. Principles of the Self-Organizing System. In: H. von Foerster and G.
Zopf, eds. Principles of Self-Organization. New York: Pergamon Press, 255-278.
Bateson, G., 1972. Steps to an Ecology of Mind. New York: Ballantine.
Berlinski, D., 1976. On Systems Analysis: An Essay Concerning the Limitations of Some
Mathematical Methods in the Social, Political, and Biological Sciences. Cambridge,
MA: MIT Press.
Conant, R. and Ashby, W.R., 1970. Every good regulator of a system must be a model of
that system. International Journal of Systems Science, 1 (2), 89-97.
Conant, R.C., 1981. Mechanisms of Intelligence: Ross Ashby’s Writings on Cybernetics.
Seaside, CA: Intersystems Publications.
Krajewski, W., 1977. Correspondence Principle and Growth of Science. Boston: D. Reidel
Pub. Co.
Meadows, D.L., Behrens, W.W., III, Meadows, D.H., Naill, R.F., Randers, J., and Zahn,
E.K.O., 1974. Dynamics of Growth in a Finite World. Cambridge, MA: Wright-
Allen Press.
Shannon, C., 1964. The Mathematical Theory of Communication. Urbana: University of
Illinois Press.
Umpleby, S.A., 1990. Strategies for Regulating the Global Economy. Cybernetics and
Systems, 21 (1), 99-108.
Umpleby, S.A., 2005. What I Learned from Heinz von Foerster about the Construction of
Science. Kybernetes, 34 (1) (2), 278-294.
von Foerster, H., 1962. Self-Organizing Systems and their Environments. In: Yovits and
Cameron, eds. Self-Organization. New York: Pergamon Press, 31-50.
von Foerster, H., 1979. Cybernetics of Cybernetics. In: Krippendorff, K., ed.
Communication and Control. New York: Gordon and Breach, 5-8.
Waldrop, M., 1992. Complexity: The Emerging Science at the Edge of Order and
Chaos. New York: Simon & Schuster.
Warfield, J.N., 1996. The Wandwaver Solution: Creating the Great University [online].
George Mason University. Available from:
http://www.gmu.edu/departments/t-iasis/wandwaver/wandw.htm [Accessed 5 August 2008].
... Ross Ashby's (1956) theory of adaptive systems illuminated that a system's survival and stability necessitates the use of feedback mechanisms, which could be likened to a project team's motivation principles, with an ITP viewed as a complex system (Abyad, 2018;Umpleby, 2009). Practices in project management could include adequate training, positive feedback, and a sense of task ownership (Abyad, 2018). ...
... The general systems theory was developed in the 1900s by von Bertalanffy (1972), which elucidated that the ability of a system to achieve the purpose of its existence is influenced by the interaction of the system's elements because they are interconnected (Arnold & Wade, 2015;Sterman, 2001). Ross Ashby's (1956) general theory of adaptive systems extended the general systems theory focusing on systems dynamics to explore the mechanisms, in the form of feedback and interactions, which operate within the systems that ensure its survival, stability, and adaptability (Umpleby, 2009). ...
... The general systems theory elucidated that the ability of a system to achieve the purpose of its existence is influenced by the interaction of the system's elements because they are interconnected (Arnold & Wade, 2015;Sterman, 2001). Ross Ashby's (1956) general theory of adaptive systems extended the general systems theory focusing on systems dynamics to explore the mechanisms, in the form of feedback and interactions, which operate within the systems that ensure its survival, stability and adaptability (Umpleby, 2009). ...
Thesis
Full-text available
Globally, most information technology projects (ITPs) are reported as unsuccessful. Poor project management practices have consistently been identified as the leading cause of ITP failures. However, ITP practitioners manage project processes in diverse ways without clear guiding principles in terms of what does or does not work in practice for success. Process management practices in projects were explored in this grounded theory qualitative study from a systems theory perspective. The purpose was to understand from project practitioners' experiences what guiding principles potentially influenced ITPs to success. These experiences were then analyzed to develop a theory describing how to best use process management in projects to achieve success. The main research question addressed in this study examined participants’ view of practices in successful ITPs that effectively led to success. The data were collected during in-depth interviews of 14 project participants using semistructured questions and were coded using the grounded theory continuous-comparison approach until theoretical saturation and themes were generated. The finding is an emergent theory, which indicates that practices in ITP process management consisting of continuous learning, regular engagement, constant surveillance, process orchestration, and timely response positively impacts a successful outcome. Leveraging this finding, process management principles are recommended to better understand ITP process management in practice. This study contributes to positive social change by providing a guide for practitioners’ use, potentially resulting in more successful educational and healthcare ITPs, especially in Africa.
... Interaction rules serve as a mechanism through which individuals can seamlessly build learning relationships across hierarchical, functional, and structural boundaries. Interaction rules help to reduce conflict and complexity by allowing clear guidelines for sharing information across boundaries (Krippendorff, 2009;Umpleby, 2009). ...
Article
The multidimensional characteristic of learning has received little attention in the network literature, resulting in fragmented empirical evidence on learning networks. To address this gap, we introduce a framework that allows a better understanding of the multidimensionality of learning networks by employing the concept of multiplexity in the network literature. Our proposed conceptual framework for multiplex learning networks includes a 3-E typology (exploration, exploitation, and exaptation), which serve as distinct layers within the multiplex networks. We also provide a hypothetical scenario to demonstrate the potential of our multiplex learning networks framework for HRD scholars and practitioners. Moreover, we extend our framework to a multilevel model that connects individual-level learning relationships to team-level relationships. Our framework’s theoretical and practical implications are discussed, and future research directions are suggested.
... Individuals could regulate behavior by engaging in cyclical processes of self-observation (to diagnose ability, motivation, whether activities are perceived as positive/negative), judgment (perceiving one's own and social standards for performance, value one has for an activity, and whether one believes success occurs resulting from one's own ability), and self-reaction to appraise performance. This notion is based on negative feedback, like Ashby's cybernetics (Umpleby, 2009); with incongruity between performance and standards triggering self-regulation. ...
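The negative-feedback structure of self-regulation described in this snippet can be shown in a minimal sketch (the numbers and the proportional-adjustment rule are illustrative assumptions, not taken from the cited study): the gap between observed performance and a standard drives a correcting self-reaction, and repeated cycles close that gap.

```python
# Minimal negative-feedback sketch of the self-regulation cycle:
# observe performance, judge it against a standard, and react in
# proportion to the discrepancy (hypothetical gain and step count).
def self_regulate(performance, standard, gain=0.5, steps=10):
    trajectory = [performance]
    for _ in range(steps):
        error = standard - performance   # judgment: compare to the standard
        performance += gain * error      # self-reaction: close part of the gap
        trajectory.append(performance)
    return trajectory

traj = self_regulate(0.0, 10.0)
# each cycle halves the remaining incongruity, so the trajectory
# converges toward the standard
```

The point of the sketch is only the loop shape: regulation is triggered by incongruity between performance and standard, exactly as in Ashby's error-controlled feedback.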
Chapter
Full-text available
This three-part paper explores how the approaches of cybernetics (a field investigating how complex systems such as brains, individuals, societies, and machines navigate their realities) have influenced education and psychology over time. The first part recounts the establishment of first-order cybernetics and the emergence of an observer-driven approach to understanding the adaptation of living systems at the Macy Conferences. I suggest that psychology adopted the computational aspects of cybernetic models, paying attention to figure-ground relationships rather than emergent, integrated relationalities in human learning and adaptation, leading to the popularization of neuropsychological and information-processing approaches in the 1950s and 1960s. The second part outlines the emergence and sudden decline of second-order cybernetics through research efforts at the Biological Computer Laboratory, and suggests that psychology and education bifurcated from this approach during the Cognitive Revolution, producing social cognitive and cognitivist approaches, direct instruction, and prescribed outcomes for learning and mental models. The third part suggests that the aftereffects of the Cognitive Revolution led to a (re)interpretation of constructivist approaches through a cognitivist lens by scholars like Jerome Bruner, and outlines current efforts to embrace the ethos of second-order cybernetics in educational and psychological research and to treat learners as historical actors constantly evolving in a complex social world.
... The second feedback loop (i) operates infrequently, (ii) enables the organism to perceive that the environment has changed and that learning a new pattern of behaviour is required, and (iii) changes the structure of the system when the "essential variables" go outside the bounds required for survival. In the 1960s, the concept of self-organization became a central topic of discussion (Umpleby, 2008). The dilemma addressed was whether intellectualized systems should be programmed to behave in an intelligent and autonomous manner, or whether they should be designed to be able to learn from their performance, state, and environments and to manage themselves. ...
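The double-loop (ultrastable) architecture this snippet describes can be sketched in a few lines. The dynamics, bounds, and parameter range below are hypothetical choices for illustration, not Ashby's own equations: a frequent first loop runs the current behaviour, and a second loop fires only when the essential variable leaves its bounds, changing the system's structure at random until a viable pattern is found.

```python
import random

BOUNDS = (-10.0, 10.0)   # limits on the "essential variable"

def ultrastable_step(state, param, disturbance):
    # First loop (frequent): behaviour determined by the current structure.
    state = state + param * state + disturbance
    # Second loop (infrequent): if the essential variable leaves its
    # bounds, change the structure (pick a new parameter at random)
    # and pull the state back inside.
    if not BOUNDS[0] <= state <= BOUNDS[1]:
        param = random.uniform(-1.5, -0.5)   # any value in this range damps the state
        state = max(min(state, BOUNDS[1]), BOUNDS[0])
    return state, param

random.seed(0)
state, param = 1.0, 1.0   # start with an unstable structure (state doubles each step)
for _ in range(100):
    state, param = ultrastable_step(state, param, random.uniform(-1.0, 1.0))
```

Run with an initially unstable parameter, the state diverges, the second loop triggers, and the system settles on a structure that keeps the essential variable within bounds, which is the behaviour change the snippet describes.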
Article
Cyber-physical systems (CPSs) are seen as one of the tangible results of the convergence of advanced information technology, nanotechnology, biotechnology, cognitive science, and social science, in addition to conventional systems science, engineering, and technologies. Designing next-generation cyber-physical systems (NG-CPSs) is a challenging matter for many reasons. It is not possible to consider all of them and address their interplay simultaneously in one paper. Therefore, this position paper elaborates only on a selected number of topical issues and influential factors. The author claims that the shift of the paradigm of CPSs and the uncertainty related to the paradigmatic systems features of NG-CPSs are among the primary reasons. Since the future of CPSs will be influenced strongly by their intellectualization, adaptation/evolution, and automation, these aspects are also addressed. It is argued that interaction and cooperation with NG-CPSs should be seen from a multi-dimensional perspective and that socialization of NG-CPSs needs more attention in research. The need for aggregation, management, and exploitation of the growing amount of synthetic systems knowledge produced by smart CPSs is seen by the author as an important emerging concern.
... They provide controllable variety or features that guide activity towards certain outcomes. This criterion is derived from Ashby's law of requisite variety (Umpleby, 2009), which inspires models of self-regulation (Bandura, 1993). The law, explained by Ashby (1956) in Introduction to Cybernetics using the example of turn-taking games, suggests that to navigate the disturbances a social world produces, a system must have a set of responses at least as nuanced as these disturbances. ...
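The turn-taking-game version of the law mentioned in this snippet can be made concrete in a short sketch (the payoff table below is a hypothetical construction, not Ashby's own numbers): the outcome depends on both the disturbance and the regulator's response, and for a fixed response distinct disturbances yield distinct outcomes. Exhaustive search over regulator strategies then shows how much outcome variety each level of response variety can absorb.

```python
import itertools

D = 6                      # six possible disturbances

def outcome(d, r):
    # Hypothetical game table: for a fixed response r, distinct
    # disturbances map to distinct outcomes.
    return (d + r) % D

def best_outcome_variety(num_responses):
    """Smallest number of distinct outcomes any regulator strategy
    (a map disturbance -> response) can force."""
    best = D
    for strategy in itertools.product(range(num_responses), repeat=D):
        distinct = len({outcome(d, strategy[d]) for d in range(D)})
        best = min(best, distinct)
    return best
```

Only variety in responses can absorb variety in disturbances: with R responses against D disturbances, at least ceil(D / R) distinct outcomes always leak through, so a regulator with one response leaves all six outcomes possible, two responses leave three, and only a full set of six can pin the outcome down to one.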
Article
Full-text available
This three-part paper reinforces crosscurrents between cybernetician Gordon Pask’s work towards creating responsive machines applied to theater and education, and Vygotsky’s theory, to advance sociohistorical approaches into the Internet age. We first outline Pask’s discovery of possibilities of a neoclassical cybernetic framework for human-human, human-machine, and machine-machine conversations. The second part outlines conversation theory as an elaboration of the reconstruction of mental models/concepts by observers through reliance on sociocultural psychological approaches, and applies concepts like the zone of proximal development, and perezhivanie to Paskian aesthetic technologies. The third part interprets Pask’s teaching/learning devices as zones of proximal development, and outlines how Paskian algorithms in digital devices like THOUGHTSTICKER have been generalized on today’s Internet, supplemented by corporate interests. We conclude Paskian theory may offer understandings of the roles of Internet technologies in transforming human thinking, and suggest (re)designing tools incorporating algorithms contextually advancing conceptual understanding deviating from current indexing approaches.
... We have not yet been in a combat situation with human-machine teams or autonomous vessels alone, but past experiences, e.g., the shootdown of an Iranian civilian aircraft (the USS Vincennes incident) or the two 2017 collisions that left 17 sailors dead, tell us something about the possibilities of emergent behaviors at the low end of the technology. This discussion invokes Ashby's Law of Requisite Variety (Umpleby, 2004) and the related Conant-Ashby theorem ("every good regulator of a system must be a model of that system") to assure, without fail, that emergent behavior is addressed if and when it is manifested. The "not knowing the future" in past events was ignorance coupled with "what might happen." ...
Preprint
Full-text available
A classic idea from the cybernetics literature is the good regulator approach first formulated by Conant and Ashby in 1970, who proposed a formalism for good regulation. The Every Good Regulator Theorem (EGRT) provides a unique perspective on intelligent autonomous learning systems reliant on a type of compressed global representation (world model). We will discuss the concepts of modeling and regulation in the original EGRT, which requires revisiting the historical and technical underpinnings relevant to regulating a system with communication, equilibrium, and feedback. A homeomorphic mapping between the controller and the controlled system (or model) provides a reduced representation that preserves useful variety for all possible outcomes of a system. Several toy models challenge the notion of tightly-coupled good regulation, and demonstrate how diverse models of physical systems can address the challenges of far-from-equilibrium and out-of-distribution phenomena. Of particular interest are learning systems that utilize physical phenomena such as diffusion, criticality, phase transitions, rotational forces, and bifurcation. The EGRT is then connected to a sampling of approaches and trends in machine learning (ML), deep learning (DL), and reinforcement learning (RL). We aim to recast the EGRT as a modern tool for ML and RL architectures by considering the role of good regulation and complexity in understanding the performance of intelligent systems.
Article
Full-text available
The subjectivity of mental illness is better understood and treated when we approach its complexity within the framework of information theory. The nervous structure itself is organized evolutionarily as information processors at several levels of the organismic economy, culminating in psychic life. Our mental life is thus a dynamic complex of information that shapes the Self. Disturbances in the logic of how this information interacts can bring about psychic alterations, as can alterations in the neural processing pathways. Psychiatry must therefore operate along two parallel tracks for both diagnosis and treatment. As our understanding of how the psyche is informationally generated and organized expands, we are gaining important insights into the origin of mental illnesses and their interrelations, creating more objective psychotherapies and more efficient therapeutic protocols.
Article
Full-text available
This article attempts to frame the various areas of a city's functioning through General Systems Theory, developed by Bertalanffy in 1937 yet still current, and even gaining in importance in an ever faster-moving world. The chapter addresses the mutual interactions of the business market, the housing market, and the real-estate market. The relationship between these three areas is so obvious and strong that it is often simply omitted from various analyses. The article presented in this publication tries to show the relations between these systems and their influence on the city's development. The research method adopted is an analysis of texts, materials, and data from economics and philosophy, together with cited fragments of the author's own research conducted in particular years.
Article
In many tragic plays, the protagonist is brought down by a disaster that is a consequence of the protagonist's own error, his or her hamartia, the tragic flaw. Tragic flaws are disconcerting to the audience because they are not known or fully recognized by the protagonist—at least not until it is too late. In this essay, I take tragic flaws to be unreliable belief-forming dispositions that are unrecognized by us in some sense. I describe some different types of flaws and consider what we might do about them. Then I examine three types of policies for managing our tragic flaws: doxastic, dispositional, and methodological.
Article
Full-text available
Recent work in the science of cybernetics has identified four separate strategies for regulating complex systems composed of thinking participants. Using these strategies as a foundation, this article reviews the history of global development, summarizes current concerns, and then identifies several possible courses of action for regulating a global economy.
Chapter
Today, the principles of the self-organizing system are known with some completeness, in the sense that no major part of the subject is wholly mysterious. We have a secure base. Today we know exactly what we mean by "machine", by "organization", by "integration", and by "self-organization". We understand these concepts as thoroughly and as rigorously as the mathematician understands "continuity" or "convergence". In these terms we can see today that the artificial generation of dynamic systems with "life" and "intelligence" is not merely simple; it is unavoidable if only the basic requirements are met. These are not carbon, water, or any other material entities but the persistence, over a long time, of the action of any operator that is both unchanging and single-valued. Every such operator forces the development of its own form of life and intelligence. But will the forms developed be of use to us? Here the situation is dominated by the basic law of requisite variety (and Shannon's Tenth Theorem), which says that achieving appropriate selection (to a degree better than chance) is absolutely dependent on the processing of at least that quantity of information. Future work must respect this law, or be marked as futile even before it has started. Finally, I commend as a program for research the identification of the physical basis of the brain's memory stores. Our knowledge of the brain's functioning is today grossly out of balance. A vast amount is known about how the brain goes from state to state at about millisecond intervals; but when we consider our knowledge of the basis of the important long-term changes we find it to amount, practically, to nothing. I suggest it is time that we made some definite attempt to attack this problem. Surely it is time that the world had one team active in this direction?
Book
1. Correspondence Principle.- 1.1. Bohr's Principle.- 1.2. The Attitude of Philosophers.- 1.3. A General Methodological Principle in Physics.- 1.4. Descriptive and Normative Versions.- 1.5. Some Logical Difficulties.- Notes to Chapter 1.- 2. Idealization and Factualization.- 2.1. Scientific Law as an Implication.- 2.2. Factual and Idealizational Laws.- 2.3. Idealization in Science.- 2.4. The Attitude of Philosophers.- 2.5. Idealization and Factualization.- 2.6. Idealization and Essence.- 2.7. Some Controversial Issues.- Notes to Chapter 2.- 3. Reduction.- 3.1. The Concept of Reduction.- 3.2. Heterogeneous Reduction.- 3.3. Non-Mechanistic Reductionism.- 3.4. Trivial Homogeneous Reduction.- 3.5. Non-Trivial Homogeneous Reduction.- 3.6. Reduction of an Idealizational Law to a Factual One.- Notes to Chapter 3.- 4. Correspondence Relation.- 4.1. Definition.- 4.2. Simple Implicative Version.- 4.3. Approximative Version.- 4.4. Explanative Version.- 4.5. 'Dialectical' Version.- 4.6. Renewed Implicative Version.- 4.7. Some Formal Features.- 4.8. Correspondence Sequence and Correspondence Network.- Notes to Chapter 4.- 5. The Problem of the Incommensurability and Relations Among Theories.- 5.1. The Claim of Incommensurability.- 5.2. The Problem of Meaning Variance.- 5.3. The Problem of 'Untranslatable' Languages.- 5.4. The Problem of the 'Theory-Ladenness' of Facts.- 5.5. Various Relations Among Theories.- Notes to Chapter 5.- 6. The Types of Methodological Empiricism.- 6.1. Inductivism.- 6.2. Hypothetism.- 6.3. Pluralistic Hypothetism.- 6.4. Idealizational Hypothetism.- 6.5. Pluralistic Idealizational Hypothetism.- 6.6. A Confrontation: the Diversity of Methods.- Notes to Chapter 6.- 7. Revolutions and Continuity.- 7.1. Simple Cumulativism (No Revolutions or One Revolution).- 7.2. Simple Anticumulativism (Permanent Revolution or Occasional Revolutions Without Continuity).- 7.3. A Dialectical View (Revolutions and Continuity).- 7.4. The Threshold of Maturity (Two Kinds of Revolutions).- 7.5. Periods of Evolution and of Revolution.- 7.6. The Concept of Revolution and Anti-Cumulative Changes.- Notes to Chapter 7.- 8. Relative and Absolute Truth.- 8.1. Relative Truth.- 8.2. Absolute Truths in Science.- 8.3. Truth-Content and Approximate Truth.- 8.4. The Truth of Idealizational Laws and of Their Factualizations.- 8.5. Relative Truth and Essence.- 8.6. Towards the Absolute Truth.- Notes to Chapter 8.- 9. Internal and External History of Science.- 9.1. Internal and External Factors.- 9.2. The Problem of the Methodological Historicism.- 9.3. Internal History as an Idealization.- Notes to Chapter 9.- Index of Names.
Article
Purpose – To report on an empirical study in psycholinguistics that revealed a difference between European and American patterns of thinking, and to provide a brief history of a 30-year effort to modify the philosophy of science in order to make it more suitable as a guide to doing research in the social sciences. Design/methodology/approach – Assesses the approach of Heinz von Foerster, who practiced a deductive style of science rather than an American empirical one. Furthermore, von Foerster was willing to modify not only science but also the philosophy of science. By proposing that scientists pay attention to the observer as well as the observed, he added a dimension to the philosophy of science which affects all disciplines. Findings – Proposes an additional dimension that might be added to the philosophy of science. Paying attention to both the observer and the receiving society suggests a communication metaphor rather than the photograph metaphor that has prevailed in the philosophy of science. Examining the philosophical underpinnings of science, rather than just testing or extending an existing theory, is a type of inquiry that springs from von Foerster's enthusiasm for tackling interesting problems unimpeded by disciplinary boundaries. Originality/value – An assessment of von Foerster's contribution to the multidisciplinary approach to science.
Article
The design of a complex regulator often includes the making of a model of the system to be regulated. The making of such a model has hitherto been regarded as optional, as merely one of many possible ways. In this paper a theorem is presented which shows, under very broad conditions, that any regulator that is maximally both successful and simple must be isomorphic with the system being regulated. (The exact assumptions are given.) Making a model is thus necessary. The theorem has the interesting corollary that the living brain, so far as it is to be successful and efficient as a regulator for survival, must proceed, in learning, by the formation of a model (or models) of its environment.
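The theorem's claim can be illustrated with a toy sketch (the system and its mapping below are hypothetical, not from the paper): to hold the outcome of a system constant, the regulator's policy must invert the system's own mapping from disturbance and response to outcome, so the policy is, in effect, built from a model of the system.

```python
N = 5   # five possible disturbances

def f(d, r):
    # The regulated system: the outcome depends on both the
    # disturbance d and the regulator's response r.
    return (d + r) % N

def h(d):
    # A good regulator's policy: derived from knowledge of f so
    # that f(d, h(d)) is the same constant for every disturbance.
    return (-d) % N

# The policy holds the outcome at 0 for all disturbances.
assert all(f(d, h(d)) == 0 for d in range(N))
```

Note that `h` only works because it encodes how `f` responds to each disturbance; a regulator ignorant of `f` has no way to choose responses that keep the outcome constant, which is the sense in which a maximally successful and simple regulator must model the system.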
Article
This book is not a treatise on all cerebral mechanisms but a proposed solution of a specific problem: the origin of the nervous system's unique ability to produce adaptive behaviour. The work has as basis the fact that the nervous system behaves adaptively and the hypothesis that it is essentially mechanistic; it proceeds on the assumption that these two data are not irreconcilable. It attempts to deduce from the observed facts what sort of a mechanism it must be that behaves so differently from any machine made so far. Other proposed solutions have usually left open the question whether some different theory might not fit the facts equally well: I have attempted to deduce what is necessary, what properties the nervous system must have if it is to behave at once mechanistically and adaptively. The concepts of organisation, behaviour, change of behaviour, part, whole, dynamic system, co-ordination, etc.--notoriously elusive but essential--were successfully given rigorous definition and welded into a coherent whole. But the rigour and coherence depended on the mathematical form, which is not read with ease by everybody. As the basic thesis, however, rests on essentially commonsense reasoning, I have been able to divide the account into two parts. The main account (Chapters 1-18) is non-mathematical and is complete in itself. The Appendix (Chapters 19-22) contains the mathematical matter.