The concept of fuzzy information becomes a cornerstone of processing and handling linguistic data. In contrast to numeric information, whose processing is well understood and supported by a vast number of algorithms, entering the area of linguistic information processing immediately confronts us with a genuine need to revisit the fundamental concepts. We first review the notion of information granularity as a primordial concept playing a key role in human cognition. Dwelling on that, the study embarks on the concept of communication with fuzzy sets. In particular, we discuss a so-called fuzzy communication channel. The idea of communication exploiting fuzzy information calls for its efficient encoding and decoding, which subsequently leads to minimal losses of transmitted information. Interestingly enough, the incurred losses depend heavily on the granularity of the linguistic information involved; in this way one can take advantage of the level of uncertainty residing within the transmitted information granules.
Studying the multiscale relationship between the power-law distribution and the correlation of the Shanghai Securities Exchange Composite Index (SSECI) through a 5-min returns dataset, we find that the accelerating decay of the power-law distribution has little effect on the correlation properties of returns, but stabilizes the correlation of absolute returns. Furthermore, we also analyze the impacts of short-range, nonlinear, and nonlinear long-range correlation on the power-law properties of returns by constructing three different types of shuffled series. The empirical results indicate that, besides correlation, time scale may also be an important factor in maintaining the power-law property of returns. When the time scale is below the critical point, the effects of correlation are robust, with nonlinear long-range correlation exerting the strongest influence. However, for time scales beyond the critical point, the impact of correlation begins to diminish and eventually disappears, and the power-law property becomes completely time-scale dependent.
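The shuffling methodology used above can be illustrated with a minimal sketch (an illustration of the general idea only, not the paper's three shuffling constructions): randomly permuting a return series preserves its distribution, and hence any power-law tail, while destroying temporal correlations.

```python
import random

def autocorr(x, lag=1):
    """Lag-k sample autocorrelation of a sequence."""
    n = len(x)
    mu = sum(x) / n
    var = sum((v - mu) ** 2 for v in x)
    cov = sum((x[i] - mu) * (x[i + lag] - mu) for i in range(n - lag))
    return cov / var

def shuffled(x, seed=0):
    """Random shuffle: keeps the return distribution (hence any power-law
    tail) unchanged while destroying all temporal correlations."""
    y = list(x)
    random.Random(seed).shuffle(y)
    return y
```

Comparing `autocorr` before and after shuffling separates distributional properties from correlation properties, which is the logic behind the shuffling tests described in the abstract.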
Manufacturing is a key to continuous economic growth. Fuzzy expert systems, fuzzy logics, fuzzy languages, fuzzy neural networks, and intelligent control are proposed as additional tools in manufacturing. Fuzzy logic is a new way to program computers and appliances to mimic the imprecise way humans make decisions. Fuzzy logic has been applied to cameras, subways, computers and air conditioners. Through the use of fuzzy logic, fuzzy expert systems can be built which add a new dimension in the technologies for intelligent factories.
"Mathematicians, like physicists, are pushed by a strong fascination.
Research in mathematics is hard, it is intellectually painful even if it is
rewarding, and you would not do it without some strong urge." [D. Ruelle]. We
shall give some examples from our experience, when we were able to simplify
some serious mathematical models to make them understandable by children,
preserving both aesthetic and intellectual value. The latter is in particular
measured by whether a given simplification allows setting a sufficient list of
problems feasible for school students.
The defective coin problem involves identifying the defective coin, if any, and ascertaining the nature of the defect (heavier/lighter) in a set of coins containing at most one defective coin, using an equal-arm pan balance. The strategy for minimising the number of weighings required to detect the defective coin is based on a problem-reduction approach, involving successive decomposition of the problem into subproblems until it is trivially solved. One of the two types of subproblems is visualised as a combination of a pair of antithetic problems, leading to an optimal solution procedure which is simply a term-by-term merger of the corresponding antithetic procedures. The algorithm is also capable of generating all possible optimal solutions.
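The lower bound on the number of weighings follows from the three possible outcomes of each weighing: w weighings can handle at most (3^w − 3)/2 coins when the nature of the defect must also be reported. A small sketch of this classical counting bound (the function name is ours, not the paper's):

```python
def min_weighings(n):
    """Minimum number of equal-arm pan-balance weighings needed to locate
    the (at most one) defective coin among n coins and tell whether it is
    heavier or lighter.  Each weighing has three outcomes, and the classic
    counting argument gives a capacity of (3**w - 3) / 2 coins for w
    weighings."""
    w = 1
    while (3 ** w - 3) // 2 < n:
        w += 1
    return w
```

For the familiar 12-coin puzzle this gives three weighings, which is exactly what the optimal decomposition procedures described above achieve.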
Purpose: The mismatches between political discourse and military momentum in the American handling of the Cuban missile crisis are explained by using the model of the potential autopoiesis of subsystems. Under wartime conditions, the codes of political and military communications can increasingly be differentiated. Design/methodology/approach: The model of a further differentiation between political and military power is developed on the basis of a detailed description of the Cuban missile crisis. We introduce the concept of a "semi-dormant autopoiesis" for the difference in the dynamics between peacetime and wartime conditions. Findings: Several dangerous incidents during the crisis can be explained by a sociocybernetic model focusing on communication and control, but not by using an organization-theoretical approach. The further differentiation of the military as a subsystem became possible in the course of the twentieth century because of ongoing learning processes about previous wars.
Ashby's law of requisite variety states that a controller must have at least
as much variety (complexity) as the controlled. Maturana and Varela proposed
autopoiesis (self-production) to define living systems. Living systems must
also fulfill the law of requisite variety. A measure of autopoiesis has
been proposed as the ratio between the complexity of a system and the
complexity of its environment. Self-organization can be used as a concept to
guide the design of systems towards higher values of autopoiesis, with the
potential of making technology more "living", i.e. adaptive and robust.
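A hedged sketch of such a ratio measure (our own minimal formalization: we use normalized Shannon entropy and the common order/chaos balance C = 4E(1 − E) as the complexity of a symbol sequence; the published proposals differ in detail):

```python
import math
from collections import Counter

def entropy(seq):
    """Normalized Shannon entropy of a symbol sequence, in [0, 1]."""
    counts = Counter(seq)
    n = len(seq)
    probs = [c / n for c in counts.values()]
    h = -sum(p * math.log2(p) for p in probs)
    hmax = math.log2(len(counts)) if len(counts) > 1 else 1.0
    return h / hmax

def complexity(seq):
    """Complexity as a balance between order and chaos: C = 4*E*(1-E),
    maximal at intermediate entropy, zero for fully ordered or random."""
    e = entropy(seq)
    return 4 * e * (1 - e)

def autopoiesis(system, environment):
    """Autopoiesis as the ratio of system complexity to environment
    complexity; values above 1 suggest the system carries more variety
    than its milieu."""
    ce = complexity(environment)
    return complexity(system) / ce if ce else float("inf")
```

Under this reading, designing for self-organization means steering a system toward higher values of this ratio.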
Following the initiative of McCulloch and Pitts (1943), there has been much speculation about the achievement of artificial
intelligence using networks of model neurons. The advent of the “perceptron” principle (Rosenblatt 1961; Nilsson 1965) crystallised
something definite and functional out of a mass of diffuse speculation, but it is not difficult to show that the “simple perceptron”
has limited capability (Minsky and Papert 1969). This can be attributed to the fact that all of the changes in weights constituting
its learning are restricted to a single functional layer. The simple training algorithm is possible because all the places
where changes occur are in this one layer and contribute directly to the output of the device, but the range of tasks that
can be learned is drastically limited.
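The single-layer limitation can be demonstrated in a few lines: the Rosenblatt rule below (a textbook sketch, not Rosenblatt's original formulation) learns the linearly separable AND function perfectly, but no setting of one layer of weights can ever classify XOR correctly.

```python
def train_perceptron(samples, epochs=100, lr=0.1):
    """Rosenblatt's single-layer perceptron rule: all weight changes occur
    in the one functional layer that feeds the output directly."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            out = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = target - out
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

def accuracy(samples, w, b):
    hits = sum((1 if w[0] * x1 + w[1] * x2 + b > 0 else 0) == t
               for (x1, x2), t in samples)
    return hits / len(samples)

AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
```

AND is linearly separable, so the perceptron convergence theorem guarantees a perfect solution; XOR is not, which is the Minsky and Papert limitation in miniature.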
In the implemented neural-based navigation, the two intelligent behaviors necessary for navigation are acquired by learning using the GBP algorithm. They enable the Robucar to be more autonomous and intelligent in partially structured environments. Nevertheless, a number of issues need to be investigated further. First, the Robucar must be endowed with one or several backtracking actions so that it does not simply stop when caught in a dead-zone situation. Another interesting alternative is the use of better localization, based not only on odometry but on fusing data from other sensors such as a laser scanner.
– In the tradition of Spencer Brown's Laws of Form, observation was defined in Luhmann's social systems theory as the designation of a distinction. In the sociological design, however, the designation specifies only a category for the observation. The distinction between observation and expectation enables the sociologist to appreciate the processing of meaning in social systems. Seeks to address this issue.
– The specification of “the observer” in the tradition of systems theory is analyzed in historical detail. Inconsistencies and differences in perspectives are explicated, and the specificity of human language is further specified. The processing of meaning in social systems adds another layer to the communication.
– Reflexivity about the different perspectives of participant observers and an external observer is fundamental to the sociological discourse. The ranges of possible observations from different perspectives can be considered as second‐order observations or, equivalently, as the specification of an uncertainty in the observations. This specification of an uncertainty provides an expectation. The expectation can be provided with (one or more) values by observations. The significance of observations can be tested when the expectations are properly specified.
– The expectations (second‐order observations) are structured and therefore systemic attributes to the discourse. However, the metaphor of a (meta‐)biological observer has disturbed the translation of social systems theory into sociological discourse. Different discourses specify other expectations about possible observations. By specifying second‐order observations as expectations, social systems theory and sociocybernetics can combine the constructivist with an empirical approach.
The purpose of this paper is to present a new core hypothesis on laughter. It has been built by putting together ideas from several disciplines: neurodynamics, evolutionary neurobiology, social networks, and communication studies. The hypothesis focusses on the social nature of laughter and contributes to ascertaining its evolutionary origins in connection with the cognitive and social-emotional functions it performs.
An in-depth examination of laughter in the social communication context and along the life cycle of the individual is performed. This instinctive behaviour that appears as a “virtual”, non-physical form of “grooming” would serve as a bond-making instrument in human groups. Further, the neurodynamic events underlying laughter production – and particularly the form of the neural entropy gradients – are congruent with a sentic hypothesis about the different emotional contents of laughter and their specific effects on bonding dynamics.
The new behavioural and neurodynamic tenets introduced about this unusual sound feature of our species justify the ubiquitous presence it has in social interactions at large and along the life cycle of the individual. Laughter, far from being a curious evolutionary relic or a rather inconsequential innate behaviour, should be considered as a highly efficient tool for inter-individual problem solving and for maintenance of social bonds.
Laughter, the authors would conclude, has been evolutionarily kept and augmented as an optimized tool for unconscious cognitive-emotional problem solving, and at the same time as a useful way to preserve the essential fabric of social bonds in close-knit groups and within human societies at large.
The sets of contexts and properties of a concept are embedded in the complex Hilbert space of quantum mechanics. States are unit vectors or density operators, and contexts and properties are orthogonal projections. The way calculations are done in Hilbert space makes it possible to model how context influences the state of a concept. Moreover, a solution to the combination of concepts is proposed. Using the tensor product, a procedure for describing combined concepts is elaborated, providing a natural solution to the pet fish problem. This procedure allows the modeling of an arbitrary number of combined concepts. By way of example, a model for a simple sentence containing a subject, a predicate and an object, is presented. Comment: 21 pages, to appear in the journal 'Kybernetes' in the Summer of 2004
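Schematically (in our notation, not the paper's), the tensor-product construction places the combined concept in the product Hilbert space, where non-factorizable coefficients can encode the non-compositional typicality of exemplars such as "guppy":

```latex
% Combined concept "pet fish" as a vector in the tensor product space
\[
|\,\text{pet fish}\,\rangle \in \mathcal{H}_{\text{pet}} \otimes \mathcal{H}_{\text{fish}},
\qquad
|\,\text{pet fish}\,\rangle = \sum_{i,j} c_{ij}\, |p_i\rangle \otimes |f_j\rangle ,
\]
% When the coefficients do not factorize (c_ij != a_i b_j), the state is
% entangled: "guppy" can be a highly typical pet fish while being neither
% a typical pet nor a typical fish.
```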
We propose a theory for modeling concepts that uses the state-context-property theory (SCOP), a generalization of the quantum formalism, whose basic notions are states, contexts and properties. This theory enables us to incorporate context into the mathematical structure used to describe a concept, and thereby model how context influences the typicality of a single exemplar and the applicability of a single property of a concept. We introduce the notion `state of a concept' to account for this contextual influence, and show that the structure of the set of contexts and of the set of properties of a concept is a complete orthocomplemented lattice. The structural study in this article is a preparation for a numerical mathematical theory of concepts in the Hilbert space of quantum mechanics that allows the description of the combination of concepts (see quant-ph/0402205). Comment: 20 pages, to appear in the journal 'Kybernetes' in the Summer of 2004
Necessary and sufficient conditions allowing a previously unknown space to be explored through scanning operators are reexamined with respect to measure theory. Generalized conceptions of distances and dimensionality evaluation are proposed, together with their conditions of validity and range of application to topological spaces. The existence of a Boolean lattice with fractal properties originating from the non-well-founded properties of the empty set is demonstrated. This lattice provides a substrate with both discrete and continuous properties, from which the existence of physical universes can be proved, up to the function of conscious perception. Spacetime emerges as an ordered sequence of mappings of closed 3D Poincaré sections of a topological 4-space provided by the lattice. The possibility of the existence of spaces with fuzzy dimension, or with adjoined parts of decreasing dimensions, is raised, together with possible tools for their study. The work provides the introductory foundations supporting a new theory of space whose physical predictions (suppressing the opposition of quantum and relativistic approaches) and experimental proofs are presented in detail in Parts 2 and 3 of the study.
Spacetime is represented by ordered sequences of topologically closed Poincaré sections of the primary space constructed of primary empty cells. These mappings are constrained to provide homeomorphic structures serving as frames of reference, in order to account for the successive positions of any objects present in the system. Mappings from one section to the next involve morphisms of the general structures. Discrete properties of the lattice allow the prediction of scales at which microscopic to cosmic structures should occur. Deformations of primary cells by exchange of empty-set cells allow a cell to be mapped into an image cell in the next section as long as the mapped cells remain homeomorphic. If a deformation involves a fractal transformation to objects, there occurs a change in the dimension of the cell and the homeomorphism is not conserved. The fractal kernel stands for a "particle", and the reduction of its volume is compensated by morphic changes of a finite number of surrounding cells. Quanta of distances and quanta of fractality are demonstrated. The interaction of a moving particle-like deformation with the surrounding lattice involves a fractal decomposition process that supports the existence and properties of the previously postulated inerton clouds associated with particles. Experimental evidence and further possibilities of the existence of inertons are proposed.
The distribution of the deformations of elementary cells is studied in an abstract lattice constructed from the existence of the empty set. One combination rule determining oriented sequences with continuity of the set-distance function in such spaces provides a particular kind of spacetime-like structure that favors the aggregation of such deformations into fractal forms standing for massive objects. A correlative dilatation of space appears outside the aggregates. At the large scale, this dilatation results in an apparent expansion, while at the submicroscopic scale the families of fractal deformations give rise to families of particle-like structures. The theory predicts the existence of classes of spin, charges, and magnetic properties, while quantum properties associated with mass have previously been shown to determine the inert mass and the gravitational effects. When applied to our observable spacetime, the model provides justification for the creation of mass in a specified kind of "void", and the fractal properties of the embedding lattice extend the phenomenon to formal justifications of Big-Bang-like events without the need for any supply of extemporaneous energy.
In an excitable Delaunay triangulation every node takes three states
(resting, excited and refractory) and updates its state in discrete time
depending on the ratio of excited neighbours. All nodes update their states
in parallel. By varying the excitability of nodes we produce a range of
phenomena, including reflection of an excitation wave from the edge of the
triangulation, backfire of excitation, branching clusters of excitation and
localized excitation domains. Our findings contribute to studies of
propagating perturbations and waves in spatially extended excitable media.
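The update rule can be sketched for an arbitrary graph (a simplification: we use a plain neighbour dictionary rather than an actual Delaunay triangulation, and a single global excitability threshold rather than per-node excitability):

```python
# Three-state excitable medium on a graph, updated synchronously.
RESTING, EXCITED, REFRACTORY = 0, 1, 2

def step(states, neighbours, theta=0.3):
    """One synchronous update: a resting node fires when the fraction of
    excited neighbours reaches the threshold theta (the excitability);
    excited nodes become refractory, refractory nodes recover to resting."""
    new = {}
    for node, st in states.items():
        if st == EXCITED:
            new[node] = REFRACTORY
        elif st == REFRACTORY:
            new[node] = RESTING
        else:
            nbrs = neighbours[node]
            frac = sum(states[m] == EXCITED for m in nbrs) / len(nbrs)
            new[node] = EXCITED if frac >= theta else RESTING
    return new
```

On a triangle graph a single excited node spreads to its two neighbours in one step and then dies out as the nodes cycle through the refractory state, the elementary mechanism behind the wave phenomena described above.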
Provides an overview of major developments pertaining to generalized information theory during the lifetime of Kybernetes. Generalized information theory is viewed as a collection of concepts, theorems, principles, and methods for dealing with problems involving uncertainty‐based information that are beyond the narrow scope of classical information theory. Introduces well‐justified measures of uncertainty in fuzzy set theory, possibility theory, and Dempster‐Shafer theory. Shows how these measures are connected with the classical Hartley measure and Shannon entropy. Discusses basic issues regarding some principles of generalized uncertainty‐based information.
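The two classical measures that the generalized theory builds on can be stated compactly (a standard formulation, not specific to the article):

```python
import math

def hartley(alternatives):
    """Hartley measure: uncertainty of a finite set of alternatives,
    the base-2 logarithm of its cardinality (in bits)."""
    return math.log2(len(alternatives))

def shannon(probs):
    """Shannon entropy of a probability distribution (in bits)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)
```

The Hartley measure is recovered from Shannon entropy when the distribution is uniform, which is the connection the generalized measures for fuzzy sets, possibility theory, and Dempster-Shafer theory must also preserve.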
– The purpose of this paper is to develop experimental laboratory biological techniques for approximating principal transport networks, optimizing transport links, and developing optimal solutions to current transport problems. It also aims to study how the slime mould Physarum polycephalum approximates autobahn networks in Germany.
– The paper considers the 21 most populous urban areas in Germany. It represents these areas with sources of nutrients placed on the slime mould's growing substrate at positions corresponding to the areas. At the beginning of each experiment slime mould is inoculated in the Berlin area. The slime mould exhibits foraging behaviour and spans the sources of nutrients (which represent urban areas) with a network of protoplasmic tubes (which approximate vehicular transport networks). The study analyzes the structure of the transport networks developed by the slime mould and compares it with families of known proximity graphs. It also imitates the slime mould's response to a simulated disaster by placing sources of chemo‐repellents at the positions of nuclear power plants.
– It is found that the plasmodium of Physarum polycephalum develops a minimal approximation of a transport network spanning the urban areas. The Physarum‐developed network matches the autobahn network very well. The high degree of similarity is preserved even when high-demand constraints are placed on the repeatability of links in the experiments. Physarum approximates almost all major transport links. In response to a sudden disaster gradually spreading from its epicentre, the Physarum transport networks react by abandoning transport links affected by the disaster zone, enhancing those not directly affected by the disaster, sprouting massively from the epicentre, and increasing scouting activity in regions distant from the epicentre of the disaster.
– Experimental methods and computer analysis techniques presented in the paper lay a foundation of novel biological laboratory approaches to imitation and prognostication of socio‐economical developments.
This work has been supported by the State Committee for Scientific Research, from KBN research grant no. 8T11F 006 19, and by the Foundation for Polish Science, from subsidy no. 11/2001.
The symbol-based, correspondence epistemology used in AI is contrasted with the constructivist, coherence epistemology promoted by cybernetics. The latter leads to bootstrapping knowledge representations, in which different parts of the cognitive system mutually support each other. Gordon Pask's entailment meshes and their implementation in the THOUGHTSTICKER program are reviewed as a basic application of this approach. Entailment meshes are then extended to entailment nets: directed graph representations governed by the bootstrapping axiom, determining which concepts are to be distinguished or merged. This allows a constant restructuring and elicitation of the conceptual network. Semantic networks and frame-like representations with inheritance can be expressed in this very general scheme by introducing a basic ontology of node and link types. Entailment nets are then generalized to associative nets characterized by weighted links. Learning algorithms are presented which ...
This paper is devoted to a discussion of the relation between computer proof and human proof. It is a discussion of the relationship of persons and machines. In Section 2 we discuss these issues in general and specifically in regard to the recent solution of the Robbins Problem via a proof generated by computer. Section 3 gives the mathematical background for the Robbins problem. The Robbins problem was a longstanding open problem about axioms for Boolean algebra. One point of this paper is to show that the proof of the Robbins conjecture, generated by a computer, can be filled in and understood by human beings. We accomplish this aim in the present paper by presenting a notational reformulation of Boolean algebra and the Robbins Problem in Sections 4 and 5. This reformulation is called "box notation". Section 6 discusses cybernetic and semiotic issues. Section 7 is a formal presentation of a proof, in box notation, that Robbins algebras are Boolean. It should be mentioned that the notational reformulation given herein is most effective for a person when he or she actually uses the notation, writing out and making formulas and diagrams by hand. This activity allows the combination of hand and eye to correlate the patterns inherent in the formulas. Sections 5 and 7 should be regarded as guides and injunctions for such personal work. As far as this author knows, such work is indispensable for reaching understanding. Each person who encounters a piece of mathematics must hold a conversation with it. This conversation almost always includes the development of individual language and notation to enable that person to grasp the mathematics. This paper is an invitation to enter that domain of linguistic creativity through which mathematical understanding can arise. The issue d...
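For reference, the Robbins problem asked whether every algebra satisfying commutativity and associativity of $\lor$ together with the single Robbins equation is Boolean. In standard notation (not the paper's box notation) the equation reads:

```latex
% Robbins equation, with n denoting complementation:
\[
n\bigl(\, n(x \lor y) \,\lor\, n(x \lor n(y)) \,\bigr) = x .
\]
% Huntington's earlier, provably Boolean axiom replaces this with
%   n(n(x) \lor y) \lor n(n(x) \lor n(y)) = x,
% and the Robbins problem was whether the weaker-looking equation above
% suffices; the computer-generated proof answered yes.
```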
This paper describes a novel type of artistic Artificial Life environment. Evolving agents, who have the ability to make and listen to sound, populate a synthetic world. An evolvable, rule-based system drives agent behaviour. Agents compete for limited resources in a virtual environment that is influenced by the presence and movement of the artwork's audience. Through a link between the real and virtual spaces, virtual agents evolve implicitly to try to maintain the interest of the human audience.
Purpose – This paper aims to review current research with particular reference to new research and development initiatives. Design/methodology/approach – A general review and survey of selected research and development topics is given and some new challenges and applications of future technologies are considered with particular reference to their impact and potential. Findings – The paper illustrates the multi- and trans-disciplinary nature of studies in cybernetics, systems and management sciences, with a view to further research and development activity. Practical implications – The choice of review provides an awareness of current trends in these areas of endeavour. Originality/value – The reviews are selected from a global database and give a studied assessment of current research and development initiatives.
Purpose – The purpose of this paper is to apply grey system theory to population system and project China's population. Design/methodology/approach – The paper applies the GM(1,1) model to China's population projections. Two key aspects of the method are crucial for obtaining best accuracy of prediction. They are the choice of the length for the original data to be used in the model and the adoption of the GM(1,1) metabolic model in prediction. The former determines what initial data to be used while the latter describes an iteration process on how to proceed to predict. Findings – The results show that in 2015 China's population will reach 1.37 billion and in 2050 it will be between 1.42 and 1.48 billion, which is in accordance with the latest projections from the UN. The findings show the GM(1,1) metabolic model is an effective mathematical means in population projections. Research limitations/implications – The paper suggests that GM(1,1) metabolic model can provide an effective simulation model for complicated systems with uncertainty and can be used in many fields. Practical implications – The paper provides useful advice for the department of population. Originality/value – Most population projections have been based on assumptions about fertility, mortality, and migration. The paper considers the population system as a grey system and introduces the GM(1,1) metabolic model to population projections.
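A minimal GM(1,1) sketch (the standard textbook construction; the metabolic variant used in the paper additionally refreshes the modelling window as new values arrive, and the choice of window length is the other tuning aspect mentioned above):

```python
import math

def gm11_forecast(x0, steps=1):
    """Grey GM(1,1) forecast: fit the whitened equation dx1/dt + a*x1 = b
    on the accumulated series x1 = cumsum(x0) by least squares, predict
    x1, then de-accumulate.  Assumes a non-constant positive series
    (otherwise the development coefficient a degenerates to zero)."""
    n = len(x0)
    x1 = [sum(x0[:i + 1]) for i in range(n)]
    # background values: means of consecutive accumulated points
    z = [0.5 * (x1[k] + x1[k - 1]) for k in range(1, n)]
    # least squares for [a, b] in x0[k] = -a*z[k] + b
    m = n - 1
    sz = sum(z)
    szz = sum(v * v for v in z)
    sy = sum(x0[1:])
    szy = sum(v * y for v, y in zip(z, x0[1:]))
    det = m * szz - sz * sz
    a = (sz * sy - m * szy) / det
    b = (szz * sy - sz * szy) / det
    c = x0[0] - b / a
    x1_hat = lambda k: c * math.exp(-a * k) + b / a
    return [x1_hat(k) - x1_hat(k - 1) for k in range(n, n + steps)]
```

The metabolic scheme would append each newly observed (or predicted) value and drop the oldest one before re-fitting, so the model tracks the most recent behaviour of the population series.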
Purpose – The purpose of this theoretical research paper in philosophy and theory of science is to argue for the necessity of developing transdisciplinary frameworks in order to be able to interact in an interdisciplinary fashion. Design/methodology/approach – Reflects on interdisciplinarity and the prerequisites of doing scientific and scholarly research, and develops a non-reductionistic and transdisciplinary view of human knowing in the light of the growing development of interdisciplinary practices and sciences. Findings – It is argued that there is at present an incompatibility between scientific and phenomenological approaches to cognition and communication. A broader framework is therefore needed to encompass both, if one wants to make coherent theories and models in this subject area. The work therefore focuses on the relation between information science and semiotics and creates a framework for the analysis of both meaning and truth. Research limitations/implications – The framework is described here only in abstract terms. It has to be developed in detail and its effects demonstrated in practical examples. Practical implications – Such frameworks have to be judged both on how well their descriptions, better than others, fit what actually goes on in the sciences and humanities, and on their usefulness as a common map coordinating interdisciplinary work. Originality/value – A trans-scientific framework is suggested as a basis for the sciences and humanities to understand themselves in relation to other kinds of knowledge such as philosophy, art, religion, political ideology, etc. Also new are the visual structural models.
Purpose – This paper identifies meta-level considerations long ignored in Sri Lanka's peace negotiations. Design/methodology/approach – Variety absorption being at the heart of manoeuvres by the various parties to the negotiation, Ashby's Law of Requisite Variety informs the diagnosis. Also used are Beer's viable systems model, Maturana's structural coupling and Stokes' sociological thinking on identity, which encompasses the nature, levels and organisation of identity. Findings – Whilst in Sri Lanka's conflict-resolution parlance identity has been pivotally limited to race, in socio-cybernetic terms it denotes much more. This leads to the recognition that relationships between structurally coupled entities change as negotiations progress, thus calling upon them (and others) to dynamically adapt their identity in their endeavour to retain viability. Practical implications – The diagnosis shows the need to design negotiation processes capable of absorbing the variety needed to address the content of negotiations. Originality/value – Linking identity to viability explains why stakeholder relations move through emergent properties/relations and why negotiations between static protagonists, as done in all previous Sri Lankan peace negotiations, are doomed to failure. This work is useful for those who formulate the modalities of Sri Lanka's peace talks and for cyberneticians and people involved in conflict resolution – particularly those grappling with issues of sovereignty.
Purpose – Aims to review current trends in the development of robotics 2004-2008, from a cybernetic viewpoint, and to provide data from the UNECE/IFR World Robotics Survey. Design/methodology/approach – A general review and survey of selected research and development topics. Findings – Illustrates the multi- and trans-disciplinary interests of cybernetics and systems and aims to further research and development activity. Practical implications – The choice of reviews provides an awareness of the current initiatives and trends in these areas of research and endeavour. Originality/value – The reviews are selected from a global database and give a studied assessment of current research and development initiatives.
– The purpose of this paper is to discuss Dr Rose's paper which presented his views of the cybernetic revolution; education and the environmental barrier; and the purpose of education.
– This paper introduces the concepts held by systemists and cyberneticians in the late 1960s. Sections of the original publication by Dr Rose are included.
– Places in an historical context the concepts of the cyberneticians and systemists of that time and re‐emphasizes their role in forming present-day thinking.
– Forms part of the tributes paid to the role of Dr Rose and his contribution to these areas of research and development. The original paper discussed was republished in 1991 as a 20th Anniversary tribute to the founding editor of the journal Kybernetes.
Purpose – The purpose of this paper is to study the 2D hybrid linear model, a method of describing both continuous- and discrete-time dynamics in one system. The singularity of 2D hybrid linear models is a newly emerged problem, and a very important question is how to compute the solution of a singular 2D hybrid linear model. Design/methodology/approach – Computation of the solution of the system in question is based on the Laplace transform, the Z-transform and a shifting algorithm. The inverse Laplace transform and inverse Z-transform are used in two cases. Findings – In this paper, a class of 2D singular hybrid linear systems is introduced. Two methods for computing solutions of the singular hybrid system with nonzero boundary conditions are proposed. Both methods are illustrated by examples. Originality/value – The presented methods are a new way of computing the solution of singular 2D hybrid linear systems.
Purpose – This paper aims to present lessons learned in applying 2nd-order cybernetics – specifically Maturana and Varela's “biology of cognition” – to the actual design of interactive decision support systems. Design/methodology/approach – This consists of a review of the rationale and bases for applying 2nd-order cybernetics in interactive IT design, the challenges in moving from theory to praxis, illustrative examples of tactics employed, and a summary of the successful outcomes achieved. Findings – The paper offers conclusions about the general applicability of such theories, two sample applications devised for actual projects, and discussion of these applications' perceived value. Research limitations/implications – The applications described are not claimed to represent a complete toolkit, and they may not readily generalize beyond the scope of interactive information systems design. On the other hand, the examples offered demonstrate that 2nd-order cybernetics can constructively inform such designs – advancing the focus of discussion from theory-based advocacy to praxis-based recommendations. Practical implications – The paper presents illustrative examples of the exigencies entailed in moving 2nd-order cybernetics ideas forward from theory to praxis and specific tactics for doing so. Originality/value – This paper addresses the persistent deficiencies in both concrete examples and guidance for practical applications of 2nd-order cybernetics theories. It will hopefully stimulate similar attempts to demonstrate such theories' practical benefits.
– An important lesson that philosophy can learn from the Turing test, and computer science more generally, concerns the careful use of the method of levels of abstraction (LoAs). The purpose of this paper is to summarize the method and apply it to the modelling and analysis of phenomenological and conceptual systems, showing its principal features and main advantages.
– The constituents of the method are “observables”, collected together and moderated by predicates restraining their “behaviour”. The resulting collection of sets of observables is called a “gradient of abstractions” (GoA) and it formalises the minimum consistency conditions that the chosen abstractions must satisfy. Two useful kinds of GoA – disjoint and nested – are identified. It is then argued that in any discrete (as distinct from analogue) domain of discourse, a complex phenomenon may be explicated in terms of simple approximations organised together in a GoA. Thus, the method replaces, for discrete disciplines, the differential and integral calculus, which form the basis for understanding the complex analogue phenomena of science and engineering.
– The result formalises an approach that is rather common in computer science but has hitherto found little application in philosophy. So the philosophical value of the method is demonstrated by showing how making the LoA of discourse explicit can be fruitful for phenomenological and conceptual analysis. To this end, the method is applied to the Turing test, the concept of agenthood, the definition of emergence, the notion of artificial life, quantum observation and decidable observation.
– This paper applies the method of abstraction to the modelling and analysis of phenomenological and conceptual systems, showing its principal features and main advantages. It is hoped that this treatment will promote the use of the method in certain areas of the humanities and especially in philosophy.
Purpose – The purpose of this paper is to report a further step in the authors' research and suggest a new – 4th order – cybernetics, applying it to the issue of a sustainable future, which must unavoidably result from the current socio-economic crisis that surfaced in 2008 as the tip of an iceberg; otherwise humankind of the current civilization has poor chances to survive. One-sided solutions do not prove to work; they make us think of systems and cybernetics. Design/methodology/approach – Qualitative research with application to real-life cases. Findings – While cybernetics is about steering, i.e. influencing, cybernetics of the 1st and 2nd order might be insufficient for solving the problem; cybernetics of the 3rd order might serve us better, but not well enough either. The authors' thesis reads: they might better be put in a new synergy with the (Universal) Dialectical Systems Theory and Cybernetics of Conceptual Systems to make a new kind of systems theory/cybernetics called cybernetics of the 4th order. It should help human beings to attain the requisite holism of the human approach and the requisite wholeness of outcomes of human action. Research limitations/implications – A more holistic concept of cybernetics is suggested. Practical implications – Control of the ecological problems of today might be made easier. Originality/value – This is the first publication about the concept of 4th order cybernetics, especially with application to issues of sustainability.
This paper marks the centenary year of W. Ross Ashby (1903‐1972), one of the founders of the interdisciplinary subject of cybernetics. Its purpose is to use Ashby's cybernetics to construct a framework for understanding some of the features that presently characterise British higher education.
The contents of Ashby's 1956 book, An Introduction to Cybernetics , are outlined. Cybernetic concepts, principles, and laws are then applied to some of the features that presently characterise UK universities: growth in student numbers, the modularisation of curricula, concerns over academic standards, and bureaucracy.
The paper finds Ashby's writings to be critical to understanding the nature of many of the contemporary debates about UK higher education. A diagnosis and critical evaluation of the policy impetus to increase student numbers and modularise curricula is supplied. A cybernetic analysis in support of the current concerns over academic standards is provided. The paper demonstrates why the current higher education quality assurance regime produces a bureaucratised university.
The paper's framework is supported by an analysis of available national statistics and other secondary evidence, but more detailed, cross‐comparative, longitudinal studies of the UK labour market and educational attainment are required.
Given the economic perspective adopted by policy‐makers, the paper identifies three reasons why the current policy of expanding UK higher education may be flawed.
The paper marks the centenary year of W. Ross Ashby by demonstrating how his writings can supply a framework for understanding the current debates about UK higher education policy.
Purpose – The purpose of this paper is to examine the popular “information transmitted” interpretation of absolute judgments, and to provide an alternative interpretation if one is needed. Design/methodology/approach – The psychologists Garner and Hake and their successors used Shannon's Information Theory to quantify information transmitted in absolute judgments of sensory stimuli. Here, information theory is briefly reviewed, followed by a description of the absolute judgment experiment and its information theory analysis. Empirical channel capacities are scrutinized. A remarkable coincidence, the similarity of maximum information transmitted to human memory capacity, is described. Over 60 representative psychology papers on “information transmitted” are inspected for evidence of memory involvement in absolute judgment. Finally, memory is conceptually integrated into absolute judgment through a novel qualitative model that correctly predicts how judgments change with an increase in the number of judged stimuli. Findings – Garner and Hake gave conflicting accounts of how absolute judgments represent information transmission. Further, “channel capacity” is an illusion caused by sampling bias and wishful thinking; information transmitted actually peaks and then declines, the peak coinciding with memory capacity. Absolute judgments themselves have numerous idiosyncrasies that are incompatible with a Shannon general communication system but which clearly imply memory dependence. Research limitations/implications – Memory capacity limits the correctness of absolute judgments. Memory capacity is already well measured by other means, making redundant the informational analysis of absolute judgments. Originality/value – This paper presents a long-overdue comprehensive critical review of the established interpretation of absolute judgments in terms of “information transmitted”.
An inevitable conclusion is reached: that published measurements of information transmitted actually measure memory capacity. A new, qualitative model is offered for the role of memory in absolute judgments. The model is well supported by recently revealed empirical properties of absolute judgments.
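The quantity that Garner and Hake's tradition calls "information transmitted" is the Shannon mutual information between presented stimulus and reported response, estimated from a confusion matrix of judgment counts. As a minimal illustrative sketch (the function name and data are hypothetical, not taken from the paper):

```python
import math

def information_transmitted(confusion):
    """Estimate information transmitted (mutual information, in bits)
    from a stimulus-by-response count matrix, as in the classical
    information-theoretic analysis of absolute judgments."""
    total = sum(sum(row) for row in confusion)
    n_rows, n_cols = len(confusion), len(confusion[0])
    p_s = [sum(row) / total for row in confusion]                      # stimulus marginals
    p_r = [sum(confusion[i][j] for i in range(n_rows)) / total
           for j in range(n_cols)]                                     # response marginals
    it = 0.0
    for i in range(n_rows):
        for j in range(n_cols):
            p_sr = confusion[i][j] / total                             # joint probability
            if p_sr > 0:
                it += p_sr * math.log2(p_sr / (p_s[i] * p_r[j]))
    return it

# Perfect identification of 4 equiprobable stimuli transmits log2(4) bits:
perfect = [[10, 0, 0, 0], [0, 10, 0, 0], [0, 0, 10, 0], [0, 0, 0, 10]]
print(information_transmitted(perfect))  # 2.0
```

With imperfect judgments the off-diagonal counts grow and the estimate falls below log2 of the number of stimuli, which is the quantity whose apparent ceiling the paper argues is really a memory limit, not a channel capacity.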
Purpose – This paper connects the notions of abstract and actual based on a reflection of the Chinese notions of xiangsheng (mutual arising) and xushi (abstract/actual, empty/full). These word pairs enable a conception of abstract and actual that shows an alternative to, and which complements, distinctions of the terms that are based in dualism and rationalism. Design/methodology/approach – The author sidesteps methodological rigour as practiced in the West as the style of thought introduced here shows a picture of abstract and actual arising from mutual interdependence rather than attempting to describe and formally distinguish abstract and actual through an observer-independent methodology. Findings – Discussing the relationship of actual and abstract from the viewpoint of the Chinese cultural tradition, this paper shows how abstract and actual may be thought of as a mutually generating, dynamic and polar relationship. The discussion further provides a basis for understanding how perceptions of abstract and actual can be understood as choices made by observers. Research limitations/implications – This research is based on the limited personal experience of the author as a teacher of architectural design at one Taiwanese and one Chinese university. Originality/value – This paper reflects on the relationship of abstract and actual from a non-dualist viewpoint by introducing traditional Chinese ways of seeing and appreciating, and connecting this perspective to cybernetic and radical constructivist epistemologies. To show the relationship between abstract and actual as polar and mutually arising, the paper focuses particularly on making and experiencing in and through creative processes.
Purpose – The purpose of this paper is to describe the various movements from abstraction to actuality in the context of design, with particular reference to architecture, first in terms of the design process and second in terms of the interpretation of architecture by observers. Design/methodology/approach – The paper focuses on the designers' use of forms of representation, such as drawings, with reference to the cybernetic understanding of conversation. This account is then used to discuss the representational properties of architecture itself and to relate this back to the design process. Findings – It is argued that the forms of representation used by designers, such as drawings and physical models, have both abstract and actual properties and that this combination is important for their representational function. The ambiguity in the interpretation of drawings and models is not only useful in generating ideas but also appropriate given the ambiguity in the interpretation of the architecture they represent. Originality/value – The division between the abstract (understood in terms of representation) and the actual is challenged. A connection is proposed between architecture itself as a form of representation and the representation used in its design.
Purpose – The purpose of this paper is to ask whether a first-order-cybernetics concept, Shannon's Information Theory, actually allows a far-reaching mathematics of perception allegedly derived from it, Norwich et al.'s “Entropy Theory of Perception”. Design/methodology/approach – All of The Entropy Theory, 35 years of publications, was scrutinized for its characterization of what underlies Shannon Information Theory: Shannon's “general communication system”. There, “events” are passed by a “source” to a “transmitter”, thence through a “noisy channel” to a “receiver”, that passes “outcomes” (received events) to a “destination”. Findings – In the entropy theory, “events” were sometimes interactions with the stimulus, but could be microscopic stimulus conditions. “Outcomes” often went unnamed; sometimes, the stimulus, or the interaction with it, or the resulting sensation, were “outcomes”. A “source” was often implied to be a “transmitter”, which frequently was a primary afferent neuron; elsewhere, the stimulus was the “transmitter” and perhaps also the “source”. “Channel” was rarely named; once, it was the whole eye; once, the incident photons; elsewhere, the primary or secondary afferent. “Receiver” was usually the sensory receptor, but could be an afferent. “Destination” went unmentioned. In sum, the entropy theory's idea of Shannon's “general communication system” was entirely ambiguous. Research limitations/implications – The ambiguities indicate that, contrary to claim, the entropy theory cannot be an “information theoretical description of the process of perception”. Originality/value – Scrutiny of the entropy theory's use of information theory was overdue and reveals incompatibilities that force a reconsideration of information theory's possible role in perception models. A second-order-cybernetics approach is suggested.
Knowledge-based systems have been successfully utilised in the development of complex systems. In many cases, these systems have emphasised the need for techniques to integrate knowledge-based processing with methods for managing both large amounts of data and knowledge. However, many potential applications for expert systems are precluded by limitations in the ability of conventional expert system technology to function in conjunction with data systems without manual intervention. The author focuses on the integration of knowledge bases and databases with the capability to: store and context-select between parallel, competing expert system rule structures; cascade variable rule structures; allow an expert system to be interrupted and subsequently restarted by storing the state of the inference engine; and handle simple data storage and retrieval.
Purpose – To study the multiplicative perturbation of local C-regularized cosine functions associated with the following incomplete second order abstract differential equations in a Banach space X: u''(t)=A(I+B)u(t), u(0)=x, u'(0)=y, (*) where A is a closed linear operator on X and B is a bounded linear operator on X. Design/methodology/approach – The multiplicative perturbation of exponentially bounded regularized C-cosine functions is generally studied by the Laplace transformation. However, C-cosine functions might not be exponentially bounded, so that a new method for the multiplicative perturbation of the nonexponentially bounded regularized C-cosine functions should be applied. In this paper, the property of regularized C-cosine functions is directly used to obtain the desired results. Findings – New results on the multiplicative perturbations of the nonexponentially bounded C-cosine functions are obtained. Originality/value – New techniques, differing from those given previously in the literature, are employed to deduce the desired conclusions. The results can be applied to deal with incomplete second order abstract differential equations which stem from cybernetics, engineering, physics, etc.
Purpose – The purpose of this paper is to suggest that physiological experience can contribute to the comprehension of visualisation of scientific data, especially artistic approaches. Design/methodology/approach – A number of relevant case studies are used to establish the rationale. Findings – “Objective” or neutral data visualisation does not exist. Every visualisation process relies on human definitions or mediation. Based on this premise, interdisciplinary collaborators can use methods from design and art to actively design the usability of data visualisation together with their perceived aesthetic, ambiguous and imaginative dimensions in order to create a multi-layered human experience of data. Research limitations/implications – Physiological engagement in combination with rich, ambiguous experiences are no substitute for scientific data visualisation but an evocative medium for the public communication of science, such as at science centres or science museums. Practical implications – Clear support of interdisciplinary collaborations and systematic application of methods to create aesthetic experiences from scientific data. Social implications – Potentially novel, engaging and evocative sensual experiences with data visualisations around themes, such as climate change, sustainability and ecology. Originality/value – The paper suggests that systematically complementing methods from art and design in order to emotionalise intellectual experiences could be considered an unorthodox yet highly effective novel approach.
Purpose – The purpose of this paper is to forecast the reliable storage life of a certain kind of equipment under the normal stress level. Design/methodology/approach – Through the stepping stress acceleration life test and the failure mechanism analysis, this paper aims to confirm the stress level for the stepping stress acceleration life test of a certain kind of equipment and establish the data processing mathematical model and storage life forecasting method. Findings – The stress level for the stepping stress acceleration life test of a certain kind of equipment is confirmed and the data processing mathematical model and storage life forecasting method is established. Research limitations/implications – Availability of data is the main limitation affecting which model will be applied. Practical implications – Useful advice for products' storage life forecasting. Originality/value – The paper presents a new approach to product storage life estimation.
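The paper's own stepping-stress model and data are not reproduced in the abstract, but the general idea behind accelerated storage-life forecasting can be sketched with the standard Arrhenius relation: fit ln(life) = a + b/T on accelerated high-temperature tests, then extrapolate to the normal storage temperature. Everything below (function names, temperatures, lives) is an illustrative assumption, not the paper's method:

```python
import math

def fit_arrhenius(temps_k, lives):
    """Least-squares fit of ln(life) = a + b / T (Arrhenius model)
    to accelerated life-test results at absolute temperatures temps_k."""
    xs = [1.0 / t for t in temps_k]
    ys = [math.log(life) for life in lives]
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    b = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
         / sum((x - xbar) ** 2 for x in xs))
    a = ybar - b * xbar
    return a, b

def predicted_life(a, b, temp_k):
    """Extrapolate fitted model to a (lower) storage temperature."""
    return math.exp(a + b / temp_k)

# Hypothetical accelerated tests at 85, 100, 125 degrees C (hours):
a, b = fit_arrhenius([358.15, 373.15, 398.15], [5000.0, 2000.0, 500.0])
life_25c = predicted_life(a, b, 298.15)  # forecast storage life at 25 C
```

Because life shortens as temperature rises, the fitted slope b is positive and the extrapolated normal-temperature life exceeds every accelerated test result; stepping-stress designs refine this by changing the stress level during the test rather than running one level per sample.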
Purpose – The purpose of this paper is to create a model of role-based access control (RBAC) based access control for virtual enterprise (VE). Design/methodology/approach – An access control model for security and management of VE is presented by integrating generic structure of VE and applying the principles of RBAC. In addition, the application of the model to a supply chain-oriented VE illustrates that a general access control scheme can ensure the running of VE. Findings – A theory base of access control for the realization of the VE is found. Originality/value – The paper presents a very useful new model of access control for VE. This paper is aimed at researchers and engineers.
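The VE-specific model is not reproduced here, but the core RBAC principle it builds on, that users acquire permissions only indirectly through roles, can be sketched minimally (class and method names are illustrative assumptions):

```python
# Minimal RBAC sketch: users -> roles -> permissions.
# An access check succeeds only if some role assigned to the user
# has been granted the requested (resource, action) pair.
class RBAC:
    def __init__(self):
        self.user_roles = {}   # user -> set of role names
        self.role_perms = {}   # role -> set of (resource, action) pairs

    def assign_role(self, user, role):
        self.user_roles.setdefault(user, set()).add(role)

    def grant(self, role, resource, action):
        self.role_perms.setdefault(role, set()).add((resource, action))

    def check(self, user, resource, action):
        return any((resource, action) in self.role_perms.get(r, set())
                   for r in self.user_roles.get(user, set()))

rbac = RBAC()
rbac.grant("supplier", "order", "read")
rbac.assign_role("alice", "supplier")
print(rbac.check("alice", "order", "read"))   # True
print(rbac.check("alice", "order", "write"))  # False
```

In a VE setting the appeal of this indirection is that when a member enterprise joins or leaves, only role assignments change; the permission structure of the collaboration stays intact.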
Purpose - This piece seeks to reflect upon the nature of adaptation and our usage of it with relation to design, addiction, and final cause. Design/methodology/approach - This previously unpublished document was found amongst the manuscript papers for Mind and Nature in the Bateson Archives at the University of Santa Cruz Library Special Collections. Findings - It appears that "adaptation" was a concept generated by lineal thinking and that as we move forward into a world of causal circuits, i.e. of mental process as that notion is here defined, we discover that "adaptation" is only another face of addiction. Originality/value - It reflects on the issue of adaptation from a very different angle than in the usual scientific discourse.
Purpose – This paper seeks to develop an ontological approach, in order to make it possible to share a common understanding of accounting theory, in this case, the specific structure of the profit and loss account among people or software agents. Design/methodology/approach – This paper presents an ontology methodology (the Net technique) which represents a semi-structured element in the domain knowledge of accounting. More specifically, ontology will be used to explain the profit and loss account as a representation of the potential use of this methodology. Findings – To support ontology effectively, a strong accounting information support system in the organization is necessary. The ontology may be used by employees to navigate the information repository of an organization for the effective coordination. In addition, it might be possible for the WWW to be used to generate data, information and knowledge in the accounting domain. Practical implications – Software agents could extract and aggregate accounting information from numerous web sites, which in turn might answer research questions or be used as input data for other applications. Originality/value – The development of ontology expands the researcher's ability to generate information by using search methods beyond simple keywords. If only keywords are used in internet searches, then information that is retrieved will often lack the precision necessary for generating quality information. Therefore, in order to retrieve quality information more quickly and accurately, a broader and more extensive ontology development is required.
Purpose – To present empirical and theoretical evidence that our national accounts are improper for model applications; to propose a design for a micro foundation of the basic economic data that will give analytical national accounts; and to present arguments for the need for an academic and scientific foundation of national accounting. Design/methodology/approach – Reviews published works in the field. In the theoretical discussion, simple index algebra and Edgeworth-Bowley boxes are used. Findings – Shows that conventional accounting methods and deflation procedures imply inconsistent national accounts. Research limitations/implications – To fulfill the idea presented, new designs for sampling of basic data, based on new data technology, have to be considered and developed. Also, a new accounting system has to be developed based on modern administration systems. Practical implications – If the idea is fulfilled, a new kind of national accounts suited for meaningful and analytical model applications will be available. The need to revise national accounts will no longer be an issue. Originality/value – The paper presents theoretical evidence that our national accounts are not analytical, which is a requirement for meaningful model applications. National account users should be interested in the proposal and result.
– The purpose of this paper is to introduce efficient methods for solving the 2D biharmonic equation with Dirichlet boundary conditions of the second kind. This equation describes the deflection of a loaded plate with boundary conditions of the simply supported plate kind. It can also be derived from the calculus of variations combined with the variational principle of minimum potential energy. Because of the fourth derivatives in this equation, high‐order accurate methods generally need to use artificial points. Also, solving the resulting linear system of equations suffers from slow convergence when iterative methods are used. This paper aims to introduce efficient methods to overcome these problems.
– The paper considers several compact finite difference approximations that are derived on a nine‐point compact stencil using the values of the solution and its second derivatives as the unknowns. In these approximations there is no need to define special formulas near the boundaries and boundary conditions can be incorporated with these techniques. Several iterative linear systems solvers such as Krylov subspace and multigrid methods and their combination (with suitable preconditioner) have been developed to compare the efficiency of each method and to design powerful solvers.
– The paper finds that the combination of compact finite difference schemes with multigrid method and Krylov iteration methods preconditioned by multigrid have excellent results for the second biharmonic equation, and that Krylov iteration methods preconditioned by multigrid are the most efficient methods.
– The paper is of value in presenting, via some tables and figures, some numerical experiments which resulted from applying new methods on several test problems, and making comparison with conventional methods.
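One reason second-kind (simply supported) boundary conditions are attractive is that, with u and its Laplacian both prescribed on the boundary, the fourth-order problem lap^2 u = f splits exactly into two second-order Poisson solves: lap v = f, then lap u = v. The sketch below illustrates that splitting with a plain 5-point Laplacian and a dense solver; it is a toy stand-in, not the paper's compact nine-point schemes or multigrid/Krylov solvers, and all names and grid sizes are assumptions:

```python
import numpy as np

def laplacian_matrix(n, h):
    """Standard 5-point discrete Laplacian on an n x n interior grid
    of spacing h, with homogeneous Dirichlet values outside."""
    N = n * n
    A = np.zeros((N, N))
    for i in range(n):
        for j in range(n):
            k = i * n + j
            A[k, k] = -4.0
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ii, jj = i + di, j + dj
                if 0 <= ii < n and 0 <= jj < n:
                    A[k, ii * n + jj] = 1.0
    return A / h**2

def solve_biharmonic(f, h):
    """Solve lap^2 u = f with u = lap u = 0 on the boundary by
    cascading two Poisson solves: lap v = f, then lap u = v."""
    A = laplacian_matrix(f.shape[0], h)
    v = np.linalg.solve(A, f.ravel())      # first Poisson solve
    u = np.linalg.solve(A, v)              # second Poisson solve
    return u.reshape(f.shape)

# Uniformly loaded unit-square plate on a coarse grid (illustration only):
n, h = 15, 1.0 / 16
f = np.ones((n, n))
u = solve_biharmonic(f, h)                 # deflection, positive in the interior
```

In practice each Poisson solve would be handed to the multigrid or multigrid-preconditioned Krylov iterations the paper compares, rather than a dense factorization.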
Purpose – The purpose of this paper is to review a recent electronic publication on the menace of spam and, related to previous work, look at the dangers the internet holds for children. A valuable source of information on the history of chess is reviewed and the death of Prof. Russell Ackoff in Philadelphia is reported, with links to sources of obituaries and further details. Design/methodology/approach – The aim is to review developments on the internet, especially those of general cybernetic interest. Findings – The data on spam indicate the nature and extent of the problem it presents. Chess is shown to have deeper historical roots than is widely known. With the death of Russ Ackoff another major figure has disappeared from the scene. Practical implications – The review of means of combating spam is of undoubted value, as is the mention of a means of seeking police help over internet abuse affecting children. The review of the history of chess should be a valuable reference where the game is discussed in an AI context. Originality/value – It is hoped this is a valuable periodic review.
Purpose – The purpose of this paper is to study the issue of contingency evolution of acquisition organization systems with the game group link method. Design/methodology/approach – The concept and method of the game group link are used to analyze the evolution law of acquisition, to find a basis for the contingency evolution of acquisition organization systems. Findings – Studies prove that the contingency evolution of acquisition organization systems has the following basic laws: continuous evolution laws adapted to environmental requirements; laws of coordinated evolution as a whole; the law of evolution from gradual change to sudden change; the evolution path determined by concrete situations; the evolution pattern determined by dynamic conditions; and the law of evolution promoted by innovation. Research limitations/implications – The research results of this paper have significance in inspiration and theoretical instruction in respect of effectively increasing evolution profits of acquisition organization systems, promoting realization of the purpose of acquisition organization systems, actively boosting evolution, and continuous development of acquisition organization systems. Practical implications – It is shown that the acquisition process is the reasoning game process, development game process, and production game process conducted in turns by the main participating bodies of acquisition during the full life cycle of equipment. This process has clear features of a “link,” and a correlated game link is formed among its different stages. Originality/value – The paper proves that the acquisition process is a process of multi-stage complex games and evolution games, with full life cost control as the key issue in the face of a fixed budget.
The purpose of this paper is to acquire doubly variable precision‐based knowledge rules from incomplete decision tables (IDTs) in the framework of pansystems methodology. It suggests a new variable precision limited tolerance – a special pansystems relation – rough set model with precision inclusion, and a reduction procedure that overcomes the non‐monotonicity in forming tolerance classes when reducing an attribute from the attribute set.
Through introducing variable precision and a limited tolerance relation in the IDT, it constructs a symmetric binary relation, dissimilar to the non‐symmetric relations proposed by others, and then forms tolerance classes. It proposes a new reduction procedure with absolute value calculation to avoid the tolerance classes being non‐monotone. Using variable inclusion, it obtains lower and upper approximations that tolerate noise.
Tolerance classes are not monotone with the reduction of an attribute from the attribute set in the proposed variable precision and limited tolerance relation, but symmetry is preserved. The proposed reduction procedure with absolute value calculation is a new approach to judging whether a reduct is equivalent to the original whole attribute set within an error range.
Using the variable precision and limited tolerance rough set model with variable inclusion to mine deep knowledge from IDTs is a promising paradigm for knowledge discovery in dealing with non‐determinative and vague problems.
The formation of the symmetric tolerance relation is natural. The reduction procedure with absolute value calculation is new and not similar to those existing in the literature.
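The tolerance-class machinery above can be made concrete with a small sketch. This is a deliberate simplification, not the paper's variable precision model: here two objects of an incomplete decision table (missing values written "*") are tolerant when they agree on every attribute at which both values are known and share at least one jointly known attribute, which yields a symmetric relation in the spirit of the limited tolerance approach:

```python
# Illustrative tolerance classes over an incomplete decision table.
MISSING = "*"

def tolerant(x, y):
    """Symmetric tolerance: agreement on all jointly known attributes,
    with at least one attribute known in both objects."""
    known = [(a, b) for a, b in zip(x, y) if a != MISSING and b != MISSING]
    return bool(known) and all(a == b for a, b in known)

def tolerance_classes(table):
    """Map each object index to the set of indices tolerant with it."""
    return {i: {j for j, y in enumerate(table) if tolerant(x, y)}
            for i, x in enumerate(table)}

# Toy incomplete decision table (condition attributes only):
table = [
    ("high", "*",    "yes"),
    ("high", "low",  "yes"),
    ("low",  "low",  "*"),
    ("low",  "high", "no"),
]
print(tolerance_classes(table))
```

Objects 0 and 1 land in each other's tolerance class because the missing value never contradicts a known one; the variable precision refinement of the paper would further relax the membership tests by an inclusion threshold, and its reduction procedure checks how these classes change as attributes are removed.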