June 1997

·

40 Reads

Published by Emerald

Print ISSN: 0368-492X


The concept of fuzzy information becomes a cornerstone of processing and
handling linguistic data. As opposed to numeric information, whose
processing is well understood and fully supported by a vast number of
algorithms, entering the area of linguistic information processing
immediately confronts us with a genuine need to revisit the fundamental
concepts. We first review the notion of information granularity as a
primordial concept playing a key role in human cognition. Dwelling on
that, the study embarks on the concept of communication with fuzzy sets.
In particular, we discuss a so-called fuzzy communication channel. The
idea of communication exploiting fuzzy information calls for efficient
encoding and decoding that subsequently leads to minimal losses of
transmitted information. Interestingly enough, the incurred losses depend
heavily on the granularity of the linguistic information involved; in
this way one can take advantage of the level of uncertainty residing
within the transmitted information granules.

December 2007

·

16 Reads

Purpose
The purpose of this paper is to reveal the multi‐scale relation between power law distribution and correlation of stock returns and to figure out the determinants underlying capital markets.
Design/methodology/approach
The multi‐scale relation between power law distribution and correlation is investigated by comparing the original series with specially constructed series. The intraday-trend-eliminating approach developed by Liu et al. is utilized to analyze the effects of changes in power law decay on correlation properties, and the series-shuffling approach originated by Viswanathan et al. is used to assess the impacts of special types of correlation on the power-law distribution.
Findings
It is found that the accelerating decay of the power law has an insignificant effect on the correlation properties of returns, and the empirical results indicate that time scale may also be an important factor maintaining the power law property of returns besides correlation. When the time scale is below the critical point, the effects of correlation are crucial, and nonlinear long-range correlation exerts the strongest influence. However, for time scales beyond the critical point, the impact of correlation begins to diminish or even finally disappears, and the power law property then shows complete dependence on time scale.
Research limitations/implications
The 5‐min high frequency data of the Shanghai market as the empirical benchmark is insufficient to depict the relation over the entire time scale in the Chinese stock market.
Practical implications
The paper identifies the determinants of market dynamics through analysis of multi-scale relations in order to apply them to risk management, and supports endeavors to introduce a time parameter into further risk measures and control.
Originality/value
The paper provides the empirical evidence that time scale is one of the key determinants of market dynamics by analyzing the multi‐scale relation between power law distribution and correlation.
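The shuffling test mentioned above can be sketched in a few lines: shuffling destroys temporal correlations while leaving the return distribution untouched, which is what allows the distributional and correlation effects to be separated. The synthetic Gaussian series below is an illustrative stand-in for actual stock returns, not data from the paper.

```python
import random

# Shuffling preserves the empirical distribution but scrambles temporal
# order; the series here is synthetic, not real market data.
random.seed(0)
returns = [random.gauss(0, 1) for _ in range(1000)]
shuffled = random.sample(returns, len(returns))  # a random permutation

# Identical values, hence an identical empirical distribution...
assert sorted(returns) == sorted(shuffled)

# ...but any lag-1 autocorrelation is generally destroyed by the shuffle.
def lag1_autocorr(xs):
    mean = sum(xs) / len(xs)
    num = sum((a - mean) * (b - mean) for a, b in zip(xs, xs[1:]))
    den = sum((a - mean) ** 2 for a in xs)
    return num / den

print(lag1_autocorr(returns), lag1_autocorr(shuffled))
```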

September 1992

·

22 Reads

Manufacturing is a key to continuous economic growth. Fuzzy expert systems, fuzzy logics, fuzzy languages, fuzzy neural networks, and intelligent control are proposed as additional tools in manufacturing. Fuzzy logic is a new way to program computers and appliances to mimic the imprecise way humans make decisions. Fuzzy logic has been applied to cameras, subways, computers and air conditioners. Through the use of fuzzy logic, fuzzy expert systems can be built which add a new dimension in the technologies for intelligent factories.
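As a minimal illustration of the fuzzy-logic style of decision making described above, the sketch below uses triangular membership functions and a two-rule Mamdani-style controller. The temperature ranges and output levels are invented for the example, not taken from the paper.

```python
# Triangular membership functions and a two-rule Mamdani-style controller;
# the temperature ranges and output percentages are illustrative assumptions.
def tri(x, a, b, c):
    """Triangular membership peaking at b, zero outside (a, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fan_speed(temp):
    """IF temp is cold THEN fan slow (20%); IF temp is hot THEN fan fast (90%)."""
    cold = tri(temp, 0, 10, 25)
    hot = tri(temp, 20, 35, 50)
    total = cold + hot
    # Defuzzify by a weighted average of the rule outputs.
    return (cold * 20 + hot * 90) / total if total else 50.0

print(fan_speed(15), fan_speed(30))
```

A reading of 15 degrees activates only the "cold" rule (slow fan), 30 degrees only the "hot" rule (fast fan); intermediate readings blend the two.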

October 2009

·

30 Reads

"Mathematicians, like physicists, are pushed by a strong fascination.
Research in mathematics is hard, it is intellectually painful even if it is
rewarding, and you would not do it without some strong urge." [D. Ruelle]. We
shall give some examples from our experience, when we were able to simplify
some serious mathematical models to make them understandable by children,
preserving both aesthetic and intellectual value. The latter is in particularly
measured by whether a given simplification allows setting a sufficient list of
problems feasible for school students.

January 1985

·

24 Reads

The defective coin problem involves identifying the defective coin, if any, and ascertaining the nature of the defect (heavier/lighter) in a set of coins containing at most one defective coin, using an equal-arm pan balance. The solution strategy for minimising the number of weighings required to detect the defective coin is based on a problem-reduction approach involving successive decomposition of the problem into subproblems until it is trivially solved. One of the two types of subproblems is visualised as a combination of a pair of antithetic problems, leading to an optimal solution procedure which is simply a term-by-term merger of the corresponding antithetic procedures. The algorithm is also capable of generating all possible optimal solutions.
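The problem-reduction procedure itself is not reproduced here, but the counting bound that any such procedure must respect is easy to state: each weighing has three outcomes, and with n coins there are 2n + 1 possible hypotheses (each coin heavy or light, or no defect). This is the standard information-theoretic argument, shown as a companion to the text above.

```python
# Lower bound on the number of weighings: k weighings distinguish at most
# 3**k outcomes, and n coins give 2*n + 1 hypotheses.
def min_weighings(n):
    """Smallest k with 3**k >= 2*n + 1."""
    k = 0
    while 3 ** k < 2 * n + 1:
        k += 1
    return k

print(min_weighings(12))  # the classic 12-coin puzzle needs 3 weighings
```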

November 2009

·

259 Reads

Purpose: The mismatches between political discourse and military momentum in the American handling of the Cuban missile crisis are explained by using the model of the potential autopoiesis of subsystems. Under wartime conditions, the codes of political and military communications can increasingly be differentiated. Design/methodology/approach: The model of a further differentiation between political and military power is developed on the basis of a detailed description of the Cuban missile crisis. We introduce the concept of a "semi-dormant autopoiesis" for the difference in the dynamics between peacetime and wartime conditions. Findings: Several dangerous incidents during the crisis can be explained by a sociocybernetic model focusing on communication and control, but not by using an organization-theoretical approach. The further differentiation of the military as a subsystem became possible in the course of the twentieth century because of ongoing learning processes about previous wars.

September 2014

·

674 Reads

Ashby's law of requisite variety states that a controller must have at least
as much variety (complexity) as the controlled. Maturana and Varela proposed
autopoiesis (self-production) to define living systems. Living systems must
also fulfill the law of requisite variety. A measure of autopoiesis has
been proposed as the ratio between the complexity of a system and the
complexity of its environment. Self-organization can be used as a concept to
guide the design of systems towards higher values of autopoiesis, with the
potential of making technology more "living", i.e. adaptive and robust.
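The proposed measure can be sketched numerically if one takes Shannon entropy as the complexity proxy; the distributions below are illustrative assumptions, not measurements from the paper.

```python
import math

# Complexity taken as Shannon entropy (one common choice); the state
# distributions below are invented for the example.
def entropy(p):
    return -sum(x * math.log2(x) for x in p if x > 0)

def autopoiesis(system_dist, env_dist):
    """Ratio of system complexity to environment complexity; a value of at
    least 1 suggests the system has requisite variety for its environment."""
    return entropy(system_dist) / entropy(env_dist)

system = [0.25, 0.25, 0.25, 0.25]   # 2 bits of variety
environment = [0.5, 0.5]            # 1 bit of variety
print(autopoiesis(system, environment))
```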

April 2009

·

63 Reads

Following the initiative of McCulloch and Pitts (1943), there has been much speculation about the achievement of artificial
intelligence using networks of model neurons. The advent of the “perceptron” principle (Rosenblatt 1961; Nilsson 1965) crystallised
something definite and functional out of a mass of diffuse speculation, but it is not difficult to show that the “simple perceptron”
has limited capability (Minsky and Papert 1969). This can be attributed to the fact that all of the changes in weights constituting
its learning are restricted to a single functional layer. The simple training algorithm is possible because all the places
where changes occur are in this one layer and contribute directly to the output of the device, but the range of tasks that
can be learned is drastically limited.
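The single-layer limitation can be demonstrated in a few lines: the classic perceptron rule learns the linearly separable AND function but can never match XOR, whatever the training schedule. The learning rate and epoch count below are arbitrary choices for the example.

```python
# Classic perceptron learning rule on two Boolean tasks. No single-layer
# threshold unit can realize XOR, since XOR is not linearly separable.
def train_perceptron(samples, epochs=20, lr=0.1):
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for (x0, x1), target in samples:
            out = 1 if w[0] * x0 + w[1] * x1 + b > 0 else 0
            err = target - out
            w[0] += lr * err * x0
            w[1] += lr * err * x1
            b += lr * err
    return lambda x0, x1: 1 if w[0] * x0 + w[1] * x1 + b > 0 else 0

AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

f = train_perceptron(AND)
print([f(*x) for x, _ in AND])  # matches the AND targets: [0, 0, 0, 1]
g = train_perceptron(XOR)
print([g(*x) for x, _ in XOR])  # can never equal the XOR targets [0, 1, 1, 0]
```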

May 2008

·

270 Reads

Purpose
The purpose of this paper is to present an implementation of a soft-computing (SC) based navigation approach on a bi-steerable mobile robot, Robucar. This approach must provide Robucar with the capability to acquire the obstacle-avoidance, target-localization, decision-making and action behaviors after learning and adaptation. It uses three neural networks (NNs) and a fuzzy logic (FL) controller to achieve the desired task. The NNs corresponding to obstacle avoidance and target localization are trained using the back-propagation algorithm, the third is based on the reinforcement learning paradigm, while the FL controller uses the Mamdani search-and-match algorithm. Simulation and experimental results are presented, showing the effectiveness of the overall navigation control system.
Design/methodology/approach
In this paper, an interesting navigation approach is applied to a car‐like robot, Robucar, with addition of an action behavior to deal with the generation of smooth motions. Indeed, this approach is based on four basic behaviors; three of them are fused under a neural paradigm using Gradient Back‐Propagation (GBP) and reinforcement learning (RL) algorithms and the last behavior uses a FL controller. It uses a set of suggested rules to describe the control policy to achieve the action behavior.
Findings
In the implemented SC-based navigation, the intelligent behaviors necessary for navigation are acquired by learning using the GBP algorithm and adaptation using FL. The proposed approach provides Robucar with more autonomy, intelligence and real-time processing capabilities. Indeed, the proposed NNs and FL controller are able to remedy the problems of analytical approaches, missing or incorrect environment knowledge, and uncertainties which can lead to undesirable effects such as rough velocity changes. The simulation and experimental results display the ability of the proposed SC-based navigation approach to provide Robucar with the capability to intelligently navigate in an a priori unknown environment, illustrating the robustness and adaptation capabilities of the approach.
Research limitations/implications
This work can be extended to consider mobile obstacles with a velocity higher than the velocity of the robot.
Originality/value
This paper presents a learning approach to navigating a bi‐steerable mobile robot in an unknown environment using GBP and RL paradigms.

November 2009

·

189 Reads

Purpose
– In the tradition of Spencer Brown's Laws of Form, observation was defined in Luhmann's social systems theory as the designation of a distinction. In the sociological design, however, the designation specifies only a category for the observation. The distinction between observation and expectation enables the sociologist to appreciate the processing of meaning in social systems. Seeks to address this issue.
Design/methodology/approach
– The specification of “the observer” in the tradition of systems theory is analyzed in historical detail. Inconsistencies and differences in perspectives are explicated, and the specificity of human language is further specified. The processing of meaning in social systems adds another layer to the communication.
Findings
– Reflexivity about the different perspectives of participant observers and an external observer is fundamental to the sociological discourse. The ranges of possible observations from different perspectives can be considered as second‐order observations or, equivalently, as the specification of an uncertainty in the observations. This specification of an uncertainty provides an expectation. The expectation can be provided with (one or more) values by observations. The significance of observations can be tested when the expectations are properly specified.
Originality/value
– The expectations (second‐order observations) are structured and therefore systemic attributes to the discourse. However, the metaphor of a (meta‐)biological observer has disturbed the translation of social systems theory into sociological discourse. Different discourses specify other expectations about possible observations. By specifying second‐order observations as expectations, social systems theory and sociocybernetics can combine the constructivist with an empirical approach.

October 2010

·

821 Reads

Purpose
The purpose of this paper is to present a new core hypothesis on laughter. It has been built by putting together ideas from several disciplines: neurodynamics, evolutionary neurobiology, social networks, and communication studies. The hypothesis focusses on the social nature of laughter and contributes to ascertaining its evolutionary origins in connection with the cognitive and social-emotional functions it performs.
Design/methodology/approach
An in-depth examination of laughter in the social communication context and along the life cycle of the individual is performed. This instinctive behaviour, which appears as a "virtual", non-physical form of "grooming", would serve as a bond-making instrument in human groups. Further, the neurodynamic events underlying laughter production, and particularly the form of the neural entropy gradients, are congruent with a sentic hypothesis about the different emotional contents of laughter and their specific effects on bonding dynamics.
Findings
The new behavioural and neurodynamic tenets introduced about this unusual sound feature of our species justify the ubiquitous presence it has in social interactions at large and along the life cycle of the individual. Laughter, far from being a curious evolutionary relic or a rather inconsequential innate behaviour, should be considered as a highly efficient tool for inter-individual problem solving and for maintenance of social bonds.
Originality/value
Laughter, the authors would conclude, has been evolutionarily kept and augmented as an optimized tool for unconscious cognitive-emotional problem solving, and at the same time as a useful way to preserve the essential fabric of social bonds in close-knit groups and within human societies at large.

February 2004

·

119 Reads

The sets of contexts and properties of a concept are embedded in the complex Hilbert space of quantum mechanics. States are unit vectors or density operators, and contexts and properties are orthogonal projections. The way calculations are done in Hilbert space makes it possible to model how context influences the state of a concept. Moreover, a solution to the combination of concepts is proposed. Using the tensor product, a procedure for describing combined concepts is elaborated, providing a natural solution to the pet fish problem. This procedure allows the modeling of an arbitrary number of combined concepts. By way of example, a model for a simple sentence containing a subject, a predicate and an object, is presented. Comment: 21 pages, to appear in the journal 'Kybernetes' in the Summer of 2004
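The tensor-product construction can be illustrated with a toy calculation; the two-dimensional spaces and amplitudes below are invented for the example and are far smaller than anything used in the paper.

```python
import math

# Two concept states as unit vectors in small (real) Hilbert spaces,
# combined via the tensor (Kronecker) product; values are illustrative.
pet = [0.8, 0.6]    # unit vector representing a state of "pet"
fish = [0.6, 0.8]   # unit vector representing a state of "fish"

# Tensor product of the two states: a state of the combined concept.
combined = [p * f for p in pet for f in fish]

norm = math.sqrt(sum(c * c for c in combined))
print(combined, norm)   # the combined state is again a unit vector
```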

February 2004

·

393 Reads

We propose a theory for modeling concepts that uses the state-context-property theory (SCOP), a generalization of the quantum formalism, whose basic notions are states, contexts and properties. This theory enables us to incorporate context into the mathematical structure used to describe a concept, and thereby model how context influences the typicality of a single exemplar and the applicability of a single property of a concept. We introduce the notion `state of a concept' to account for this contextual influence, and show that the structure of the set of contexts and of the set of properties of a concept is a complete orthocomplemented lattice. The structural study in this article is a preparation for a numerical mathematical theory of concepts in the Hilbert space of quantum mechanics that allows the description of the combination of concepts (see quant-ph/0402205). Comment: 20 pages, to appear in the journal 'Kybernetes' in the Summer of 2004

December 2002

·

87 Reads

Necessary and sufficient conditions allowing a previously unknown space to be explored through scanning operators are reexamined with respect to measure theory. Generalized conceptions of distances and dimensionality evaluation are proposed, together with their conditions of validity and range of application to topological spaces. The existence of a Boolean lattice with fractal properties originating from non-well-founded properties of the empty set is demonstrated. This lattice provides a substrate with both discrete and continuous properties, from which the existence of physical universes can be proved, up to the function of conscious perception. Spacetime emerges as an ordered sequence of mappings of closed 3-D Poincaré sections of a topological 4-space provided by the lattice. The possibility of the existence of spaces with fuzzy dimension or with adjoined parts with decreasing dimensions is raised, together with possible tools for their study. The work provides the introductory foundations supporting a new theory of space whose physical predictions (suppressing the opposition of quantum and relativistic approaches) and experimental proofs are presented in detail in Parts 2 and 3 of the study.

January 2003

·

62 Reads

Spacetime is represented by ordered sequences of topologically closed Poincaré sections of the primary space constructed of primary empty cells. These mappings are constrained to provide homeomorphic structures serving as frames of reference in order to account for the successive positions of any objects present in the system. Mappings from one section to the next involve morphisms of the general structures. Discrete properties of the lattice allow the prediction of scales at which microscopic to cosmic structures should occur. Deformations of primary cells by exchange of empty set cells allow a cell to be mapped into an image cell in the next section as long as the mapped cells remain homeomorphic. If a deformation involves a fractal transformation to objects, there occurs a change in the dimension of the cell and the homeomorphism is not conserved. The fractal kernel stands for a "particle" and the reduction of its volume is compensated by morphic changes of a finite number of surrounding cells. Quanta of distances and quanta of fractality are demonstrated. The interaction of a moving particle-like deformation with the surrounding lattice involves a fractal decomposition process that supports the existence and properties of previously postulated inerton clouds associated with particles. Experimental evidence and further possibilities of the existence of inertons are proposed.

February 2003

·

42 Reads

The distribution of the deformations of elementary cells is studied in an abstract lattice constructed from the existence of the empty set. One combination rule determining oriented sequences with continuity of the set-distance function in such spaces provides a particular kind of spacetime-like structure that favors the aggregation of such deformations into fractal forms standing for massive objects. A correlative dilatation of space appears outside the aggregates. At the large scale, this dilatation results in an apparent expansion, while at the submicroscopic scale the families of fractal deformations give rise to families of particle-like structures. The theory predicts the existence of classes of spin, charges, and magnetic properties, while quantum properties associated with mass have previously been shown to determine the inert mass and the gravitational effects. When applied to our observable spacetime, the model would justify the creation of mass in a specified kind of "void", and the fractal properties of the embedding lattice extend the phenomenon to formal justifications of Big-Bang-like events without the need for any supply of extemporaneous energy.

February 2011

·

126 Reads

In an excitable Delaunay triangulation every node takes three states
(resting, excited and refractory) and updates its state in discrete time
depending on a ratio of excited neighbours. All nodes update their states in
parallel. By varying excitability of nodes we produce a range of phenomena,
including reflection of excitation wave from edge of triangulation, backfire of
excitation, branching clusters of excitation and localized excitation domains.
Our findings contribute to studies of propagating perturbations and waves in
non-crystalline substrates.
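The update rule described above can be sketched on an arbitrary graph (a Delaunay triangulation is just one particular neighbour structure); the excitability threshold here is an illustrative choice, not a value from the paper.

```python
# Three-state excitable update on a small graph; all nodes update in
# parallel, and a resting node fires when the fraction of excited
# neighbours reaches the threshold.
RESTING, EXCITED, REFRACTORY = 0, 1, 2

def step(states, neighbors, threshold=0.3):
    new = {}
    for node, s in states.items():
        if s == EXCITED:
            new[node] = REFRACTORY
        elif s == REFRACTORY:
            new[node] = RESTING
        else:
            nbrs = neighbors[node]
            frac = sum(states[n] == EXCITED for n in nbrs) / len(nbrs)
            new[node] = EXCITED if frac >= threshold else RESTING
    return new

# A triangle with one initially excited node: the excitation spreads to
# both neighbours while the source becomes refractory.
neighbors = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
states = {0: EXCITED, 1: RESTING, 2: RESTING}
states = step(states, neighbors)
print(states)
```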

October 1996

·

68 Reads

Provides an overview of major developments pertaining to generalized information theory during the lifetime of Kybernetes. Generalized information theory is viewed as a collection of concepts, theorems, principles, and methods for dealing with problems involving uncertainty‐based information that are beyond the narrow scope of classical information theory. Introduces well‐justified measures of uncertainty in fuzzy set theory, possibility theory, and Dempster‐Shafer theory. Shows how these measures are connected with the classical Hartley measure and Shannon entropy. Discusses basic issues regarding some principles of generalized uncertainty‐based information.
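The two classical measures that the generalized ones connect with are easy to state concretely; note that on a uniform distribution the Shannon entropy reduces to the Hartley measure.

```python
import math

# Hartley measure for a finite set of alternatives, and Shannon entropy
# for a discrete probability distribution, both in bits.
def hartley(n_alternatives):
    return math.log2(n_alternatives)

def shannon(p):
    return -sum(x * math.log2(x) for x in p if x > 0)

print(hartley(8))                   # 3.0 bits for 8 alternatives
print(shannon([0.5, 0.25, 0.25]))   # 1.5 bits
# On a uniform distribution, Shannon entropy equals the Hartley measure.
assert abs(shannon([0.25] * 4) - hartley(4)) < 1e-12
```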

September 2012

·

642 Reads

Purpose
– The purpose of this paper is to develop experimental laboratory biological techniques for approximation of principal transport networks, optimizing transport links, and developing optimal solutions to current transport problems. It also aims to study how the slime mould Physarum polycephalum approximates autobahn networks in Germany.
Design/methodology/approach
– The paper considers the 21 most populous urban areas in Germany. It represents these areas with sources of nutrients placed on the slime mould's growing substrate at positions corresponding to the areas. At the beginning of each experiment the slime mould is inoculated in the Berlin area. The slime mould exhibits foraging behavior and spans the sources of nutrients (which represent urban areas) with a network of protoplasmic tubes (which approximate vehicular transport networks). The study analyzes the structure of the transport networks developed by the slime mould and compares it with families of known proximity graphs. It also imitates the slime mould's response to a simulated disaster by placing sources of chemo-repellents at the positions of nuclear power plants.
Findings
– It is found that the plasmodium of Physarum polycephalum develops a minimal approximation of a transport network spanning the urban areas. The Physarum-developed network matches the autobahn network very well. The high degree of similarity is preserved even when high-demand constraints are placed on the repeatability of links in the experiments. Physarum approximates almost all major transport links. In response to a sudden disaster gradually spreading from its epicenter, the Physarum transport networks react by abandoning transport links affected by the disaster zone, enhancing those not directly affected by the disaster, sprouting massively from the epicenter, and increasing scouting activity in the regions distant from the epicenter of the disaster.
Originality/value
– Experimental methods and computer analysis techniques presented in the paper lay a foundation of novel biological laboratory approaches to imitation and prognostication of socio‐economical developments.

March 2003

·

71 Reads

Various aspects of the Framsticks system are described. The system is a universal tool for modeling, simulating and optimizing virtual agents with three-dimensional bodies and embedded control systems. The simulation model is described first. Then the features of the system framework are presented, with typical and potential applications. Specific tools supporting human understanding of evolutionary processes, control and behaviors are also outlined. Finally, the most interesting research experiments are summarized.

June 1999

·

487 Reads

The symbol-based, correspondence epistemology used in AI is contrasted with the constructivist, coherence epistemology promoted by cybernetics. The latter leads to bootstrapping knowledge representations, in which different parts of the cognitive system mutually support each other. Gordon Pask's entailment meshes and their implementation in the THOUGHTSTICKER program are reviewed as a basic application of this approach. Entailment meshes are then extended to entailment nets: directed graph representations governed by the bootstrapping axiom, determining which concepts are to be distinguished or merged. This allows a constant restructuring and elicitation of the conceptual network. Semantic networks and frame-like representations with inheritance can be expressed in this very general scheme by introducing a basic ontology of node and link types. Entailment nets are then generalized to associative nets characterized by weighted links. Learning algorithms are presented which ...

November 1998

·

258 Reads

This paper is devoted to a discussion of the relation between computer proof and human proof. It is a discussion of the relationship of persons and machines. In Section 2 we discuss these issues in general and specifically in regard to the recent solution of the Robbins Problem via a proof generated by computer. Section 3 gives the mathematical background for the Robbins problem. The Robbins problem was a longstanding open problem about axioms for Boolean algebra. One point of this paper is to show that the proof of the Robbins conjecture, generated by a computer, can be filled in and understood by human beings. We accomplish this aim in the present paper by presenting a notational reformulation of Boolean algebra and the Robbins Problem in Sections 4 and 5. This reformulation is called "box notation". Section 6 discusses cybernetic and semiotic issues. Section 7 is a formal presentation of a proof, in box notation, that Robbins algebras are Boolean. It should be mentioned that the notational reformulation given herein is most effective for a person when he or she actually uses the notation, writing out and making formulas and diagrams by hand. This activity allows the combination of hand and eye to correlate the patterns inherent in the formulas. Sections 5 and 7 should be regarded as guides and injunctions for such personal work. As far as this author knows, such work is indispensable for reaching understanding. Each person who encounters a piece of mathematics must hold a conversation with it. This conversation almost always includes the development of individual language and notation to enable that person to grasp the mathematics. This paper is an invitation to enter that domain of linguistic creativity through which mathematical understanding can arise. The issue d...

November 2002

·

43 Reads

This paper describes a novel type of artistic Artificial Life environment. Evolving agents, who have the ability to make and listen to sound, populate a synthetic world. An evolvable, rule-based system drives agent behaviour. Agents compete for limited resources in a virtual environment that is influenced by the presence and movement of the artwork's audience. Through a link between the real and virtual spaces, virtual agents evolve implicitly to try to maintain the interest of the human audience.

May 2011

·

37 Reads

Purpose
This paper aims to review current research with particular reference to new research and development initiatives.
Design/methodology/approach
A general review and survey of selected research and development topics is given and some new challenges and applications of future technologies are considered with particular reference to their impact and potential.
Findings
The paper illustrates the multi‐ and trans‐disciplinary nature of studies in cybernetics, systems and management sciences, with a view to further research and development activity.
Practical implications
The choice of review provides an awareness of current trends in these areas of endeavour.
Originality/value
The reviews are selected from a global database and give a studied assessment of current research and development initiatives.

April 2009

·

59 Reads

Purpose
The purpose of this paper is to apply grey system theory to the population system and to project China's population.
Design/methodology/approach
The paper applies the GM(1,1) model to China's population projections. Two key aspects of the method are crucial for obtaining the best accuracy of prediction: the choice of the length of the original data series used in the model, and the adoption of the metabolic GM(1,1) model in prediction. The former determines which initial data are used, while the latter describes an iterative process for carrying the prediction forward.
Findings
The results show that in 2015 China's population will reach 1.37 billion and in 2050 it will be between 1.42 and 1.48 billion, which is in accordance with the latest projections from the UN. The findings show the GM(1,1) metabolic model is an effective mathematical means in population projections.
Research limitations/implications
The paper suggests that GM(1,1) metabolic model can provide an effective simulation model for complicated systems with uncertainty and can be used in many fields.
Practical implications
The paper provides useful advice for the department of population.
Originality/value
Most population projections have been based on assumptions about fertility, mortality, and migration. The paper considers the population system as a grey system and introduces the GM(1,1) metabolic model to population projections.
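A minimal, non-metabolic GM(1,1) can be sketched as below; the metabolic variant described in the abstract would re-fit the model after appending each new prediction. The data series here is synthetic and illustrative, not China's population figures.

```python
import math

# Minimal GM(1,1) grey prediction model on a short synthetic series.
def gm11(x0, steps=1):
    """Fit GM(1,1) to the series x0 and forecast `steps` further values."""
    n = len(x0)
    # 1-AGO: accumulated generating operation.
    x1 = [sum(x0[:i + 1]) for i in range(n)]
    # Background values: means of consecutive AGO terms.
    z = [0.5 * (x1[k] + x1[k - 1]) for k in range(1, n)]
    # Least-squares estimates of the development coefficient a and grey
    # input b from the grey differential equation x0[k] + a*z[k] = b.
    m = n - 1
    sz = sum(z)
    szz = sum(v * v for v in z)
    sy = sum(x0[1:])
    szy = sum(zk * yk for zk, yk in zip(z, x0[1:]))
    det = m * szz - sz * sz
    a = (sz * sy - m * szy) / det
    b = (szz * sy - sz * szy) / det

    def x1_hat(k):
        # Time response of the whitened equation at (0-based) index k.
        return (x0[0] - b / a) * math.exp(-a * k) + b / a

    # Inverse AGO turns predicted cumulative values back into the series.
    return [x1_hat(k) - x1_hat(k - 1) for k in range(n, n + steps)]

print(gm11([100, 104, 108, 112, 117], steps=2))
```

For a nearly geometric input series like this one, the fitted exponential continues the roughly 4% growth trend.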
