Information 2011, 2, 697-726; doi:10.3390/info2040697
information
ISSN 2078-2489
www.mdpi.com/journal/information
Article
Epistemic Information in Stratified M-Spaces
Mark Burgin
Department of Mathematics, University of California, Los Angeles, 405 Hilgard Ave. Los Angeles,
CA 90095, USA; E-Mail: markburg@cs.ucla.edu
Received: 15 September 2011; in revised form: 24 November 2011 / Accepted: 1 December 2011 /
Published: 16 December 2011
Abstract: Information is usually related to knowledge. However, the recent development
of information theory demonstrated that information is a much broader concept, being
actually present in and virtually related to everything. As a result, many unknown types
and kinds of information have been discovered. Nevertheless, information that acts on
knowledge, bringing new and updating existing knowledge, is of primary importance to
people. It is called epistemic information, which is studied in this paper based on the
general theory of information and further developing its mathematical stratum. As a
synthetic approach, which reveals the essence of information, organizing and
encompassing all main directions in information theory, the general theory of information
provides efficient means for such a study. Different types of information dynamics
representation use tools of mathematical disciplines such as the theory of categories,
functional analysis, mathematical logic and algebra. Here we employ algebraic structures
for exploration of information and knowledge dynamics. In Introduction (Section 1), we
discuss previous studies of epistemic information. Section 2 gives a compressed
description of the parametric phenomenological definition of information in the general
theory of information. In Section 3, anthropic information, which is received, exchanged,
processed and used by people is singled out and studied based on the Componential Triune
Brain model. One of the basic forms of anthropic information called epistemic information,
which is related to knowledge, is analyzed in Section 4. Mathematical models of epistemic
information are studied in Section 5. In Conclusion, some open problems related to
epistemic information are given.
Keywords: information; knowledge; information operator; stratification; epistemic
information; composition; cognition; algebra; equivalence
1. Introduction
Explication and clarification of the concept of information requires not only elaboration and
exploration of a general unified definition of information but also needs separation and examination of
basic special types of information. The general theory of information gives the most general definition
of information, organizing and encompassing all main directions in information theory [1]. According
to the main principle of the general theory of information, which impeccably correlates with people's
practice and observations, when information comes to a system and is accepted, it causes changes in
the system. This feature of information makes it reasonable to model information by operators and
information dynamics by actions of these operators. This approach is adopted in the general theory
of information.
In addition, the general theory of information provides efficient means for information classification
and study of the basic information types. One such type is information that is received, exchanged,
processed and used by people. It is called anthropic information. We explain the methodological
principles of the anthropic information explication and classification, which bring us to cognitive
information and its important subclass called epistemic information.
Portions of epistemic information are modeled/represented by epistemic information operators
acting in spaces of knowledge, which are represented by a formal construction called a Mizzaro space.
These spaces consist of knowledge items often unified by structural relations.
In a general setting, epistemic information has been studied by different authors. Bar-Hillel and
Carnap [2], Hintikka [3-5] and Israel and Perry [6] explored information in knowledge represented by
means of mathematical logic. Shreider [7], Mackay [8], Brookes [9], Mizzaro [10,11] and
Gackowski [12] base their theories on the following assumption:
Information is a change in a knowledge system
Later this principle has been made more exact [13] and formulated as
Epistemic information is a change in a knowledge system
The general theory of information [1] makes the next step towards a better understanding of epistemic
information. Namely, it is explained that
Epistemic information is a capacity to cause changes
in a knowledge system and it is possible to measure
this capacity by changes in the knowledge system
impacted by information
Such changes in the knowledge system of a system R reflect accepted information. Note that
information can be accepted not only as true information but also as false information. In this case,
changes in the knowledge system can result in exclusion of some knowledge or in labeling this
knowledge as false, e.g., treating it as a misconception or blunder.
The approach of Mizzaro [10,11,13] and Burgin [1] to epistemic information does not consider
knowledge in general, but makes use of the term a knowledge state (KS) of a dynamic knowledge
system or a cognitive agent. It is postulated that knowledge of a knowledge system/agent consists of
atomic components called knowledge items (KI) and the number of these items in a portion of
knowledge gives an adequate measure of this knowledge. Such knowledge states are set-theoretical or
unstructured Mizzaro spaces defined in [1].
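To make this first approximation concrete, here is a minimal illustrative sketch in Python (not taken from the paper; all names are ours): a knowledge state is a set of knowledge items, the amount of knowledge is the number of items, and a portion of epistemic information is measured by the change it causes, taken here as the symmetric difference between the old and the new state.

from typing import FrozenSet

KnowledgeItem = str
KnowledgeState = FrozenSet[KnowledgeItem]

def knowledge_measure(ks: KnowledgeState) -> int:
    # First-approximation measure: the number of knowledge items in the state.
    return len(ks)

def information_measure(before: KnowledgeState, after: KnowledgeState) -> int:
    # Measure an information portion by the change it causes:
    # items added plus items removed (symmetric difference).
    return len(before ^ after)

ks_old: KnowledgeState = frozenset({"the Earth is flat"})
ks_new: KnowledgeState = frozenset({"the Earth is round"})
print(knowledge_measure(ks_new), information_measure(ks_old, ks_new))  # 1 2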
This is the first approximation to modeling information processes related to knowledge
transformations. In it, the substantial dimensions of knowledge are reflected, while the relational
dimensions, which describe relations between knowledge items, are studied in higher-level models of
operational information theory, which goes beyond the scope of this paper. Taking into account such
relations allows one to achieve higher precision in measuring information and knowledge. However,
there are many situations where precision provided by quantities of knowledge items is sufficient. For
instance, exactly this level of precision is successfully used in software engineering and technology
where program instructions, which belong to the procedural type of knowledge, are used as knowledge
items that determine measures of information and knowledge transformations [14].
Taking knowledge states as the base, Mizzaro [13] considers and utilizes natural set-theoretical
operations, such as union, difference, complement, and intersection, as well as set-theoretical relations,
such as inclusion, emptiness, and membership, on the set of all knowledge states. These and more
sophisticated operations and relations are advanced to the level of local information operators and
studied in [1]. Namely, in the context of the general theory of information, transformations in
knowledge states caused by receiving data (messages) are represented by local information operators,
which are projections of global information operators also described in [1]. Consequently, the theory
of Mizzaro spaces and local information operators in these spaces is a localization of the mathematical
(formalized) stratum of the general theory of information.
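As a rough illustration of how such set-theoretical operations are advanced to local information operators, the following Python sketch (a simplified representation with assumed names) treats a local operator as a map from knowledge states to knowledge states built from union and difference.

from typing import Callable, FrozenSet

KnowledgeState = FrozenSet[str]
LocalOperator = Callable[[KnowledgeState], KnowledgeState]

def add_items(items: FrozenSet[str]) -> LocalOperator:
    # Set union: accept new knowledge items into the knowledge state.
    return lambda ks: ks | items

def delete_items(items: FrozenSet[str]) -> LocalOperator:
    # Set difference: discard knowledge items from the knowledge state.
    return lambda ks: ks - items

ks: KnowledgeState = frozenset({"a", "b"})
print(add_items(frozenset({"c"}))(ks))      # a state containing 'a', 'b', 'c'
print(delete_items(frozenset({"b"}))(ks))   # a state containing only 'a'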
In this paper, we continue exploration of epistemic information, extending Mizzaro spaces to M-
spaces, which represent and model information dynamics by algebraic structures. In addition to
uniform Mizzaro spaces studied in [1,11,13], here we consider Mizzaro spaces and M-spaces with
stratification and extend the scope of epistemic information operators. In [1,11,13], only two operators,
addition and deletion, are studied. Here we consider five basic epistemic information operators:
addition, deletion, moving, replication, and transformation, as well as their compositions. Section 2
gives a compressed description of the parametric phenomenological definition of information in the
general theory of information. In Section 3, anthropic information that is received, exchanged,
processed and used by people is separated and studied based on the Componential Triune Brain model
introduced in [15] and further developed in [1]. This model allows overcoming the pitfalls of the
simple linear hierarchy suggested by MacLean [16,17] by considering basic systems that are not
anatomically localized but distributed in the brain and by including such an important psychological
construct as the Will [18,19] in the brain structure. Based on this model, one of the basic forms of anthropic
information called epistemic information, which is related to knowledge, is analyzed in Section 4.
Mathematical models of epistemic information are studied in Section 5. In Conclusion, we give some
open problems related to epistemic information.
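As a forward-looking illustration of the operators named above, the sketch below (illustrative only; operator names and signatures are assumed) treats epistemic information operators as maps on knowledge states and shows that their compositions are again such maps.

from functools import reduce
from typing import Callable, FrozenSet

KnowledgeState = FrozenSet[str]
Operator = Callable[[KnowledgeState], KnowledgeState]

def compose(*ops: Operator) -> Operator:
    # Apply the operators left to right; a composition of epistemic
    # information operators is again an epistemic information operator.
    return lambda ks: reduce(lambda state, op: op(state), ops, ks)

addition: Operator = lambda ks: ks | {"new item"}
deletion: Operator = lambda ks: ks - {"old item"}

print(compose(addition, deletion)(frozenset({"old item"})))  # only "new item" remains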
2. The Concept of Information
Our study of epistemic information is based on the system of ontological principles from the general
theory of information. All these principles serve as an extended definition of information. A concise
definition is given in the second ontological principle, which has several forms.
Ontological Principle O2 (the General Transformation Principle). In a broad sense, information
for a system R is a capacity to cause changes in the system R.
Thus, we may understand information in a broad sense as a capacity (ability or potency) of things,
both material and abstract, to change other things. Information exists in the form of portions of
information. Informally, a portion of information is such information that can be separated from other
information. Information is, as a rule, about something. What information is about is called an object
of this information.
The Ontological Principle O2 has several consequences. First, it demonstrates that information is
closely connected to transformation. Namely, it means that information and transformation are
functionally similar because they both point to changes in a system [20]. At the same time, they are
different because information is potency for (or in some sense, cause of) change, while transformation
is the change itself, or in other words, transformation is an operation, while information is what
induces this operation.
Second, the Ontological Principle O2 explains why information influences society and individuals
all the time, as well as why this influence grows with the development of society. Namely, reception of
information by individuals and social groups induces transformation. In this sense, information is
similar to energy. Moreover, according to the Ontological Principle O2, energy is a kind of
information in a broad sense. This correlates well with Carl Friedrich von Weizsäcker's idea [21,22]
that energy might in the end turn out to be information.
Third, the Ontological Principle O2 makes it possible to separate different kinds of information. For
instance, people, as well as any computer, have many kinds of memory. It is even supposed that each
part of the brain has several types of memory agencies that work in somewhat different ways, to suit
particular purposes [23]. It is possible to consider each of these memory agencies as a separate system
and to study differences between information that changes each type of memory. This might help to
understand the interplay between stability and flexibility of mind, in general, and memory,
in particular.
In essence, we can see that all kinds and types of information are encompassed by the Ontological
Principle O2.
However, the common usage of the word information does not imply such wide generalizations as
the Ontological Principle O2 implies. Thus, we need a more restricted theoretical meaning because an
adequate theory, whether of information or of anything else, must be in significant accord with our
common ways of thinking and talking about what the theory is about, else there is the danger that the
theory is not about what it purports to be about. To achieve this goal, we use the concept of an
infological system IF(R) of the system R for the information definition. This is done in two steps. First,
we make the concept of information relative, and then we choose a specific class of infological systems
to specify information in the strict sense. That is why it is impossible, as well as
counterproductive, to give an exact and thus too rigid and restricted definition of an infological system.
Indeed, information is too rich and widespread a phenomenon to be reflected by a restricted rigid
definition (cf., for example, [24,25]).
The concept of infological system plays the role of a free parameter in the general theory of
information, providing for representation of different kinds and types of information in this theory.
That is why the concept of infological system, in general, should not be limited by boundaries of exact
definitions. A free parameter must really be free. Identifying an infological system IF(R) of a system R,
we can define information relative to this system. This definition is expressed in the following
principle.
Ontological Principle O2g (the Relativized Transformation Principle). Information for a system
R relative to the infological system IF(R) is a capacity to cause changes in the system IF(R).
As a model example of an infological system IF(R) of an intelligent system R, we take the system of
knowledge of R. In cybernetics, it is called the thesaurus Th(R) of the system R. Another example of
an infological system is the memory of a computer. Such a memory is a place in which data and
programs are stored and is a complex system of diverse components and processes.
Elements from IF(R) are called infological elements.
There is no exact definition of infological elements although there are various entities that are
naturally considered as infological elements as they allow one to build theories of information that
inherit conventional meanings of the word information. For instance, knowledge, data, images,
algorithms, procedures, scenarios, ideas, values, goals, ideals, fantasies, abstractions, beliefs, and
similar objects are standard examples of infological elements.
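A conceptual sketch of this relativized view, with all names assumed rather than taken from the theory's formal apparatus, models an infological system IF(R) as a set of infological elements and a portion of information as an operator whose acceptance changes IF(R).

from typing import Callable, FrozenSet

InfologicalSystem = FrozenSet[str]                       # e.g., a thesaurus Th(R)
InformationPortion = Callable[[InfologicalSystem], InfologicalSystem]

def accept(if_r: InfologicalSystem, info: InformationPortion) -> InfologicalSystem:
    # Accepting a portion of information realizes its capacity to change IF(R).
    return info(if_r)

learn: InformationPortion = lambda th: (th - frozenset({"the Earth is flat"})) | frozenset({"the Earth is round"})
print(accept(frozenset({"the Earth is flat"}), learn))   # the updated thesaurus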
When we take a physical system D as the infological system and allow only for physical changes,
we see that information with respect to D coincides with (physical) energy.
Taking a mental system B as the infological system and considering only mental changes,
information with respect to B coincides with mental energy.
These ideas are crystallized in the following principle.
Ontological Principle O2a (the Special Transformation Principle). Information in the strict sense
or proper information or, simply, information for a system R, is a capacity to change structural
infological elements from an infological system IF(R) of the system R.
As the concept of mental energy is much less understood than the concept of physical energy, we
give some explanations based on the origin, development and contemporary understanding of
mental energy.
Considering mental energy on the level of individual mentality, it is possible, as the first
approximation, to equate it with psychic energy. The concept of psychic energy first entered
physiology and to some extent psychology through the discussions of Ernst Brücke, Hermann
Helmholtz, and Emil Du Bois-Reymond, who, during the years 1838-1842, worked in the laboratory of
the German physiologist Johannes Müller. At the same time, according to Jung [26], Nicolas von Grot
was the first to explicitly define the concept of psychic energy. He wrote [27]:
“The concept of psychic energy is as much justified in science as that of physical energy, and
psychic energy has just as many quantitative measurements and different forms as has
physical energy.”
Later this concept was further developed by Sigmund Freud [28,29], Brücke’s student, and then by
Carl Jung, Freud’s student. Jung [26] regarded psychic energy as a basic life-force, manifesting itself
through actions, such as eating, moving, thinking, sex, remembering, etc.
Contemporary understanding determines mental energy as the ability to perform mental tasks, the
intensity of feelings of energy/fatigue, and the motivation to accomplish mental and physical
tasks [30,31]. This shows that basic aspects of mental energy manifestation include: (a) cognition
(knowledge that is gained through perception, reasoning or intuition), (b) changing of moods or
feelings (states of mind), and (c) motivation (an incentive for action). Factors that can influence mental
energy include, among others, psychological issues such as interest, passion, desire, concern, and
biological issues such as genetics, nutrition, pain, and sleep [31].
Often psychic/mental energy is confused with psychic/mental force. For instance, some
psychoanalysts do not distinguish energy and force, particularly, when they follow Freud's occasional
practice of calling libido itself a force. Actually, as in physics, energy and force are different
phenomena. Energy is a potential to make changes, while force is what is making changes. This allows
one to take the Will as a representative of force in the individual mentality and above. This is
expressed in the remark attributed to Loewenstein that "force is energy in action" (cf. [32]).
3. The Phenomenological Stratum of the General Theory of Information
In this paper, we are primarily interested in information received, exchanged, processed and used by
people. We call it anthropic information. A relevant infological system IF(R) for anthropic information
is the human brain. Thus, to further classify and study anthropic information, we need to utilize our
knowledge about the structure and functions of the brain.
In our case, the most relevant is the Triune Brain model introduced and studied by Paul MacLean
[33,34]. The main conception of his approach is the existence of three levels of perception and action that
are controlled by three corresponding centers of perception in the human brain. These three centers
together form the Triune Brain and have the structure of a triad. It is natural to call MacLean's initial
structure the Anatomic Triune Brain model because it is based on the anatomy of
the brain where three indispensable parts are distinguished: the neocortex, limbic system and
R-complex.
According to the theory of MacLean, the neural basis, or framework, of the brain consists of three
parts: the spinal cord, hindbrain, and midbrain. In addition to it, centuries of evolution have endowed
people with three distinct cerebral systems. The oldest of these is called the reptilian brain or
R-complex. It programs behavior that is primarily related to instinctual actions based on ancestral
learning and memories, satisfying basic needs such as self-defense, reproduction and digestion. The
reptilian brain is fundamental in acts such as primary motor functions, primitive sensations,
dominance, establishing territory, hunting, breeding, and mating.
Through evolution, people have developed a second cerebral system, the limbic system, which
MacLean refers to as the paleomammalian brain and which contains hippocampus, amygdala,
hypothalamus, pituitary gland, and thalamus. This system is situated around the R-complex, is shared
by humans with other mammals, and plays an important role in human emotional behavior.
The most recent addition to the cerebral hierarchy is called the neomammalian brain, or the
neocortex. It constitutes 85% of the whole human brain mass and receives its information from the
external environment through the eyes, ears, and other organs of senses. This brain component
(neocortex) contains cerebrum, corpus callosum, and cerebral cortex. The cerebrum and cerebral
cortex are divided into two hemispheres, while the corpus callosum connects these hemispheres. The
neocortex deals with information in a logical and algorithmic way. It governs people's creative and
intellectual functions like social interaction and advance planning. The left hemisphere works with
symbolic information, applying step-by-step reasoning, while the right hemisphere handles images
processed by massively parallel (gestalt) algorithms.
Even psychologists who have objections to the Anatomic Triune Brain model admit that it is a
useful, although oversimplified, metaphor, as the structure presented as the triune brain is based on a
sound idea of three functional subsystems of the brain [16,35]. In the development of neurophysiology
and neuropsychology, MacLean’s theory was used as a base for the Whole Brain model, developed by
Herrmann [36]. The main idea of this development is a synthesis of the Anatomic Triune Brain model
with the two-hemisphere approach to the brain functioning.
The theory of the triune brain (reptilian, old mammalian and new mammalian) is used as a metaphor
and a model of the interplay between instinct, emotion, and rationality in humans. Cory [37] applied
this model to economic and political structures. In Cory’s schema, the reptilian brain mediates the
claims of self-interest, whereas the old mammalian brain mediates the claims of empathy. If selfish
interests of an individual are denied for too long, there is discontent due to a feeling of being unjustly
treated. If empathic interests are denied for too long, there is discontent due to guilt. In either case, the
center of intelligence at the prefrontal cortex plays the role of a mediator. Its executive function is
required to restore balance, generating the reciprocity required for effective social and
economic structures.
The Triune Brain model is used to explain hyperactivity of youngsters studied by Zametkin [38] and
other researchers. Peter Levine bases his approach to trauma treatment on the Triune Brain model [39].
According to Levine, there are three types of uniform stress and relaxation responses to a
threatening situation that are active in all animal species through the autonomic nervous system. In the
everyday language, these responses are metaphorically called fight, flight or freeze. The first two of
them are well-known, while the third one was introduced by Levine. The freezing or immobility
response has evolved over millions of years and it has served an adaptive purpose well for all
species—except humans. In an individual, it can lead to trauma. Many physical ailments are actually
residues of thwarted trauma reactions incurred during stressful events. What usually happens to non-
human species is that after the threatening situation resolves itself, the animal forgets the stress and
goes on its way without being traumatized.
In contrast to this, people can get stuck in the freezing response, while the reasoning mind resists or
blocks the natural bodily sensations and fine motor movements needed to come out of the freeze
response. The contemporary rationalistic culture is not helpful in supporting people in such a
traumatizing situation. The feelings that people go through after experiencing a traumatic event are
outside of their voluntary control, often being frightening and even potentially re-traumatizing.
Levine [39] postulates that trauma exists not in the event or in the story of the event, but is stored
within the nervous system. The main principle of Levine's treatment approach is that the body has
a natural, innate, and miraculous capacity to heal once these reactions are understood and guided.
Although the Triune Brain has become a well-known model in contemporary psychology, it caused
several objections on the ground of the development and structure of the triune brain system [16,40].
First, there is evidence that the so-called paleomammalian and neomammalian brains appeared,
although in an undeveloped form, at much earlier stages of evolution than MacLean assumed.
Second, there are experimental data showing that the neocortex contains regions homological to the
so-called paleomammalian and reptilian brains. For instance, neuropsychological data give
evidence that the amygdala, which is a part of the limbic system, performs low-level emotion processing,
while the ventromedial cortex performs high-level emotion processing. This shows that emotions
exist, at least, on three levels: on the subconscious level of limbic system, on the conscious intuitive
level, and on the conscious rational level in the cortex. The first level utilizes direct affective
information, while the second and, to some extent, the third levels make use of cognitive emotional
information [1].
At the same time, the development of the system of Will demands the inclusion in this system of
regions that lie outside the R-complex (the reptilian brain in MacLean's theory). It
means that the centers of rational intelligence, emotion and will are not concentrated in three separate
regions of the brain but are highly distributed among several components of the brain. Thus, it is better
to call them not centers but systems of intelligence, emotion and will. This extension of functional
characteristics results in the Componential Triune Brain model described in [1], making this model
more adequate to experimental data and overcoming the pitfalls of the simple linear hierarchy
suggested by MacLean [16,40].
Thus, the Componential Triune Brain model consists of three basic systems of the brain:
- the System of Rational Intelligence (also called System of Reasoning) (SRI);
- the System of Emotions (or more generally, of Affective States) (SAS);
- the System of Will and Instinct (SWI).
All three systems of the brain are schemes in the sense of the schema theory, which is developed as
a specific direction of the brain theory [41-44]. According to this theory, brain schemes are
anatomically distributed in the brain and interact in a way of concurrent competition and coordination.
All these interactions are based on physical processes but have an inherent informational essence
related to a specific type of information. Information processes in the brain are more exactly reflected
by the theory of the triadic mental information than by the conventional information theory that deals
only with cognitive information.
In standard structuring of the brain, we also find these three systems. In the conventional setting
(cf., for example, [40,45-47]), the brain includes three components: the forebrain, midbrain,
and hindbrain.
The forebrain is the largest division of the brain involved in a wide range of activities that make
people human. The forebrain has a developed inner structure. It includes the cerebrum, which
consists of two cerebral hemispheres. The cerebrum is the nucleus of the System (Center) of
Rational Intelligence.
Under the cerebrum, is the diencephalon, which contains the thalamus and hypothalamus. The
thalamus is the main relay center between the medulla and the cerebrum. The hypothalamus is an
important control center for sex drive, pleasure, pain, hunger, thirst, blood pressure, body temperature,
and other visceral functions. The forebrain also contains the limbic system, which is directly linked to
the experience of emotion. The limbic system is the nucleus of the System (Center) of Emotions (or
more generally, of Affective States).
The midbrain is the smallest division; it makes connections with the other two divisions, the
forebrain and hindbrain, and alerts the forebrain to incoming sensations.
The hindbrain is involved in sleeping, waking, body movements and the control of vital reflexes
such as heart rate and blood pressure. The structures of the hindbrain include the pons, medulla and
cerebellum. The hindbrain is the nucleus of the System (Center) of Will and Instinct.
The System of Rational Intelligence realizes rational thinking. It includes both symbol and image
processing, which go on in different hemispheres of the brain [48]. The System of Emotions governs
sensibility and the emotional sphere of personality. The System of Will and Instincts directs behavior
and thinking. Two other systems influence behavior only through the will. For instance, a person can
know that it is necessary to help others, especially, those who are in need and deserve helping.
However, in many cases, this person does nothing without a will to help. In a similar way, we know
situations when an individual loves somebody but neither tells this nor explicitly shows this due to an
absence of a sufficient will.
It is necessary to remark that discussing the will of an individual we distinguish conscious will,
unconscious will, and instinct. All of them are controlled by the Center of Will and Instincts (SWI). In
addition, it is necessary to make distinctions between thoughts about intentions to do something and
the actual will to do this. Thoughts are generated in the Center of Rational Intelligence, while the will
dwells in the Center of Will and Instincts. In other words, thoughts and words about wills, wishes, and
intentions may be deceptive if they are not based on a Will.
Will is a direct internal injunction, and any kind of motivation [19], that is, the forces that act
on or within an organism to initiate and direct behavior, has to be transformed into a will in order to
cause the corresponding action. Will is one of the basic components of different models of personality.
For instance, in psychosynthesis, which is a holistic transpersonal psychology and philosophy of life
developed by Roberto Assagioli, a contemporary of both Freud and Jung, the Will is the basic
component of the self [18].
Assagioli explicated essential properties of the Will: it can be assertive, aggressive, and
controlling. In addition, there are three categories of Will: the accepting Will, yielding Will, and
dedicated Will.
Thus, we can see that the Componential Triune Brain model corresponds to the three basic
functions of the brain:
- Reasoning as symbolic information processing and information processing of images.
- Emotions and feelings as affective states of the brain.
- Will and instinct as forces of the psyche.
The Anatomic Triune Brain model of MacLean also corresponds to the basic functions of the brain
but misses one of them, namely, the Will.
Many other psychological theories and psychiatric techniques consider the Will as a primary factor
of human behavior and dispositions [19,49].
Often the Will is considered as a process that deliberates on what is to be done [50].
The Componential Triune Brain model is not only a necessary extension of the Triune Brain model
but it also continues a long-standing approach to brain stratification. As Smith [35] demonstrates,
triadic models of the brain and psyche have featured through two and a half millennia of Western
thought, starting with works of Pythagoras, Plato and Aristotle and receiving a modern airing in Paul
MacLean's Triune Brain model. Later, Herophilus and Erasistratus of Alexandria put together a more
anatomically informed triadic
theory, which was modified by Galen in the 2nd century and remained the prevailing paradigm for
nearly fifteen hundred years until it was overturned by the great thinkers of the Renaissance.
Nonetheless, the notion that the human neuropsychological system is somehow best thought of as
having a triadic (tripartite) structure has remained remarkably resilient and has reappeared time and
again in modern and early modern times. For instance, the Triune Brain model correlates well with
Freud's model of personality, which has the structure of the triad (Id, Ego, Super-ego). In the
correspondence between the Triune Brain model and Freud's model, the reptilian complex
corresponds to Id, the limbic system corresponds to Ego, and the neocortical complex corresponds to
Super-ego. In the context of triadic models, it is also possible to consider the triarchic theory of
intelligence developed by Robert Sternberg [51].
Taking each of these centers, SRI, SAS, and SWI, as a specific infological system, we find three
types of information. One is the conventional information that acts on the center of reasoning and of
other higher brain functions (SRI), which is situated in the neocortex. This information gives
knowledge, changes beliefs and generates ideas. Thus, it is natural to call it cognitive information.
Information of the second type acts on the system of emotions (SAS), which includes the
paleomammalian brain. It is natural to call this information by the name direct emotional information,
or direct affective information or emotive information. Information of the third type acts on the System
of Will and Instinct (SWI), which contains the reptilian brain. It is natural to call this information by
the name direct regulative or direct effective information.
Thus, anthropic information has three dimensions:
Cognitive information changes the content of the SRI, which includes the knowledge system
(thesaurus) and neocortex (neomammalian brain) as its carrier.
Direct emotional/affective information changes the content of the SAS, which includes the
paleomammalian brain (limbic system).
Direct regulative/effective information changes the content of the SWI, which includes the reptilian
brain or R-complex.
However, in general, emotions constitute only one part of affective states, which also include
moods, feelings, etc. That is why in general, direct affective information is more general than direct
emotional information. However, as there is no consensus on the differences between emotions and
affective states, these two types of information are used without differentiation.
Interactions between the basic brain systems imply dependencies between thinking, emotions, and
actions of people. Emphasizing some of these relations, psychologists build their theories and
psychotherapists develop their therapeutic approaches. Giving priority to the System of Rational
Intelligence (SRI), the so-called “cognitive revolution” has taken hold around the world. It influenced
both psychology, resulting in the emergence of cognitive psychology [52], and psychotherapy,
inspiring creation of cognitive therapy [53]. In psychology, the word cognitive often means thinking in
many contexts of contemporary life [54]. The cognitive therapeutic approach begins by using the
extremely powerful reasoning abilities of the human brain. This is important because our emotions and
our actions are not separate from our thoughts. They are all interrelated. Thinking (SRI) is the gateway
to our emotions (SAS)—and our emotions are the gateway to our actions through motivation and will
(SWI). This is only another way of saying that information from the System of Rational Intelligence
(SRI) goes to the System of Emotions (SAS)—and from it to the System of Will and Instincts (SWI)
that controls our actions. Consequently, the cognitive psychotherapeutic approach, which has been
successfully utilized for treating many mental disorders, gives additional supportive evidence for the
theory of the triune brain and behavior, as well as for the theory of the triadic mental information. The
latter explains that while going from the System of Rational Intelligence to the System of Emotions
and to the System of Will and Instincts, information is transformed from cognitive information, to
direct emotional/affective information to direct effective/regulative information. As a result, we have
the interaction of the personality components presented in Figure 1.
Figure 1. Interaction between components of personality (Reasoning, Emotions/Affective states, Will/Instinct, and Behavior).
We can see that cognitive information is one of the three basic types of anthropic information with
the corresponding infological system CIF(R), which contains (stores and processes) cognitive elements
or constituents, such as knowledge, data, ideas, beliefs, images, algorithms, procedures, scenarios,
values, goals, ideals, fantasies, abstractions, etc. Cognitive infological systems are very
important, especially for intelligent systems, as the majority of researchers believe that information is
intrinsically connected to cognition. This peculiarity is reflected in the following Cognitive
Transformation Principle.
Ontological Principle O2c (the Cognitive Transformation Principle). Cognitive information for a
system R is a capacity to cause changes in the cognitive infological system CIF(R) of the system R.
In the case of a cognitive infological system CIF(R), it might seem possible to give an exact
definition of cognitive information. However, cognitive sciences do not yet know all structural
elements involved in cognition. A straightforward definition specifies cognition as activity (process)
that gives knowledge. At the same time, we know that knowledge, as a rule, comes through data and
with data. So, data are also involved in cognition and thus, have to be included in cognitive infological
systems. In addition, cognitive processes employ such structures as ideas, images, texts, beliefs,
values, measures, problems, schemas, procedures, tasks, goals, etc. Thus, to comprehensively represent
cognitive information, it is imperative to include all such objects in cognitive infological systems.
As the cognitive infological system contains the knowledge of the system to which it belongs, cognitive
information is the source of knowledge changes.
There are also different types of cognitive information. One approach to classification is based on
the structures in the brain. Researchers found that a longitudinal fissure separates the human brain into
two distinct cerebral hemispheres, connected by the corpus callosum. The sides resemble each other
and each hemisphere's structure is generally mirrored by the other side. Yet despite the strong
similarities, the functions of each cortical hemisphere are different. As is known (cf., for example,
[55]), the left hemisphere operates with symbolic information representation, performing logical and
analytical functions, such as linear reasoning, numeric manipulation, language processing, and mental
arithmetic. It is also supposed that the left hemisphere works in the sequential mode on the level of
consciousness. The right hemisphere processes images and realizes intuitive, creative and synthesizing
functions of the brain, such as processing of visual and auditory stimuli, spatial manipulation,
facial perception, and artistic performance. It is also supposed that the right hemisphere works in the
parallel (concurrent) mode on the level of consciousness. In a similar way, Herrmann [36]
differentiates functioning of the left parts and right parts of both the cerebral and limbic regions of
the brain.
Taking each of these hemispheres as an infological system, we come to two types of
cognitive information.
Symbolic (discrete) cognitive information transforms (or can transform) the symbolic content of the
left hemisphere.
Holistic (continuous) cognitive information transforms (or can transform) the integral (gestalt)
content of the right hemisphere.
This classification of cognitive information is complementary to another classification, in which
one of the basic types of cognitive information is epistemic information, which can be both symbolic
and holistic, belonging to both the right and the left hemispheres. It is studied in Section 5.
4. Stratification of Knowledge Systems
There are different kinds of stratification.
In physical stratification, each stratum is a separate physical system. Any distributed database is
physically stratified.
In analytical stratification, each stratum is determined by a specific name (label) and all elements
from this stratum have this label (name). Knowledge base stratification used for handling inconsistent
knowledge bases [56,57], for constructing models of a knowledge base [58] and for merging multiple
knowledge bases [59,60] is analytical. The same is true of logic stratification used for formalization of
commonsense reasoning [61]. In this section, we are mostly interested in analytical stratification based
on knowledge classification. Different classes of knowledge form corresponding strata of
knowledge systems.
There are different principles of knowledge classification, which allow us to build several types of
knowledge system stratifications.
Time is an important characteristic of knowledge, giving different stratifications.
The temporal stratification.
1. The past stratum of knowledge consists of knowledge obtained/accepted in the past.
2. The current stratum of knowledge consists of the actual (used now) knowledge.
3. The future stratum of knowledge consists of knowledge that is expected to be obtained in the future.
For instance, the knowledge “the Earth is flat” is past knowledge, while the knowledge “the Earth is
round” is current knowledge.
The past stratum of knowledge consists of three substrata: the forgotten past knowledge, outdated
but preserved past knowledge and still actual past knowledge.
The current stratum of knowledge consists of three substrata: the disappearing current knowledge,
consolidated current knowledge, and emergent current knowledge.
The future stratum of knowledge consists of three substrata: the tentative/potential future
knowledge, realizable future knowledge and emergent future knowledge.
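For illustration, the temporal stratification above can be sketched as follows; the data layout and labels are assumptions made for the example, not part of the theory.

from enum import Enum
from typing import Dict, Optional, Set, Tuple

class TemporalStratum(Enum):
    PAST = "past"
    CURRENT = "current"
    FUTURE = "future"

# Substrata labeled as in the text, e.g., ("past", "outdated but preserved").
TemporalKnowledge = Dict[Tuple[TemporalStratum, str], Set[str]]

knowledge: TemporalKnowledge = {
    (TemporalStratum.PAST, "outdated but preserved"): {"the Earth is flat"},
    (TemporalStratum.CURRENT, "consolidated"): {"the Earth is round"},
    (TemporalStratum.FUTURE, "tentative/potential"): set(),
}

def stratum_of(item: str, kb: TemporalKnowledge) -> Optional[TemporalStratum]:
    # Find the temporal stratum in which a knowledge item is currently placed.
    for (stratum, _substratum), items in kb.items():
        if item in items:
            return stratum
    return None

print(stratum_of("the Earth is round", knowledge))   # TemporalStratum.CURRENT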
More precise temporal stratifications are used in temporal knowledge bases and databases. A temporal
knowledge/database is a database with built-in time aspects. In particular, it supports a temporal
knowledge/data model and has a temporal version of the query language [62,63]. Temporal
knowledge/data stored in a temporal knowledge/database are different from the knowledge/data stored
in non-temporal knowledge/database in that a time coordinate is attached to the knowledge/data. This
is different from the conventional knowledge/data, which are usually considered to be valid now. Past
and future knowledge/data are not stored. Usually past knowledge/data are modified, overwritten with
new (updated) knowledge/data or deleted to achieve their temporal relevancy. Future knowledge/data
are not considered because it is assumed that we do not receive information about the future.
There are many complexity measures of algorithms, methods and procedures. Taking a complexity
measure C, it is possible to partition all algorithms (methods or/and procedures) into separate classes
that have different complexity measures. Each such partition induces a corresponding stratification of
knowledge with respect to such knowledge characteristics as accessibility, inference, and generation,
which are specific forms of knowledge acquisition. Here are some examples of such stratifications.
The accessibility stratification.
1. Directly accessible knowledge.
2. n-step accessible knowledge.
Another stratification is based on complexity of knowledge inference.
The inference stratification.
1. Directly implied knowledge.
2. n-step inferable knowledge.
One more stratification is based on complexity of knowledge generation.
The generation stratification.
1. Directly generable/computable knowledge.
2. n-step generable/computable knowledge.
Steps in generation, inference and access may be determined by:
- Time slicing when each step is assigned some period of time for realization.
- Elementary operations.
For instance, it is possible to assume that knowledge acquisition is direct if it demands less than 3
seconds. The first step of knowledge acquisition can be estimated as an interval from 3 seconds to 30
seconds. The second step of knowledge acquisition can be estimated as an interval from 30 seconds to
1 minute. The third step of knowledge acquisition can be estimated as an interval from 1 minute to 3
minutes and so on.
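When steps are determined by elementary operations, the inference stratification can be sketched as iterated application of inference rules; the rule format and function names below are assumptions made for illustration, with stratum n holding the knowledge first reachable after n rule applications.

from typing import Dict, FrozenSet, List, Set, Tuple

Rule = Tuple[FrozenSet[str], str]   # (premises, conclusion)

def inference_strata(base: Set[str], rules: List[Rule], max_steps: int) -> Dict[int, Set[str]]:
    strata: Dict[int, Set[str]] = {0: set(base)}   # stratum 0: directly implied (given) knowledge
    known = set(base)
    for n in range(1, max_steps + 1):
        new = {c for (premises, c) in rules if premises <= known and c not in known}
        if not new:
            break
        strata[n] = new                            # stratum n: n-step inferable knowledge
        known |= new
    return strata

rules: List[Rule] = [(frozenset({"p", "p -> q"}), "q"), (frozenset({"q", "q -> r"}), "r")]
print(inference_strata({"p", "p -> q", "q -> r"}, rules, 5))
# {0: given items, 1: {'q'}, 2: {'r'}}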
It is also possible to measure complexity, e.g., effort in generation, by the power of algorithms [64].
In this case, we have an algorithmic ladder, which consists of classes of algorithms with increasing
computing power.
The traditional algorithmic ladders have one of the following forms:
(1) Finite automata, deterministic pushdown automata, pushdown automata, and
Turing machines.
(2) Regular, or linear grammars, context-free grammars, context-sensitive grammars, and
unrestricted, or phrase-structure grammars.
New achievements of the theory of algorithms and computation extend these ladders:
(1’) Finite automata, deterministic pushdown automata, pushdown automata, Turing machines,
inductive Turing machines [65], and infinite-time Turing machines [66].
(2’) Regular, or linear grammars, context-free grammars, context-sensitive grammars,
unrestricted, or phrase-structure grammars, grammars with prohibition [67], and Boolean
grammars [68].
Inductive Turing machines give an example of an algorithmic ladder. Namely, the n-th stratum of the
inductive algorithmic ladder consists of inductive Turing machines with structured memory that
have order n but do not have order n + 1 [65]. It is also possible to build an algorithmic ladder using
inductive (or limit) Turing machines that have a structured program (rules for computation) or
structured operating devices (heads) [65].
5. Stratified M-spaces and Information Operators
Let us consider a universal set or multiset W of knowledge items (units). It is possible to take the set
W_C of elementary knowledge units mathematically modeled in [1,69] as a universal set (multiset) W,
obtaining a reasonable formalization of the concept of a knowledge state. Another possibility for W is
realized by the set (multiset) W_L of propositions and/or predicates from some logical language L. This
logical approach was adopted in works of Bar-Hillel and Carnap [2], Hintikka [3-5] and some other
authors. Shreider [7] interpreted knowledge items as texts in a thesaurus. Many researchers employ
schemas as knowledge items in the brain (cf., for example, [41-44]). One more possibility for W is the
set, or more exactly, a multiset, W_S of situations possible in a world U, which are taken as knowledge
items or knowledge units (cf., for example, [70,71]).
The set W is called universal because we assume that the following axiom is true.
MA1 (the Internal Representation Axiom). For any cognitive system (agent) A, knowledge states
K_Ai of A are subsets (submultisets) of the set (multiset) W.
It is possible to interpret W as the base of all knowledge that agents are able to have about
their environment.
Another aspect of universality of the set (multiset) W may be in the possibility to describe all
possible (existing) worlds utilizing knowledge only from W. For instance, when W is the set (multiset)
W_L of propositions and/or predicates from some logical language L, then it is possible to build all
descriptions of all possible worlds by combining elements from W_L. This possibility is reflected in the
following axiom.
MA2 (the External Representation Axiom). For any environment (situation) D, there is a subset
(submultiset) W_D of the set (multiset) W that contains all accessible knowledge about D.
Taking these two axioms as the foundation, we develop a theory of cognitive systems/agents called
the theory of M-spaces.
Definition 5.1 [1]. a) Subsets of W are called Mizzaro spaces.
b) Submultisets of W are called Mizzaro multispaces.
In some cases, only specific subsets (submultisets) of W are used in the theory. For instance, if
elements of W are propositions and the model satisfies conditions of classical logical calculi, then only
consistent subsets of propositions are acceptable as Mizzaro spaces. When, in addition, all deducible
propositions are also included in such a logical Mizzaro space, then Mizzaro spaces are components of
a logical variety [72].
Taking a knowledge system K, we model the states of K by Mizzaro spaces (Mizzaro multispaces),
i.e., each knowledge state is represented by a Mizzaro space (Mizzaro multispace) K_Ki. The whole
knowledge system K is modeled by an M-space.
In modeling knowledge systems and information processes, we consider two structures, sets and
multisets, because in the classical setting it is possible to consider only sets, which makes the
model simpler. However, many real cognitive systems contain several copies of the same element. For
instance, the same element of knowledge can be stored in different parts of a computer memory or of
the brain. This makes utilization of multisets necessary.
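A simple way to sketch Mizzaro multispaces, under the assumption that multiplicities are all that matters, is with counted collections, e.g., Python's Counter, where the same knowledge item may be stored in several copies.

from collections import Counter

# A Mizzaro multispace records how many copies of each knowledge item are stored.
state = Counter({"the Earth is round": 2, "2 + 2 = 4": 1})

def replicate(ks: Counter, item: str) -> Counter:
    # Store one more copy of an already present knowledge item.
    new = ks.copy()
    if new[item] > 0:
        new[item] += 1
    return new

def delete_copy(ks: Counter, item: str) -> Counter:
    # Remove a single stored copy; the item disappears when its count reaches zero.
    new = ks.copy()
    if new[item] > 0:
        new[item] -= 1
    return +new          # unary plus drops items with zero count

print(replicate(state, "the Earth is round"))
print(delete_copy(state, "2 + 2 = 4"))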
Definition 5.2. A type of structures is a system of conditions (axioms) that all these structures, i.e.,
sets with relations, satisfy.
To define an M-space, we consider a type θ of structures in Mizzaro spaces (Mizzaro multispaces).
Definition 5.3. An M-space (M-multispace) M is a structure of the form

M = {KS_M; OS_M}

where KS_M consists of Mizzaro spaces (Mizzaro multispaces) K_i with the structure of the type θ, and
OS_M is a system of information operators acting on Mizzaro spaces (Mizzaro multispaces) from KS_M.
Thus, KS_M = {K_i | i ∈ I} and OS_M = {A_t | t ∈ T}.
Example 5.1. It is possible to represent a logical variety or a prevariety V [72] as an M-space where
KS_M consists of the components of V and operators from OS_M apply mappings f_i: A_i → L and
g_i: T_i → L (i ∈ I), which form connections between components of the variety (prevariety).
Definition 5.4. a) The set KS_M is called the state space of the M-space M.
b) The set U_M = ∪_{i∈I} K_i, where K_i ∈ KS_M, is called the universe of the M-space M.
c) The system OS_M is called the operating system of the M-space M.
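Read algebraically, an M-space can be sketched as a state space together with an operating system of information operators; the following Python fragment is an illustrative model with assumed names, not a formal part of the theory.

from dataclasses import dataclass
from typing import Callable, Dict, FrozenSet

MizzaroSpace = FrozenSet[str]
InformationOperator = Callable[[MizzaroSpace], MizzaroSpace]

@dataclass
class MSpace:
    state_space: Dict[str, MizzaroSpace]              # KS_M = {K_i | i in I}
    operating_system: Dict[str, InformationOperator]  # OS_M = {A_t | t in T}

    def universe(self) -> MizzaroSpace:
        # U_M: the union of all Mizzaro spaces in the state space.
        u: MizzaroSpace = frozenset()
        for k in self.state_space.values():
            u |= k
        return u

    def apply(self, operator_name: str, state_name: str) -> None:
        # A unary operator maps one Mizzaro space into another (possibly the same) one.
        op = self.operating_system[operator_name]
        self.state_space[state_name] = op(self.state_space[state_name])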
The simplest structure of Mizzaro spaces K_i of the type θ is the structure of a set, and the simplest
structure of Mizzaro multispaces K_i of the type θ is the structure of a multiset. However, Mizzaro
spaces K_i can be logical calculi, linear spaces or groups.
In the algebraic context, each M-space M is a universal algebra [73] with the support KS_M and the
system of operations OS_M. In this paper, we consider only unary Mizzaro spaces (unary Mizzaro
multispaces) in which each information operator maps one Mizzaro space (Mizzaro multispace) K_i into
another (may be the same) Mizzaro space (Mizzaro multispace) K_j.
Information operators from OS_M are global epistemic information operators in KS_M. When an
operator acts only on one Mizzaro space (Mizzaro multispace), then it is a local epistemic information
operator. Local epistemic information operators in non-structured Mizzaro spaces are studied in
[1,10,11,13].
Note that it is possible to consider any system that contains a knowledge system as a knowledge
system itself. Thus, any intelligent agent is a knowledge system.
There are three basic types of epistemic information operators: content, bond and mixed operators.
Definition 5.5. A content epistemic information operator acts on knowledge items in a
knowledge state.
For instance, all information operators studied in [1,10,11,13] are content epistemic
information operators.
Definition 5.6. A bond epistemic information operator acts on connections (bonds or relations)
between knowledge items in a knowledge state.
Such operators as interpretation and reinterpretation of information/knowledge items are bond
epistemic information operators.
Definition 5.7. A mixed epistemic information operator acts both on knowledge items and on
connections (bonds or relations) between knowledge items in a knowledge state.
Operators of logical inference, such as rules of deduction, are mixed epistemic information
operators because they add new knowledge items in the form of propositions or/and predicates and
establish relations of inferability/deducibility between propositions or/and predicates.
Here we are mostly interested in content epistemic information operators, which we simply call
epistemic information operators.
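The distinction between the three kinds of operators can be sketched as follows, assuming a toy knowledge state that keeps items and bonds (relations) separately; all names are ours.

from dataclasses import dataclass, field
from typing import Set, Tuple

@dataclass
class ToyKnowledgeState:
    items: Set[str] = field(default_factory=set)
    bonds: Set[Tuple[str, str, str]] = field(default_factory=set)  # (relation, item1, item2)

def content_add(ks: ToyKnowledgeState, item: str) -> None:
    # Content operator: acts on knowledge items only.
    ks.items.add(item)

def bond_link(ks: ToyKnowledgeState, relation: str, a: str, b: str) -> None:
    # Bond operator: acts on connections between knowledge items only.
    ks.bonds.add((relation, a, b))

def mixed_infer(ks: ToyKnowledgeState, premise: str, conclusion: str) -> None:
    # Mixed operator: adds a new item and establishes a deducibility bond.
    ks.items.add(conclusion)
    ks.bonds.add(("implies", premise, conclusion))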
To correctly model a stratified knowledge system, the modeling M-space also has to be stratified.
Definition 5.8. a) An M-space M = {KS_M; OS_M} is stratified if there is a set J and each Mizzaro
space (Mizzaro multispace) K_i from KS_M has the form

K_i = ∪_{j∈J} K_ij

b) A stratification of an M-space M = {KS_M; OS_M} is strict if for each Mizzaro space (Mizzaro
multispace) K_i from KS_M, K_ij ∩ K_ik = ∅ when j ≠ k, j, k ∈ J.
c) An M-space M = {KS_M; OS_M} is linearly stratified if each Mizzaro space (Mizzaro multispace)
K_i from KS_M has the form

K_i = ∪_{n=1}^{∞} K_in

in the case when the stratification is infinite and the form

K_i = ∪_{n=1}^{m} K_in

in the case when the stratification is finite (cf. Figure 2).
Linear stratification means that the set of stratum indices J is finite or countable and
linearly ordered.
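A stratified Mizzaro space can be sketched as a family of strata indexed by J; the helpers below (names assumed) recover K_i as the union of its strata and check whether a stratification is strict, i.e., whether the strata are pairwise disjoint.

from typing import Dict, FrozenSet, Hashable

Stratum = FrozenSet[str]

def whole_space(strata: Dict[Hashable, Stratum]) -> Stratum:
    # K_i is the union of its strata K_ij, j in J.
    k: Stratum = frozenset()
    for s in strata.values():
        k |= s
    return k

def is_strict(strata: Dict[Hashable, Stratum]) -> bool:
    # Strict stratification: K_ij and K_ik are disjoint whenever j != k.
    keys = list(strata)
    return all(strata[j].isdisjoint(strata[k])
               for a, j in enumerate(keys) for k in keys[a + 1:])

memory = {"sensory": frozenset({"flash of red"}),
          "short-term": frozenset({"a phone number"}),
          "long-term": frozenset({"the Earth is round"})}
print(is_strict(memory), sorted(whole_space(memory)))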
Figure 2. A finite M-space stratification with the linear topology: strata K_1, K_2, K_3, ..., K_{n-1}, K_n arranged in a chain.
Figure 3. A finite M-space stratification with the cyclic topology: strata K_1, K_2, K_3 arranged in a cycle.
There are M-space stratifications with a non-linear topology. For instance, the stratification in
Figure 3 has a cyclic topology. Important cases of M-space stratifications have structures of a tree or
a forest.
Example 5.2. People, as well as computers, have many kinds of memory. It is even supposed that
each part of the brain has several types of memory agencies that work in somewhat different ways, to
suit particular purposes [23]. It is possible to consider each of these memory agencies as a separate
system and to study differences between information that changes each type of memory. This might
help to understand the interplay between stability and flexibility of the mind, in general, and memory,
in particular.
Psychologists differentiate three types of human memory: sensory memory, short-term memory, and
long-term memory. This is the most important memory stratification and the one best documented by
scientific research. However, memory researchers do not employ uniform terminology. Sensory memory is
also known as sensory register, sensory store, sensory information storage, eidetic memory and echoic
memory. Short- and long-term memories are also referred to as primary memory and secondary
memory, correspondingly. Each component of memory differs with respect to its function,
the form of information held, the length of time information is retained, and the amount of
information-handling capacity.
Thus, human memory has three basic strata. As a result, all knowledge in the memory of an
individual is also stratified into three components: knowledge/data in the sensory memory, knowledge
in the short-term memory, and knowledge in the long-term memory of this individual (cf. Figure 4).
Additional stratification of human memory as a knowledge space induces additional stratification of
knowledge.
Figure 4. The human memory hierarchy: sensory memory, short-term memory, and long-term memory.
Sensory memory acts as a buffer for stimuli received through the senses, which are then filtered and
passed from sensory memory into short-term memory by attention.
In turn, sensory memory is also stratified by different sensory channels. There is iconic memory
for visual stimuli, echoic memory for aural stimuli and haptic memory for touch.
Long-term memory is naturally stratified. The most popular stratification divides it into two parts:
episodic memory and semantic memory. Episodic memory stores knowledge of events and experiences
in a serial form. In contrast to this, semantic memory is a structured record of facts, concepts and skills
that people have acquired. The information in semantic memory is derived from that in the episodic
memory of the same individual.
Neuroscientists distinguish three main activities related to long-term memory: storage, deletion and
retrieval. These operations are modeled by epistemic information operators. Storage is modeled by the
epistemic information operator REPL. In information storage, information from sensory memory at
first goes to short-term memory and then is stored in long-term memory, usually by the process called
rehearsal. Rehearsal is the repeated exposure to a stimulus of a knowledge/data portion, which
transfers it into long-term memory. Deletion of a knowledge/data portion is modeled by the epistemic
information operator DEL and is mainly caused by decay and interference. Emotional factors
essentially affect the functioning of long-term memory. According to contemporary research, there are two types of
information retrieval: recall and recognition. In knowledge/data recall, the information is reproduced
from memory. Recall is modeled by the epistemic information operator COPY. Knowledge/data
recognition is based on information that this knowledge/data portion has been seen before and is
assisted by the provision of retrieval cues that enable better access to long-term memory.
Recognition is modeled by the epistemic information operator GEN.
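As a rough illustration of how these memory operations fit the operator formalism, the sketch below models the three memory strata as sets and writes rehearsal, forgetting and recall as the operators REPL, DEL and COPY named above; the stratum names, function names and sample item are assumptions of the sketch, and in this set-based simplification a replica coincides with a copy, whereas in the paper a replica is only an equivalent item.

```python
# A simplified sketch of the three memory strata and the operators that the text
# associates with storage (REPL), deletion (DEL) and recall (COPY).
# Stratum names, function names and the sample item are illustrative assumptions.

memory = {"sensory": {"sound of a word"},
          "short_term": set(),
          "long_term": set()}

def repl(state, item, source, target):
    # REPL: replicate a knowledge/data portion into another stratum (rehearsal/storage).
    # In this set-based sketch a replica is the item itself, so REPL and COPY coincide.
    if item in state[source]:
        state[target].add(item)

def delete(state, item, stratum):
    # DEL: delete a knowledge/data portion (decay, interference)
    state[stratum].discard(item)

def copy_item(state, item, source, target):
    # COPY: reproduce a stored item (recall), keeping the original in place
    if item in state[source]:
        state[target].add(item)

# Storage: the item passes from sensory to short-term and then to long-term memory
repl(memory, "sound of a word", "sensory", "short_term")
repl(memory, "sound of a word", "short_term", "long_term")
delete(memory, "sound of a word", "sensory")                 # the sensory trace decays
copy_item(memory, "sound of a word", "long_term", "short_term")  # recall
print(memory)
```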
Scientists also use another stratification of the human memory: personal memory, semantic
memory, perceptual memory, and skill memory, which includes motor skill memory, cognitive skill
memory, and rote linguistic skill memory.
Example 5.3. The computer memory is also a complex system of diverse components and
processes. Memory of a computer includes three basic components: random access memory
(RAM), read-only memory (ROM), and secondary storage. While RAM forgets everything whenever
the computer is turned off and ROM cannot learn anything new, secondary storage devices allow the
computer to record information for as long a period of time as we want and to change it whenever we want.
Now the following devices are utilized for long-term computer memory: magnetic tapes and
corresponding drives, magnetic disks and corresponding drives, flash memory storage devices and
corresponding drives, and optical disks and corresponding drives.
Computer memory is intrinsically stratified by the hierarchy in which levels are distinguished by the
response time, complexity, and capacity. The overall goal of using a memory stratification is to obtain
the highest possible average access performance, while minimizing the cost of the entire
memory system.
Often four major memory levels are separated:
1. Internal memory – Processor registers and cache.
2. Main memory – the system RAM and controller cards.
3. On-line mass storage – Secondary storage.
4. Off-line bulk storage – Tertiary and Off-line storage.
At the same time, other experts use another stratification of computer memory:
1. Processor registers have the fastest possible access (in just a few cycles) and usually store tens
of kilobytes.
2. Level 2 (L2) cache usually stores 512 KiB or more.
3. Level 3 (L3) cache usually stores 2048 KiB or more.
4. Main memory, access to which may take hundreds of cycles, but which usually stores multiple
gigabytes.
5. Disk storage, access to which may take millions of cycles of latency if not cached, but which
usually is very large.
6. Tertiary storage, access to which may take several seconds of latency and which can be huge.
Such memory stratifications are linear, reflecting the access time with the fast CPU registers at the
top and the slow hard drive at the bottom.
Another, although related, stratification is induced by different electronic devices, which include
CPU registers, on-die SRAM caches, external caches, DRAM, paging systems, and virtual memory or
swap space on a hard drive. These devices are called RAM (random access memory) by many
developers, even though the various subsystems can have very different access times, violating the
original concept behind the random access term in RAM. RAM consists of two strata: dynamic
random access memory, which requires the stored information to be periodically re-read and
re-written, or refreshed, in order not to lose it, and static memory, which never needs to be refreshed
as long as power is applied, although it loses its content if power is removed.
Usually each stratum in RAM is also stratified. For instance, in DRAM, strata are defined by the
row, column, bank, rank, and channel.
In addition, computer memory is also stratified by data storage technologies, such as
semiconductor, magnetic, and optical technologies. In modern computers, primary storage almost
exclusively consists of dynamic volatile semiconductor memory, which uses semiconductor-based
integrated circuits to store information. Since the turn of the century, a type of non-volatile
semiconductor memory known as flash memory has steadily gained share as off-line storage in various
advanced electronic devices and computers.
Magnetic storage, which is non-volatile, uses different types of magnetization on a magnetically
coated surface to store information. The information is accessed using one or more read/write heads
which may contain one or more recording transducers. A read/write head only covers a part of the
surface so that the head or medium or both must be moved relative to one another in order to access data.
In modern computers, there are the following kinds of magnetic storage devices:
Magnetic disks, such as floppy disks, used for off-line storage, and the hard disk drive, used
for secondary storage
Magnetic tape data storage, used for tertiary and off-line storage
At the beginning of the computer era, magnetic storage was also used for primary storage in the form of
magnetic drum memory, core memory, core rope memory, thin-film memory, twistor memory or bubble
memory, while magnetic tapes were often used for secondary storage.
Another popular type of storage is optical discs, which store information in deformities on the
surface of a circular disc; the information is read by illuminating the surface with a laser diode and
observing the reflection. In modern computers, there are the following kinds of optical storage devices:
CD, CD-ROM, DVD, BD-ROM: Read only storage, used for mass distribution of digital
information (music, video, computer programs)
CD-R, DVD-R, DVD+R, BD-R: Write once storage, used for tertiary and off-line storage
CD-RW, DVD-RW, DVD+RW, DVD-RAM, BD-RE: Slow write, fast read storage, used for
tertiary and off-line storage
Ultra Density Optical or UDO is similar in capacity to BD-R or BD-RE and is slow write, fast
read storage used for tertiary and off-line storage.
In magneto-optical disc storage, the information is read optically and written into the magnetic state
on a ferromagnetic surface by combining magnetic and optical methods. It is usually used for tertiary
and off-line storage.
Paper data storage, typically in the form of paper tape or punched cards, has long been used to
store information for automatic processing, particularly before electronic computers were invented.
There are also such memory devices as the vacuum tube memory, electro-acoustic memory, optical
tapes, phase-change memory, holographic data storage, and molecular memory, which stores
information optically inside crystals or photopolymers.
All these memory devices and components determine the corresponding stratification of knowledge
stored in computers, e.g., of knowledge bases.
Example 5.4. Stratification is a popular technique in knowledge base theory and practice. For
instance, Hunter and Liu [59] introduce knowledge base stratification to solve the problem of merging
multiple knowledge bases. Benferhat and Baida [56] use stratified first order logic for access control in
knowledge bases. Benferhat and Garcia [57] employ stratification for handling inconsistent knowledge
bases. Lassez, et al, [58] show how stratification can be used as a tool in the interactive model-building
process. Namely it is possible to reduce the computational complexity of the process by the use of
stratification, which limits consistency checking to minimal strata.
Definition 5.9. a) An M-space M is a subspace of an M-space H if the state space KS_M is a
substructure of the state space KS_H and the operational system OS_M is a substructure of the
operational system OS_H.
b) If an M-space M is a subspace of an M-space H, then H is called a superspace of M.
In particular, the stratification of the knowledge space KS_M is induced by the stratification of the
knowledge space KS_H.
Example 5.5. If a structured M-space H models the group memory of a group G of several people,
then a structured M-space M that models the memory of one individual from this group G is a
subspace of H.
Subspaces of M-spaces and M-multispaces represent subsystems of knowledge systems. For
instance, in large knowledge systems, such as a scientific theory, it is possible to separate the
subsystem of denotational knowledge and the subsystem of operational knowledge.
Definition 5.10. If X is a structure, i.e., a set/multiset with relations, then X is the set of all elements
from X and X is the multiset of all elements from X, while RelX is the set of all relations from X.
In such a way, ignoring the M-space stratification, it is possible to represent structured knowledge
systems by uniform M-spaces, in which all knowledge states are sets or multisets. In this setting,
content epistemic information operators act on elements from sets K or multisets K, while bond
epistemic information operators act on elements from RelK.
Definition 5.11. a) A knowledge system (agent) A is called locally finite if any knowledge state of A
is finite.
b) A knowledge system (agent) A is called finite if it has only a finite number of knowledge states
and any knowledge state of A is finite.
c) An M-space M is called locally finite if each K from KS_M contains only a finite number of
knowledge items.
d) An M-space M is called finite if it has only a finite number of Mizzaro spaces (Mizzaro
multispaces) K each of which is also finite.
It looks like it might be sufficient to consider only finite or, at least, locally finite agents. However,
if knowledge is represented by logical statements and it is assumed (as, for example, in the theory of
Bar-Hillel and Carnap [2]) that any knowledge system contains all logical consequences of all its
elements, then an agent with such a knowledge system is infinite. In information algebras, portions of
information are represented by closed subsets of sentences from a logical language L [74].
However, in conventional logics, sets closed with respect to such information operators as deduction
are infinite because any sentence p implies p ∨ q for any sentence q from L, which is, as a rule,
infinite (cf., for example, [75]). Thus, in the context of classical logic and information algebras, any
portion of information has infinitely many representations. Consequently, such a portion generates a
system with an infinite number of knowledge items.
Lemma 5.1. An M-space M is finite if and only if it has a finite universe.
Stratification of the knowledge system and the corresponding M-space allows defining specific
classes of epistemic information operators.
Definition 5.12. An epistemic information operator A is called stratified if for any j ∈ J, there is k ∈ J
such that for any K_i from KS_M, we have A(K_ij) ⊆ K_ik.
Stratified information operators preserve the structure, i.e., stratification, of knowledge states. Note
that adding and deletion operators are intrinsically stratified.
Definition 5.13. a) An epistemic information operator A is called closed if for any j ∈ J and for any
K_i from KS_M, we have A(K_ij) ⊆ K_ij.
b) An epistemic information operator A is called closed in a Mizzaro space K_i from KS_M if A(K_i) ⊆ K_i.
Lemma 5.2. Any closed epistemic information operator A is stratified.
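For operators that can be evaluated on finitely many finite states, the conditions of Definitions 5.12 and 5.13 can be tested directly. The Python sketch below does this under the assumption that A(K_ij) is read as applying A to the state whose only nonempty stratum is K_ij; this reading, the dict-of-sets representation and all function names are assumptions of the sketch.

```python
# A checkable sketch of Definitions 5.12 and 5.13 for operators given as functions
# on stratified states (dicts mapping a stratum index to a set of items).
# A(K_ij) is operationalized as applying A to the state whose only nonempty stratum
# is K_ij; this reading and all names are assumptions of the sketch.

def image_strata(A, K, j):
    # Apply A to stratum j of K taken in isolation and report which strata
    # of the result are nonempty.
    isolated = {i: (set(K[i]) if i == j else set()) for i in K}
    result = A(isolated)
    return {i for i, items in result.items() if items}

def is_stratified(A, states, J):
    # For every j there must be a single target stratum k receiving the image
    # of stratum j in all considered states.
    for j in J:
        targets = set().union(*(image_strata(A, K, j) for K in states))
        if len(targets) > 1:
            return False
    return True

def is_closed(A, states, J):
    # A closed operator keeps the image of every stratum inside that stratum.
    return all(image_strata(A, K, j) <= {j} for K in states for j in J)

# Example: an operator that relabels every item and leaves it in its stratum
# is closed and hence also stratified (cf. Lemma 5.2).
def tag(K):
    return {j: {item + "'" for item in items} for j, items in K.items()}

states = [{1: {"p"}, 2: {"q"}}, {1: {"r"}, 2: set()}]
print(is_closed(tag, states, J={1, 2}))      # True
print(is_stratified(tag, states, J={1, 2}))  # True
```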
Definition 5.14. a) A stratified epistemic information operator A in a linearly stratified M-space M
is called monotone (antitone) if for any n ∈ N, there is k ∈ N such that k ≥ n (k ≤ n) and for any K_i
from KS_M, we have A(K_in) ⊆ K_ik.
b) A stratified epistemic information operator A in a linearly stratified M-space M is called strictly
monotone (strictly antitone) if for any n ∈ N, there is k ∈ N such that k > n (k < n) and for any K_i from
KS_M, we have A(K_in) ⊆ K_ik.
Definitions imply the following property of epistemic information operators.
Lemma 5.3. Any strictly monotone (strictly antitone) epistemic information operator A is monotone
(antitone).
Lemma 5.4. In a finite linearly stratified M-space M, there are no strictly monotone and strictly
antitone information operators.
Definition 5.15. An epistemic information operator A is called contracting if there is k ∈ J such that
for any j ∈ J and any K_i from KS_M, we have A(K_ij) ⊆ K_ik.
Definitions imply the following result.
Lemma 5.5. Any contracting epistemic information operator A is stratified.
There are six types of basic epistemic operations: adding, deleting, moving, replicating, generating,
and transforming knowledge, and six types of corresponding basic epistemic information operators:
the adding AD, deletion DEL, moving MV, replication REPL, generation GEN and transformation TR
epistemic information operators.
Definition 5.16. A transformation epistemic information operator TR takes a group of knowledge
items (possibly a single item) from the current knowledge state and transforms it into another group of
knowledge items (possibly into a single item).
Definition 5.17. A generation epistemic information operator GEN takes a group of knowledge items
(possibly a single item) from the current knowledge state and generates another group of knowledge items
(possibly a single item).
The difference between transformation and generation is that in generation the initial group of
knowledge items is preserved, while in transformation it is not preserved.
Lemma 5.4. AD is equal to GEN with the empty set of the initial knowledge items.
Definition 5.18. A moving epistemic information operator MV moves a knowledge item from one
stratum into another one.
Definition 5.19. a) A replica of a knowledge item is another knowledge item equivalent to the
initial one.
b) A replication epistemic information operator REPL makes a replica of a knowledge item and
adds it to the current knowledge state.
Example 5.5. Let us consider logical knowledge representation in which knowledge items are
propositions. Then, according to the laws of logic, there are equivalent propositions. For instance, taking the
proposition (1) "B implies A", we have the equivalent propositions (2) "A follows from B", (3) "If B, then
A", and (4) "A is a consequence of B". All of them are replicas of one another although they are
not copies.
If the proposition (1) belongs to the stratum K_1, then its replication to the stratum K_2 can introduce
either proposition (1) or proposition (2) or proposition (3) to the stratum K_2, while its copying to the
stratum K_2 can introduce only proposition (1) to the stratum K_2.
An important special case of a replication epistemic information operator is a copying epistemic
information operator COPY, which makes a copy of a knowledge item and adds it to the current
knowledge state.
Another important special case of a replication epistemic information operator is a restricted
replication epistemic information operator REPL_0, which replicates a knowledge item and adds it only
to a stratum of the current knowledge state that does not have the same replica. One of its special cases
is a restricted copying epistemic information operator COPY_0, which makes a copy of a knowledge
item and adds it only to a stratum of the current knowledge state that does not have the same replica.
Operators REPL_0 and COPY_0 are used in stratified M-spaces so as not to turn these spaces into
stratified M-multispaces.
Lemma 5.5. The operator COPY_0 can copy a knowledge item only to a different stratum, i.e., if
a ∈ K_i and COPY_0(a) ∈ K_j, then i ≠ j.
Indeed, if this condition is violated, then the initial M-space is converted into an M-multispace.
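A minimal Python sketch of the basic operators on a set-based stratified state follows; it implements AD, DEL, MV, COPY and the restricted COPY_0, treating equivalence of replicas simply as equality of items, which is a simplifying assumption of the sketch.

```python
# A minimal sketch of basic epistemic information operators on a stratified state
# represented as a dict mapping stratum indices to sets of knowledge items.
# Equivalence of replicas is modeled here simply as equality of items (an assumption).

def ad(K, item, j):
    # AD: add a knowledge item to stratum j
    K[j].add(item)

def delete(K, item, j):
    # DEL: delete a knowledge item from stratum j
    K[j].discard(item)

def mv(K, item, i, j):
    # MV: move a knowledge item from stratum i to stratum j
    if item in K[i]:
        K[i].discard(item)
        K[j].add(item)

def copy(K, item, i, j):
    # COPY: add a copy of an item from stratum i to stratum j, keeping the original
    if item in K[i]:
        K[j].add(item)

def copy0(K, item, i, j):
    # COPY_0: copy only into a different stratum that does not already contain the item,
    # so the state remains an M-space rather than an M-multispace (cf. Lemma 5.5)
    if item in K[i] and item not in K[j] and i != j:
        K[j].add(item)

K = {1: {"B implies A"}, 2: set()}
copy0(K, "B implies A", 1, 2)   # succeeds: stratum 2 did not contain the item
copy0(K, "B implies A", 1, 2)   # does nothing now
mv(K, "B implies A", 2, 1)      # moving back leaves a single occurrence in stratum 1
print(K)                        # {1: {'B implies A'}, 2: set()}
```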
Complex information operations and operators are studied in [76].
Definition 5.20. An epistemic information operator C is called the sequential composition of an
epistemic information operator A with an epistemic information operator B if C(x) is defined and equal
to B(A(x)) when: 1) A(x) is defined and belongs to the domain of B; 2) B(A(x)) is defined. Otherwise,
C gives no result being applied to x, i.e., C(x) = *.
Taking the sequential composition of an epistemic information operator A with itself, we obtain the
sequential powers A^n of the operator A.
In the general case, the sequential composition of epistemic information operators is not
commutative in M-spaces, as the following example demonstrates.
Example 5.1. Let us consider a structured M-space M = {KS_M; OS_M} where KS_M = ⋃_{i∈I} KS_Mi.
In this space, the operator MV_ija moves an element a from the stratum KS_Mi to the stratum KS_Mj and
does not change other elements from KS_M. Taking the sequential composition of such operators, we
have

MV_ija ∘ MV_ika = MV_ija and MV_ika ∘ MV_ija = MV_ika

if i ≠ j, k ≠ j, and i ≠ k. Thus, the operators MV_ija do not commute with one another.
At the same time, all these operators are idempotents, i.e., MV_ija ∘ MV_ija = MV_ija.
It is necessary to remark that in a structured M-multispace M with an infinite number of copies of
each element a in each stratum KS_Mi, the operators MV_ija and MV_ika commute with one another.
This demonstrates the difference between M-spaces and M-multispaces.
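The non-commutativity and idempotence in Example 5.1 can be replayed concretely. The sketch below encodes MV_ija as a function on a dict-of-sets state and composes two such operators in both orders; the encoding and the helper names are assumptions of the sketch.

```python
# A sketch demonstrating that MV_ija and MV_ika do not commute in an M-space
# (set-based strata), while each of them is idempotent.
from copy import deepcopy

def mv(i, j, a):
    # Returns the operator MV_ija acting on a stratified state (dict of sets).
    def op(K):
        K = deepcopy(K)
        if a in K[i]:
            K[i].discard(a)
            K[j].add(a)
        return K
    return op

def compose(A, B):
    # Sequential composition A ∘ B in the sense of Definition 5.20: A acts first.
    return lambda K: B(A(K))

K = {1: {"a"}, 2: set(), 3: set()}
mv_12 = mv(1, 2, "a")   # MV_ija with i=1, j=2
mv_13 = mv(1, 3, "a")   # MV_ika with i=1, k=3

print(compose(mv_12, mv_13)(K))  # "a" ends up in stratum 2: equals MV_12(K)
print(compose(mv_13, mv_12)(K))  # "a" ends up in stratum 3: equals MV_13(K)
print(compose(mv_12, mv_12)(K) == mv_12(K))  # True: MV_12 is idempotent
```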
Proposition 5.1. If A and B are closed (closed in a Mizzaro space K_i) operators, then their
sequential composition A ∘ B is also a closed (closed in a Mizzaro space K_i) operator.
Indeed, if A and B are closed epistemic information operators in a structured M-multispace M, then
for any j ∈ J and for any K_i from KS_M, we have A(K_ij) ⊆ K_ij and B(K_ij) ⊆ K_ij. Thus,
(A ∘ B)(K_ij) = B(A(K_ij)) ⊆ B(K_ij) ⊆ K_ij.
For operators closed in a Mizzaro space K_i, the proof is similar.
Proposition 5.2. If A and B are contracting operators, then their sequential composition A ∘ B is also
a contracting operator.
Proof is similar to the proof of Proposition 5.1.
Proposition 5.3. If A and B are stratified operators, then their sequential composition A ∘ B is also a
stratified operator.
Proof is similar to the proof of Proposition 5.1.
Proposition 5.4. If A and B are (strictly) monotone [antitone] operators, then their sequential
composition A ∘ B is also a (strictly) monotone [antitone] operator.
Indeed, if A and B are monotone epistemic information operators in a structured M-space M, then
for any K_i from KS_M, we have A(K_ij) ⊆ K_ik with k ≥ j and B(K_ik) ⊆ K_ih with h ≥ k. Thus,
(A ∘ B)(K_ij) = B(A(K_ij)) ⊆ B(K_ik) ⊆ K_ih with h ≥ j.
Considerations for strictly monotone, antitone and strictly antitone epistemic information operators
are similar.
Let us consider an M-space M with a finite linear stratification.
Proposition 5.5. For any monotone and any antitone epistemic information operator A, there is a
number n such that the sequential power A^n is also a closed epistemic information operator.
Indeed, if A is a monotone epistemic information operator in a structured M-space M, then at
each step it either increases the index of a stratum or keeps the image of a stratum in the same
stratum. If the second case holds for all strata of M, then A itself is a closed epistemic information
operator. Otherwise, A can increase the index of a stratum only for a finite number of steps because
there are only a finite number of strata in M. Thus, after some number of repetitions, the image of each
stratum remains in the same stratum. Taking the largest number of such steps, we obtain the necessary
number n.
Note that n cannot be larger than the number of strata in M.
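The finiteness argument behind Proposition 5.5 can be replayed on a small example. The sketch below iterates a monotone operator that promotes every item one stratum up (capped at the top stratum) and reports the first power after which a further application no longer moves anything between strata; the operator and the test state are assumptions of the sketch, and the stabilization test is an operational stand-in for closedness in the sense of the proof.

```python
# A sketch of the stabilization argument in the proof of Proposition 5.5:
# a monotone operator can push items to higher strata only finitely many times,
# so some power of it no longer moves anything between strata.
# The operator (promote each item one stratum up, capped at the top stratum)
# and the test state are assumptions of the sketch.

def promote(K, top):
    result = {j: set() for j in K}
    for j, items in K.items():
        result[min(j + 1, top)] |= items
    return result

def stabilization_power(op, K, limit):
    # Smallest n <= limit with op(op^n(K)) == op^n(K), i.e., one more application
    # of op keeps the image of every stratum in the same stratum.
    state = K
    for n in range(1, limit + 1):
        state = op(state)
        if op(state) == state:
            return n
    return None

top = 3
K = {1: {"p"}, 2: {"q"}, 3: set()}
op = lambda state: promote(state, top)
print(stabilization_power(op, K, limit=top))  # 2, which does not exceed the number of strata
```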
Let us explore relations between basic epistemic information operators.
Definition 5.21 [77]. Two operators A and B are functionally equivalent if they have the same
definability domain D and A(x) = B(x) for any element x from D.
Proposition 5.6. A transformation epistemic information operator TR is functionally equivalent to
the sequential composition of a deletion epistemic information operator DEL and adding epistemic
information operator AD that act in the same stratum of the M-space.
Indeed, if TR takes items a_1, a_2, ..., a_n from KS_M and transforms them into b_1, b_2, ..., b_m, then it is
possible to achieve the same result by deleting a_1, a_2, ..., a_n and adding b_1, b_2, ..., b_m to the
corresponding stratum of KS_M.
Proposition 5.7. A moving epistemic information operator MV is functionally equivalent to
deletion of a knowledge item in one stratum and adding the same knowledge item to another stratum.
Proposition 5.8. A replication epistemic information operator REPL is functionally equivalent to
adding an equivalent knowledge item to the corresponding stratum.
Proposition 5.9. For any M-space M, there is a superspace H, in which all deletion and adding
epistemic information operators DEL and AD in M are functionally equivalent to moving epistemic
information operators MV in H.
Proof. To build a superspace H with the necessary properties, we add one more stratum E called the
external stratum to the initial M-space M. In addition, we assume that E contains all elements from the
universal set (multiset) W and each element has infinitely many copies in E. In this case, any deletion
of an element a from a state K from M is equivalent to moving the same element a to the stratum E. In
a similar way, any addition of an element a to a state K from M is equivalent to moving the same
element a from the stratum E to the state K.
Proposition is proved.
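The external-stratum construction in this proof can be sketched in Python: an extra stratum E is appended to the state, and AD and DEL are expressed as moves into and out of E. Modeling E as a finite set that already holds the needed items is a simplification of the sketch; in the proof, E contains infinitely many copies of every element of the universe.

```python
# A sketch of the external-stratum construction from the proof of Proposition 5.9:
# with an extra stratum E, adding and deleting become moves to and from E.
# Modeling E as a finite set that already holds the needed items is a simplification;
# in the proof E contains infinitely many copies of every element of the universe.

E = "E"  # index of the external stratum

def mv(K, item, i, j):
    if item in K[i]:
        K[i].discard(item)
        K[j].add(item)

def ad_via_mv(K, item, j):
    # AD in the original M-space = MV from the external stratum E into stratum j
    mv(K, item, E, j)

def del_via_mv(K, item, j):
    # DEL in the original M-space = MV from stratum j into the external stratum E
    mv(K, item, j, E)

H = {1: set(), 2: {"q"}, E: {"p"}}  # superspace: the original strata plus E
ad_via_mv(H, "p", 1)    # behaves as adding "p" to stratum 1
del_via_mv(H, "q", 2)   # behaves as deleting "q" from stratum 2
print({j: H[j] for j in (1, 2)})  # {1: {'p'}, 2: set()}
```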
Proposition 5.10. A moving epistemic information operator MV can be (functionally) simulated by
the copying COPY and deletion DEL epistemic information operators.
Indeed, instead of moving a knowledge item a from a stratum K_i of a state K to a stratum K'_j of a
state K', it is possible to copy a from K_i to K'_j and then to delete this element from K_i.
Proposition 5.11. A generation epistemic information operator GEN is functionally equivalent to
the sequential composition of a transformation epistemic information operator TR and adding
epistemic information operator AD that act in the same stratum of the M-space.
Proof is similar to the proof of Proposition 5.10.
Proposition 5.12. A transformation epistemic information operator TR is functionally equivalent to
the sequential composition of a generation epistemic information operator GEN and deletion epistemic
information operator DEL that act in the same stratum of the M-space.
Proof is similar to the proof of Proposition 5.10.
Definition 5.22. A system B of epistemic information operators is an operator basis of an M-space
M if any A from OS_M is a composition of elements from B.
Operator bases can be useful in many situations. For instance, knowing properties of operators from
such a basis and properties of compositions, we can derive properties of other operators.
Assuming that all operators in an M-space M are compositions of basic epistemic information
operators, we have the following results.
Proposition 5.13. a) {AD, DEL} is an operator basis of an arbitrary (stratified) M-space M.
b) {TR, MV} is an operator basis of an arbitrary (stratified) M-space M.
c) {TR} is an operator basis of an arbitrary (i.e., non-stratified) M-space M.
Proof is based on Propositions 5.7 - 5.12.
Proposition 5.14. a) In a stratified M-space M with the external stratum, {REPL_0, DEL} is an
operator basis.
b) In a stratified M-space M with the external stratum, {MV} is an operator basis.
c) In a stratified M-multispace M with the external stratum, {REPL, DEL} is an operator basis.
Proof is based on Propositions 5.7 - 5.12.
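As a small illustration of Proposition 5.13(a), the sketch below expresses TR and MV as compositions of AD and DEL on a stratified state, mirroring Propositions 5.6 and 5.7; the concrete items and helper names are assumptions of the sketch.

```python
# A sketch mirroring Propositions 5.6 and 5.7: TR and MV are expressed as
# compositions of the basic operators AD and DEL, illustrating that {AD, DEL}
# can serve as an operator basis (Proposition 5.13(a)).
# Items and helper names are assumptions of the sketch.

def ad(K, item, j):
    K[j].add(item)

def delete(K, item, j):
    K[j].discard(item)

def tr_via_ad_del(K, old_items, new_items, j):
    # TR: delete the old group of items and add the new group in the same stratum
    for a in old_items:
        delete(K, a, j)
    for b in new_items:
        ad(K, b, j)

def mv_via_ad_del(K, item, i, j):
    # MV: delete the item in stratum i and add the same item to stratum j
    if item in K[i]:
        delete(K, item, i)
        ad(K, item, j)

K = {1: {"a1", "a2"}, 2: set()}
tr_via_ad_del(K, {"a1", "a2"}, {"b1"}, 1)  # transforms {a1, a2} into {b1} in stratum 1
mv_via_ad_del(K, "b1", 1, 2)               # moves b1 from stratum 1 to stratum 2
print(K)  # {1: set(), 2: {'b1'}}
```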
6. Conclusions
Based on the principles of the general theory of information, epistemic information is singled out as
a kind of anthropic information and modeled by the algebraic construction of M-spaces. M-spaces
represent information dynamics by information operators acting in knowledge spaces. The main
emphasis of this study is on stratified knowledge spaces and algebras of epistemic information
operators in such spaces.
The obtained results bring us to the following problems.
It is possible to consider not only knowledge but also beliefs as basic components of cognitive
infological systems and to call information that acts on such systems plausible epistemic
information.
Problem 1. Mathematically describe and study plausible epistemic information.
Problem 2. Study other types of M-space stratifications and operators in these spaces.
Problem 3. Study M-spaces in which knowledge items are elements of logics, e.g., propositions or
predicates, and whose stratification includes the structure of the corresponding logic, e.g., the
propositional logic or the first-order predicate logic.
Problem 4. Study categories of M-spaces and functors between these categories.
Problem 5. Study operations with M-spaces.
In this paper, we studied only content epistemic information operators, while bond epistemic
information operators, which act on connections and relations between knowledge items, are also very
important.
Problem 6. Study bond epistemic information operators.
In this paper, we studied only sequential composition of epistemic information operators, while
other types of composition are also very important.
Problem 7. Study other compositions of epistemic information operators.
In [1,13], information explications of epistemic information operators are studied in uniform
Mizzaro spaces.
Problem 8. Explore information explications of epistemic information operators in stratified
M-spaces and Mizzaro spaces.
References
1. Burgin, M. Theory of Information: Fundamentality, Diversity and Unification; World Scientific:
Singapore, 2010.
2. Bar-Hillel, Y.; Carnap, R. Semantic information. Br. J. Philos. Sci. 1958, 4, 147-157.
3. Hintikka, J. The Varieties of Information and Scientific Explanation. In Logic, Methodology and
Philosophy of Science III; van Rootselaar, B., Staal, J.F., Eds.; North-Holland Publishing
Company: Amsterdam, The Netherlands, 1968; pp. 311-331.
4. Hintikka, J. Surface Information and Depth Information. In Information and Inference; Synthese
Library, Humanities Press: New York, NY, USA, 1970; pp. 263-297.
5. Hintikka, J. On Defining Information. Ajatus 1971, 33, 271–273.
6. Israel, D.; Perry, J. What is Information? In Information, Language and Cognition; University of
British Columbia Press: Vancouver, BC, Canada, 1990; pp. 1-19.
7. Shreider, Y.A. On the semantic characteristics of information. Inf. Storage Retr. 1965, 2, 221-233.
8. MacKay, D.M. Information, Mechanism and Meaning; The MIT Press: Cambridge, MA, USA,
1969.
9. Brookes, B.C. The foundations of information science, pt. 1, Philosophical aspects. J. Inf. Sci.
1980, 2, 125-133.
10. Mizzaro, S. On the Foundations of Information Retrieval. In Proceedings of the Atti del Congresso
Nazionale (AICA’96), Roma, Italy, 25–27 September 1996; pp. 363-386.
11. Mizzaro, S. How many relevances in information retrieval? Interact. Comput. 1998, 10, 303–320.
12. Gackowski, Z.J. What to teach business students in mis courses about data and information. Issues
Informing Sci. Inf. Technol. 2004, 1, 845-867.
13. Mizzaro, S. Towards a Theory of Epistemic Information. In Information Modelling and
Knowledge Bases; IOS Press: Amsterdam, The Netherlands, 2001; Volume 12, pp. 1-20.
14. Baer, N.; Zeidman, B. Measuring Software Evolution with Changing Lines of Code. In Proceedings
of the 24th International Conference on Computers and Their Applications (CATA-2009), New
Orleans, LA, USA, 8–10 April 2009; pp. 264-170.
15. Burgin, M. Fundamental Structures of Knowledge and Information; Academy for Information
Sciences: Kiev, Ukraine, 1997; (in Russian).
16. Gardner, R.; Cory, G.A. The Evolutionary Neuroethology of Paul MacLean: Convergences and
Frontiers; Praeger: New York, NY, USA, 2002.
17. Russell, P. The Brain Book; Penguin Books: London, UK, 1992.
18. Assagioli, R. Psychosynthesis: A Collection of Basic Writings; Penguin Books: London, UK, 1993.
19. Kenny, A. Action, Emotion and Will; Routledge: London, UK, 2003.
20. Burgin, M. Information and transformation. Transformation 1998/1999, 1, 48-53, (in Polish).
21. Von Weizsäcker, C.F. Die Einheit der Natur; Deutscher Taschenbuch Verlag: Munich, Germany,
1974.
22. Von Weizsäcker, C.F. Aufbau der Physik; Hanser: Munich, Germany, 1985; (English translation:
The Structure of Physics; Springer: Berlin, Germany, Heidelberg, Germany and New York, NY,
USA, 2006).
23. Minsky, M. The Society of Mind; Simon and Schuster: New York, NY, USA, 1986.
24. Capurro, R.; Fleissner, P.; Hofkirchner, W. Is a Unified Theory of Information Feasible? In
The Quest for a unified theory of information; Routledge: London, UK, 1999; pp. 9-30.
25. Melik-Gaikazyan, I.V. Information Processes and Reality; Nauka: Moscow, Russia, 1997; (in
Russian, English summary).
26. Jung, C. On Psychic Energy. In On the Nature of the Psyche; Princeton University Press:
Princeton, NJ, USA, 1928/1960.
27. Von Grot, N. Die Begriffe der Seele und der psychischen Energie in der Psychologie. Arch. für
Syst. Philos. 1898, IV.
28. Colby, K. Energy and Structure in Psychoanalysis; Ronald: New York, NY, USA, 1955.
29. Freud, S. The Standard Edition of the Complete Psychological Works of Sigmund Freud, Hogarth
and the Institute of Psycho-Analysis: London, UK, 1954.
30. Lieberman, H.R. Cognitive methods for assessing mental energy. Nutr. Neurosci. 2007, 10,
229-242.
31. O’Connor, P.J. Mental energy: Assessing the mood dimension. Nutr. Rev. 2006, 64, S7-S9.
32. Dahl, H. The panel on “Psychoanalytic Theory of the Instinctual Drives in Relation to Recent
Developments”. J. Am. Psychoanal. Assoc. 1968, XVI, 613-637.
33. MacLean, P.D. A Triune Concept of the Brain and Behavior; University of Toronto Press:
Toronto, ON, Canada, 1973.
34. MacLean, P.D. On the Origin and Progressive Evolution of the Triune Brain. In Primate Brain
Evolution; Plenum Press: New York, NY, USA, 1982.
35. Smith, C.U.M. The triune brain in antiquity: Plato, Aristotle, Erasistratus. J. Hist. Neurosci. 2010,
19, 1-14.
36. Herrmann, N. The Creative Brain; Brain Books: Lake Lure, NC, USA, 1990.
37. Cory, G.A. The Reciprocal Modular Brain in Economics and Politics: Shaping the Rational and
Moral Basis of Organization, Exchange, and Choice; Kluwer Academic/Plenum Publishers: New
York, NY, USA, 1999.
38. Zametkin, A.J. Cerebral glucose metabolism in adults with hyperactivity of childhood onset.
N. Engl. J. Med. 1990, 323, 1361-1366.
39. Levine, P.A. Waking the Tiger: Healing Trauma; North Atlantic Books: Berkeley, CA, USA,
1999.
40. Patton, P. One world, many minds: Intelligence in the animal kingdom. Sci. Am. 2008, 19, 72-79.
41. Anderson, R.C. The Notion of Schemata and the Educational Enterprise. In Schooling and the
Acquisition of Knowledge; Anderson, R.C., Spiro, R.J., Montague, W.E., Eds.; Lawrence Erlbaum:
Hillsdale, NJ, USA, 1977.
42. Arbib, M. Schema Theory. In The Encyclopedia of AI; Wiley-Interscience: New York, NY, USA,
1992; pp. 1427-1443.
43. Armbruster, B. Schema theory and the design of content-area textbooks. Educ. Psychol. 1996, 21,
253-276.
44. Burgin, M. Mathematical Schema Theory for Modeling in Business and Industry. In Proceedings
of the 2006 Spring Simulation Multi Conference (Spring Sim ’06), Huntsville, AL, USA, 2006; pp.
229-234.
45. Baars, B.J.; Gage, N.M. Cognition, Brain, and Consciousness: Introduction to Cognitive
Neuroscience; Elsevier Science/Academic Press: Amsterdam, The Netherlands, 2007.
46. Carter, R. Mapping the Brain; Phoenix Books: Junction, VT, USA, 2003.
47. DeArmond, S.J.; Fusco, M.M.; Dewey, M. Structure of the Human Brain: A Photographic Atlas;
Oxford University Press: New York, NY, USA, 1989.
48. Dehaene, S. Reading in the Brain: The Science and Evolution of a Human Invention; Viking: New
York, NY, USA, 2009.
49. Berrios G.E.; Gili M. Will and its disorders. A conceptual history. Hist. Psychiatry 1995, 6,
87-104.
50. Spence, S.A. Between will and action. J. Neurol. Neurosurg. Psychiatry 2000, 69,
doi:10.1136/jnnp.69.5.702.
51. Sternberg, R.J. Beyond IQ: A Triarchic Theory of Intelligence; Cambridge University Press:
Cambridge, UK, 1985.
52. Neisser, U. Cognition and Reality: Principles and Implications of Cognitive Psychology; W.H.
Freeman and Company: San Fransisco, CA, USA, 1976.
53. Beck, J. Cognitive Therapy: Basics and Beyond; Guilford: New York, NY, USA, 1995.
54. Freeman, A.; DeWolf, R. The 10 Dumbest Mistakes Smart People Make and How to Avoid Them;
Harper Collins Publ.: New York, NY, USA, 1992.
55. Kolb, B.; Whishaw, I.Q. Fundamentals of Human Neuropsychology; W.H. Freeman and Co.: New
York, NY, USA, 1990.
56. Benferhat, S.; Baida, R. A stratified first order logic approach for access control. Int. J. Int. Syst.
2004, 19, 817-836.
57. Benferhat, S.; Garcia, L. Handling locally stratified inconsistent knowledge bases. Stud. Log.
2002, 70, 77-104.
58. Lassez, C.; McAloon, K.; Port, G.S. Stratification and knowledge base management. J. Symb.
Comput. 1989, 7, 509-522.
59. Hunter, A.; Liu, W.R. Knowledge Base Stratification and Merging Based on Degree of Support. In
Symbolic and Quantitative Approaches to Reasoning with Uncertainty; Springer-Verlag: Berlin,
Germany, 2009; pp. 383-395.
60. Yue, A.; Liu, W.; Hunter, A. Approaches to Constructing a Stratified Merged Knowledge Base. In
Proceedings of the 9th European Conference on Symbolic and Quantitative Approaches to
Reasoning with Uncertainty (ECSQARU’07), Hammamet, Tunisia, 31 October–2 November 2007;
pp. 54-65.
61. Cholewinski, P. Stratified default logic. Proc. Comp. Sci. Logic’94 1994, 933, 456-470.
62. Burgin, M. Structural Organization of Temporal Databases. In Proceedings of the 17th International
Conference on Software Engineering and Data Engineering (SEDE-2008), ISCA, Los Angeles,
CA, USA, 30 June–2 July 2008; pp. 68-73.
63. Snodgrass, R.T.; Jensen, C.S. Developing Time-Oriented Database Applications in SQL; Morgan
Kaufmann: San Francisco, CA, USA, 1999.
64. Burgin, M. Measuring Power of Algorithms, Programs, and Automata. In Artificial Intelligence
and Computer Science; Nova Science Publishers: New York, NY, USA, 2005; pp. 1-61.
65. Burgin, M. Super-Recursive Algorithms; Springer: Heidelberg, Germany, 2005.
66. Hamkins, J.D.; Lewis, A. Infinite time turing machines. J. Symb. Log. 2000, 65, 567-604.
67. Burgin, M. Grammars with Prohibition and Human-Computer Interaction. In Proceedings of the
Business and Industry Simulation Symposium, Society for Modeling and Simulation International,
San Diego, CA, USA, 3–7 April 2005b; pp. 143-147.
68. Okhotin, A. Boolean grammars. Inf. Comput. 2004, 194, 19-48.
69. Burgin, M. Data, information, and knowledge. Information 2004, 7, 47-57.
70. Barwise, J.; Perry, J. Situations and Attitudes; MIT Press: Cambridge, MA, USA, 1983.
71. Dretske, F.I. Knowledge and the Flow of Information; Basil Blackwell: Oxford, UK, 1981.
72. Burgin, M. Logical Tools for Program Integration and Interoperability. In Proceedings of the
IASTED International Conference on Software Engineering and Applications, MIT, CA, USA,
9–11 November 2004a; pp. 743-748.
73. Cohn, P.M. Universal Algebra; Harper&Row, Publ.: New York, NY, USA, 1965.
74. Kohlas, J.; Stärk, R.F. Information algebras and consequence operators. Log. Universalis 2007, 1,
139-165.
75. Shoenfield, J.R. Mathematical Logic; Addison-Wesley: Reading, MA, USA, 1967.
76. Burgin, M. Information algebras. Control Syst. Mach. 1997, 6, 5-16, (in Russian).
77. Burgin, M. Functional equivalence of operators and parallel computations. Program. Comput.
Softw. 1980, 6, 283-294.
© 2011 by the author; licensee MDPI, Basel, Switzerland. This article is an open access article
distributed under the terms and conditions of the Creative Commons Attribution license
(http://creativecommons.org/licenses/by/3.0/).