Teaching Theory of Science to Computer Science Students
Gordana Dodig-Crnkovic, Ivica Crnkovic
Department of Computer Science and Engineering, Mälardalen University,
S-721 67 Västerås, Sweden, Tel: +46 21 15 17 25, Fax: +46 21 10 14 60
gordana.dodig-crnkovic@mdh.se, ivica.crnkovic@mdh.se
Abstract
An ideal Science for the existing Theory of Science (Popper, Carnap, Kuhn, Chalmers) is Physics. Not many modern Sciences conform to that ideal, however. Philosophy of Science (Theory of Science) as it stands today is not of much help when trying to understand, for example, Computer Science. There is an urgent need to broaden the Theory of Science perspective in order to match the present situation within the field, as well as to help its further development.
Computer Science has its basis in Logic and Mathematics, and in many cases its theoretical and experimental research methods follow the patterns of the classical scientific fields of Logic/Mathematics and the Natural Sciences. On the other hand, computer modeling and simulation, which is specific to the discipline and rapidly growing in importance, applied to computers as well as to other scientific and artistic fields, hardly corresponds to the traditional definition of the scientific method. The situation gets even more complicated in the field of Intelligent Systems (Artificial Intelligence, AI).
This paper addresses the need for a paradigm shift within the Theory of Science. It shows that it is essential for students of Computer Science not only to acquire the concepts of the Theory of Science within its conventional domain, but also to widen the perspective and see the field in the context of other scientific traditions.
Introduction
It is not as obvious as the name might suggest that Computer Science qualifies as a "Science" in the sense in which the traditional Theory of Science [1-4] defines the term. Computer Science (CS) is a young discipline and, from the outset, necessarily very different from Mathematics, Physics and similar "classic" Sciences, all of which have their origins in the Philosophy of ancient Greece. Emerging in modern times (the first electronic digital computer was built in the 1940s), CS necessarily has other, already existing Sciences in its background. Computer Science draws its foundations from a wide variety of disciplines [5], [6], [7]. The study of Computer Science consequently requires utilizing concepts from many different fields. Computer Science integrates theory and practice, abstraction (the general) and design (the specific).
Historical development has led to the emergence of a large number of Sciences that in our time communicate more and more, not only because the means of communication are becoming very convenient and effective, but also because of a growing need for a holistic view of our world, which is presently strongly dominated by reductionism.
1. What Is Computer Science?
According to the present view, Computer Science can be situated in the broader context of Computing in the following way (see Figure 1) [8]. The discipline of Computing thus encompasses Computer Science, Computer Engineering, Software Engineering and Information Systems.
German and French use the terms "Informatik" and "Informatique", respectively, to denote Computer Science. It is interesting to observe that the British term "Computer Science" has an empirical orientation, while the corresponding German and French term "Informatics" has an abstract orientation. This difference in terminology appears to support the view that the nineteenth-century characters of British empiricism and continental abstraction have persisted.
Figure 1 Computer Science within the field of Computing
The view that information is the central idea of Computer Science is both scientifically and sociologically indicative. Scientifically, it suggests a view of Computer Science as a generalization of information theory that is concerned not only with the transmission of information but also with its transformation and interpretation. Sociologically, it suggests an analogy between the industrial revolution, which was concerned with the utilization of energy, and the computer revolution, which is concerned with the utilization of information.
It is argued in [9] that Computer Science was dominated by empirical research paradigms in the 1950s, by mathematical research paradigms in the 1960s, and by engineering-oriented paradigms beginning in the 1970s.
The diversity of research paradigms within Computer Science may be responsible for the divergences of opinion concerning the nature of Computer Science research.
Sub-areas of Computer Science
Dijkstra said that to call the field "Computer Science" is like calling surgery "Knife Science". He noted that departments of Computer Science are exposed to permanent pressure to overemphasize the "Computer" and to underemphasize the "Science". This tendency matches the inclination to appreciate the significance of computers solely in their capacity as tools.
According to [8], the sub-areas of the Computer Science curricula are:
1. Discrete Structures
2. Programming Fundamentals
3. Algorithms and Complexity
4. Programming Languages
5. Architecture and Organization
6. Operating Systems
7. Net-Centric Computing
8. Human-Computer Interaction
9. Graphics and Visual Computing
10. Intelligent Systems
11. Information Management
12. Software Engineering
13. Social and Professional Issues
14. Computational Science and Numerical Methods
As Computer Science develops, the list is expanding. Fields 7, 8 and 9, for example, are new compared to the list of the predecessor report [17] (the Denning report).
2. Scientific Methods of Computer Science
The Traditional View
What is specific to CS is that its objects of investigation are artifacts (computer-related phenomena) that change concurrently with the development of the theories describing them and simultaneously with the growing practical experience in their usage.
A computer from the 1940s is not the same as a computer from the 1970s, which in its turn is different from a computer in 2002. Even the task of defining what a computer is in the year 2002 is far from trivial!
With respect to methodology, Computer Science can be divided into Theoretical, Experimental and Simulation CS.
2.1 Theoretical Computer Science
Concerning Theoretical Computer Science, which adheres to the traditions of Logic and Mathematics, we can conclude that it follows the classical methodology of building theories as logical systems, with stringent definitions of objects (axioms) and operations (rules) for deriving/proving theorems.
Logic is important for computing not only because it forms the basis of every programming language, or because of its investigation of the limits of automatic calculation, but also because of its insight that strings of symbols (also encoded as numbers) can be interpreted both as data and as programs.
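As a minimal illustrative sketch (in Python, with an arbitrarily chosen function as the data), the following snippet treats one and the same string of symbols first as data to be measured and transformed, and then as a program to be executed:

```python
# Toy illustration of the "programs are data" insight: the same string of
# symbols can be stored and transformed like any other data, and can also be
# interpreted (compiled and executed) as a program.

source = "def square(n):\n    return n * n\n"

# Treated as data: a string of characters that we can measure and transform.
print(len(source))           # number of symbols in the string
print(source.upper()[:12])   # an arbitrary transformation of the data

# Treated as a program: the same string is compiled and executed.
namespace = {}
exec(source, namespace)
print(namespace["square"](12))   # prints 144
```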
Theory creates methodologies, Logics and various semantic models to help design programs, to reason about programs, to prove their correctness, and to guide the design of new programming languages.
However, CS theories do not compete with each other as to which better explains the fundamental nature of information. Nor are new theories developed to reconcile theory with experimental results that reveal unexplained anomalies or new, unexpected phenomena, as in Physics. In Computer Science there is no history of critical experiments that decide between the validity of various theories, as there is in the physical Sciences. The basic, underlying mathematical model of digital computing is not seriously challenged by theory or experiments.
In Computer Science, results of theory are judged by the insights they reveal about the mathematical nature of various models of computing and/or by their utility to the practice of computing and their ease of application. Do the models conceptualize and capture the aspects computer scientists are interested in, do they yield insights into design problems, and do they aid reasoning and communication about relevant problems?
The design and analysis of algorithms is a central topic in Theoretical Computer Science. Methods are developed for algorithm design, measures are defined for various computational resources, tradeoffs between different resources are explored, and upper and lower resource bounds are proved for the solutions of various problems. In the design and analysis of algorithms, measures of performance are well defined, and results can be compared quite easily in some of these measures (which may or may not fully reflect their performance on typical problems). Experiments with algorithms are used to test implementations and compare their "practical" performance on the subsets of problems considered important.
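A minimal sketch of such an experiment, in Python with arbitrarily chosen input sizes (timings will of course vary between machines), compares the measured running time of an O(n²) insertion sort with that of the built-in O(n log n) sort:

```python
import random
import time

def insertion_sort(items):
    """Textbook O(n^2) insertion sort, used here only as a measurement subject."""
    a = list(items)
    for i in range(1, len(a)):
        key, j = a[i], i - 1
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key
    return a

random.seed(0)
for n in (1_000, 2_000, 4_000):
    data = [random.random() for _ in range(n)]
    t0 = time.perf_counter(); insertion_sort(data); t1 = time.perf_counter()
    t2 = time.perf_counter(); sorted(data);         t3 = time.perf_counter()
    # Doubling n roughly quadruples the insertion-sort time, as the O(n^2)
    # analysis predicts, while the built-in sort grows far more slowly.
    print(f"n={n:5d}  insertion sort: {t1 - t0:7.3f}s   built-in sort: {t3 - t2:7.4f}s")
```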
Theoretical Computer Science seeks to understand both the limits of computation and the power of computational paradigms. Theoreticians also develop general approaches to problem solving. Some of the main methodological themes in Theoretical Computer Science (inherited from Mathematics) are iteration, induction and recursion.
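A tiny illustrative example shows how the three themes meet even in the simplest problem, computing the sum 1 + 2 + ... + n:

```python
# Iteration, recursion and induction on one small example: computing 1 + 2 + ... + n.

def sum_iterative(n: int) -> int:
    total = 0
    for k in range(1, n + 1):   # iteration: repeat one step n times
        total += k
    return total

def sum_recursive(n: int) -> int:
    # Recursion: the definition refers to a smaller instance of the same problem.
    # Its correctness is proved by induction on n: it holds for the base case
    # n == 0, and if it holds for n - 1 it holds for n.
    return 0 if n == 0 else n + sum_recursive(n - 1)

# Both agree with the closed formula n(n + 1)/2.
assert sum_iterative(100) == sum_recursive(100) == 100 * 101 // 2
```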
One of Theoretical Computer Science's most important functions is the distillation of knowledge acquired through conceptualization, modeling and analysis. Knowledge is accumulating so rapidly that it must be collected, condensed and structured in order to be useful.
2.2 Experimental Computer Science
The subject of inquiry in the field of Computer Science is information, rather than the energy or matter characteristic of the classical Sciences. However, this makes no difference to the applicability of the traditional scientific method. To understand the nature of information processes, computer scientists must observe phenomena, formulate explanations and theories, and test them.
Experiments are used both for theory testing and for exploration [10], [11], [12]. Experiments test theoretical predictions against reality. A scientific community gradually accepts a theory if the known facts within its domain can be deduced from the theory, if it has withstood experimental tests, and if it correctly predicts new phenomena. The conditio sine qua non of any experiment is repeatability/reproducibility. Repeatability ensures that results can be checked independently and thus raises confidence in the results.
Nevertheless, there is always an element of uncertainty in experiments and tests as well: to paraphrase Edsger Dijkstra, an experiment can only show the presence of bugs (flaws) in a theory, not their absence. Scientists are keenly aware of this uncertainty and are therefore ready to disqualify a theory if contradicting evidence shows up.
A good example of theory falsification in Computer Science is the famous Knight and Leveson experiment [13], which analyzed the failure probabilities of multiversion programs. Conventional theory predicted that the failure probability of a multiversion program was the product of the failure probabilities of the individual versions. However, John Knight and Nancy Leveson observed that real multiversion programs had significantly higher failure probabilities. In fact, the experiment falsified the basic assumption of the conventional theory, namely that faults in different program versions are statistically independent.
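A toy Monte Carlo sketch (in Python, with invented failure probabilities, and not a reconstruction of the actual experiment) illustrates the effect: when some inputs are intrinsically hard for every version, the observed rate at which all versions fail together far exceeds the product of the individual failure rates.

```python
import random

# Toy model: failures of different versions are correlated because some inputs
# are intrinsically hard for all of them. All probabilities below are invented
# for the illustration.
random.seed(0)
N_INPUTS, N_VERSIONS = 100_000, 3
P_HARD = 0.01        # fraction of inputs that are hard for every version
P_FAIL_HARD = 0.5    # per-version failure probability on a hard input
P_FAIL_EASY = 0.001  # per-version failure probability on an easy input

all_fail = 0
per_version_fail = [0] * N_VERSIONS
for _ in range(N_INPUTS):
    p = P_FAIL_HARD if random.random() < P_HARD else P_FAIL_EASY
    fails = [random.random() < p for _ in range(N_VERSIONS)]
    per_version_fail = [c + f for c, f in zip(per_version_fail, fails)]
    all_fail += all(fails)

marginals = [c / N_INPUTS for c in per_version_fail]
independent_prediction = 1.0
for m in marginals:
    independent_prediction *= m

print("per-version failure rates:", [round(m, 4) for m in marginals])
print(f"independence prediction (product): {independent_prediction:.2e}")
print(f"observed rate of all versions failing: {all_fail / N_INPUTS:.2e}")
# The observed joint failure rate is orders of magnitude above the product,
# mirroring the falsification of the independence assumption.
```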
Experiments are also used in areas that theory and deductive analysis do not reach. Experiments probe the influence of assumptions, eliminate alternative explanations of phenomena, and unearth new phenomena in need of explanation. In this mode, experiments help with induction: deriving theories from observation.
Artificial neural networks (ANN) are a good example of the explorative mode of experimentation. After ANNs had been discarded on theoretical grounds, experiments demonstrated properties better than those theoretically predicted. Researchers are now developing better theories of ANNs in order to account for these observed properties [12].
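The classic illustration is the XOR function, which a single-layer perceptron provably cannot represent. The toy sketch below (with an arbitrary architecture, learning rate and seed) shows a small multi-layer network typically learning XOR by plain gradient descent:

```python
import numpy as np

# A tiny 2-4-1 sigmoid network trained by full-batch gradient descent on XOR.
# A single-layer perceptron cannot represent XOR; a network with one hidden
# layer typically learns it, which early experiments demonstrated in practice.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)
lr = 1.0   # arbitrary learning rate for this toy

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for _ in range(20_000):
    # forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # backward pass for a mean-squared-error loss
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out;  b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;    b1 -= lr * d_h.sum(axis=0)

print(np.round(out, 3))   # typically close to [[0], [1], [1], [0]]
```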
Experiments are made in many different fields of CS, such as search, automatic theorem proving, planning, NP-complete problems, natural language, vision, games, neural nets/connectionism, and machine learning. Furthermore, analyzing performance behavior in networked environments in the presence of resource contention from many users is a new and complex field of Experimental Computer Science. In this context it is important to mention the Internet.
Yet there are plenty of Computer Science theories that have not been tested. For instance, functional programming, object-oriented programming, and formal methods are all thought to improve programmer productivity, program quality, or both. Yet none of these obviously important claims has ever been tested systematically, even though they are all 30 years old and a lot of effort has gone into developing programming languages and formal techniques [12]. One important reason is the difficulty of devising quantitative methods to measure programmer productivity, program quality and the like. Here the human aspects are obviously an inseparable part of the problem.
Other fields of Computing, such as Human-Computer Interaction and parts of Software Engineering, also have to take humans (users, programmers) into consideration in their models of the investigated phenomena.
The consequence of widening the problem domain to include humans is the introduction of a "soft" empirical approach, more characteristic of the Humanities and Social Sciences, with methodological tools such as interviews and case studies.
2.3 Computer Simulation
In recent years computation, which comprises computer-based modeling and simulation (see Figure 2), has become the third research methodology within CS, complementing theory and experiment.
Computational Science has emerged at the intersection of Computer Science, Applied Mathematics, and the Science disciplines, in both theoretical investigation and experimentation.
Figure 2 Computational Science, at the intersection of Computer Science, Applied Mathematics (numerical analysis, modeling, simulation) and the Science disciplines (Physics, Chemistry, Biology, etc.), requiring teamwork and collaboration
Mastery of Computational Science tools, such as modeling with 3D visualization and computer simulation, efficient handling of large data sets, and the ability to access a variety of distributed resources and collaborate with other experts over the Internet, is now expected of university graduates, not only of Computer Science majors. These skills are becoming a part of scientific culture.
Today, computing environments and methods for using them have become powerful enough to tackle problems of great complexity. With the dramatic changes in computing, the need for a dynamic and flexible Computational Science becomes ever more obvious.
Computer simulation makes it possible to investigate regimes that are beyond current experimental capabilities and to study phenomena that cannot be replicated in laboratories, such as the evolution of the universe. In the realm of Science, computer simulations are guided by theory as well as by experimental results, while the computational results often suggest new experiments and theoretical models. In engineering, many more design options can be explored through computer models than by building physical ones, usually at a small fraction of the cost and elapsed time.
Even though the term "simulation" is old, it reflects the way in which a good deal of Science will be done in the next century. Scientists will perform computer experiments in addition to testing scientific hypotheses by performing experiments on the actual physical objects of investigation. One can say that simulation represents a fundamental discipline in its own right, regardless of the specific application.
Computational Science involves the use of computers ("supercomputers") for the visualization and simulation of complex and large-scale phenomena. Studies involving N-body simulations, molecular dynamics, weather prediction and finite element analysis are within the thrust of Computational Science. If Computer Science has its basis in computability theory, then Computational Science has its basis in computer simulation.
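A minimal sketch of what an N-body simulation involves, in dimensionless units (G = 1) and with an invented softening length and particle count, is a direct-summation gravitational integrator:

```python
import numpy as np

# Direct-summation gravitational N-body integration with a leapfrog
# (kick-drift-kick) scheme, in dimensionless units (G = 1); the softening
# length and particle count are arbitrary choices for the illustration.
G, DT, N, SOFTENING = 1.0, 1e-3, 64, 0.05

rng = np.random.default_rng(42)
pos = rng.standard_normal((N, 3))   # positions
vel = np.zeros((N, 3))              # velocities (initially at rest)
mass = np.full(N, 1.0 / N)          # equal masses summing to 1

def accelerations(pos):
    # Pairwise displacements diff[i, j] = pos[j] - pos[i], softened 1/r^3 weights.
    diff = pos[np.newaxis, :, :] - pos[:, np.newaxis, :]
    inv_dist3 = ((diff ** 2).sum(axis=-1) + SOFTENING ** 2) ** -1.5
    np.fill_diagonal(inv_dist3, 0.0)                 # no self-interaction
    return G * np.einsum("ijc,ij,j->ic", diff, inv_dist3, mass)

com_start = (mass[:, None] * pos).sum(axis=0)
acc = accelerations(pos)
for step in range(1_000):
    vel += 0.5 * DT * acc        # kick
    pos += DT * vel              # drift
    acc = accelerations(pos)
    vel += 0.5 * DT * acc        # kick

com_end = (mass[:, None] * pos).sum(axis=0)
print("centre-of-mass drift (should stay near zero):", np.abs(com_end - com_start).max())
```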
Some of the key focus areas for simulation are: Chaos and Complex Systems, Virtual Reality, Artificial Life, Physically Based Modeling and Computer Animation.
The computing power of present-day machines enables us to simulate an increasing number of phenomena and processes, especially non-linear ones. Modern graphics capabilities make this method very attractive and user-friendly.
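A micro-example of a simulated non-linear process is the logistic map in its chaotic regime, where two nearly identical initial conditions (here differing by one part in a billion, an arbitrary choice) soon diverge completely:

```python
# Logistic map x_{n+1} = r * x_n * (1 - x_n) in the chaotic regime (r = 4):
# two trajectories that start a billionth apart soon become completely different.
r = 4.0
x_a, x_b = 0.2, 0.2 + 1e-9

for n in range(1, 51):
    x_a = r * x_a * (1 - x_a)
    x_b = r * x_b * (1 - x_b)
    if n % 10 == 0:
        print(f"step {n:2d}: x_a = {x_a:.6f}   x_b = {x_b:.6f}   |diff| = {abs(x_a - x_b):.2e}")
```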
3. Bird’s Eye View of Science
The whole is more than the sum of its parts. (Aristotle, Metaphysica)
In order to be able to talk about Computer Science, let us take a closer look at the very definition of Science.
Saying "Science" we actually mean a plurality of different Sciences, and different Sciences differ very much from one another. The definition of Science is therefore neither simple nor unambiguous. See [14] and [15] for several possible classifications. For example, History and Linguistics are often, but not always, catalogued as Sciences.
3.1 Classical Sciences
Figure 3 What is Science? Concentric regions: (1) Logic & Mathematics; (2) Natural Sciences (Physics, Chemistry, Biology, ...); (3) Social Sciences (Economics, Sociology, Anthropology, ...); (4) The Humanities (Philosophy, History, Linguistics, ...); (5) Culture (Religion, Art, ...)
The figure above suggests that the traditional Sciences have specific areas of validity. Logic and Mathematics (the most abstract and at the same time the most exact Sciences) are a more or less important part of every other Science. They are essential for Physics, less important for Chemistry and Biology¹, and their significance continues to decrease towards the outer regions of our scheme.
Logical reasoning, as a basis of all human knowledge, is of course present in every kind of Science as well as in Philosophy.
The figure above may be seen in analogy with a microscope view. With the highest resolution we reach the innermost region. Inside the central region, Logic is not only the tool used to draw conclusions; it is at the same time the object of investigation. Even though large parts of Mathematics can be reduced to Logic (Frege, Russell and Whitehead), the complete reduction is impossible.
At every step of zooming out, the inner regions are given as prerequisites for the outer ones. Physics uses Mathematics and Logic as tools, without questioning their internal structure. In that way, information about the deeper structure of Mathematics and Logic is hidden when looking from the outside. In much the same way, Physics is a prerequisite for Chemistry, which in turn is a hidden level inside Biology, and so on.
The basic idea of Figure 3 is to show in a schematic way the relation between the three main groups of Sciences (Logic & Mathematics, the Natural Sciences and the Social Sciences) as well as their connections to the thought systems represented by the Humanities. Finally, the whole body of human knowledge, scientific and speculative, is immersed in and impregnated by the cultural environment.
3.2 The Scientific Method
The scientific method is the logical scheme used by scientists searching for answers to the questions posed within Science. The scientific method is used to produce scientific theories, including both scientific meta-theories (theories about theories) and the theories used to design the tools for producing theories (instruments, algorithms, etc.). A simple version looks something like this (see also Figure 4, and the sketch in code after the list):
1. Pose the question in the context of existing knowledge (theory & observations).
2. Formulate a hypothesis as a tentative answer.
3. Deduce consequences and make predictions.
4. Test the hypothesis in a specific experimental/theoretical field.
5. When consistency is obtained, the hypothesis becomes a theory. The results have to be published.
6. The theory is subject to a process of "natural selection" among competing theories. A winning theory becomes the new framework within which observations and theoretical facts are explained and predictions are made. The process can start again from the beginning, but state 1 has changed to include the new theory or the improvements to the old one.
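The loop of steps 2-5 can be caricatured in code. The toy sketch below, with synthetic "observations" and an arbitrary error tolerance, treats competing polynomial models as hypotheses, deduces predictions by fitting, tests them against the data, and adjusts the hypothesis until consistency is achieved:

```python
import numpy as np

# Toy hypothesis-test-revise loop: synthetic observations follow a quadratic
# law plus noise; the competing hypotheses are polynomial models of degree 1-3.
rng = np.random.default_rng(1)
x = np.linspace(0.0, 10.0, 50)
observations = 2.0 * x ** 2 + 3.0 * x + rng.normal(0.0, 5.0, x.size)

TOLERANCE = 10.0   # arbitrary accepted root-mean-square prediction error

for degree in (1, 2, 3):                               # step 2: formulate a hypothesis
    coefficients = np.polyfit(x, observations, degree)
    predictions = np.polyval(coefficients, x)          # step 3: deduce predictions
    rmse = float(np.sqrt(np.mean((predictions - observations) ** 2)))
    print(f"degree-{degree} hypothesis: RMSE = {rmse:.1f}")
    if rmse < TOLERANCE:                               # step 4: test against observations
        print("consistency achieved: accepted provisionally as a theory")   # step 5
        break
    print("hypothesis must be adjusted")               # back to step 2
```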
It is crucial to understand that the Logic of Science is recursive. Prior to every observation, experiment or theoretical test there is a hypothesis (2) that has its origins in the pre-existing body of knowledge (1). Every experimental or observational result has a certain world-view built in. Or, to put it with Feyerabend [16], all experimental data are "theory-contaminated".
The scheme of the scientific method in Figure 4 is without a doubt an abstraction and a simplification. Critics of the hypothetico-deductive method would argue that there is in fact no such thing as "the scientific method" [16]. By the term "the scientific method" they actually mean a concrete set of rules defining how to proceed in posing new relevant questions and formulating successful hypotheses. Of course, no such magic recipe exists!
The important advantage of the scientific method is that it is impartial:² one does not have to believe a given researcher; one can (in principle) repeat the experiment or theoretical derivation and determine whether certain results are valid or not (the hypothetico-deductive cycle of Figure 4). The question of impartiality is closely related to the openness and universality of Science, which are among its fundamental qualities. A theory is accepted based in the first place on the results obtained through logical reasoning, observations and/or experiments. The results obtained using the scientific method have to be reproducible.
All scientific truths are provisional. But for a hypothesis to attain the status of a theory it is necessary to win the confidence of the scientific community (the scientific-community cycle of Figure 4).
3.3 Sciences Belonging to Several Fields
The development of human thought, parallel to the development of human society, has led to the emergence of Sciences that do not belong to any of the classic types we have described earlier, but rather share common parts with several of these.
Figure 4 Diagram describing the iterative nature of the hypothetico-deductive method. The hypothetico-deductive cycle runs through (1) existing theories and observations, (2) hypothesis, (3) predictions, and (4) tests and new observations; when the hypothesis must be adjusted or redefined, the cycle repeats, and when consistency is achieved, (5) the existing theory is confirmed (within a new context) or a new theory is published, followed by (6) selection among competing theories (the scientific-community cycle).
Many of the modern Sciences are of an interdisciplinary, eclectic type. It is a trend for new Sciences to search for their methods, and even their questions, in very broad areas. This can be seen as a result of the fact that communication across the borders of different scientific fields is nowadays much easier and more intense than before.
Computer Science, for example, includes the field of Artificial Intelligence, which has its roots in Mathematical Logic and Mathematics but uses Physics, Chemistry and Biology, and even has parts where Medicine and Psychology are very important.
We seem to be witnessing an exciting paradigm shift:
We should, by the way, be prepared for some radical, and perhaps surprising, transformations of the disciplinary structure of Science (Technology included) as information processing pervades it. In particular, as we become more aware of the detailed information processes that go on in doing Science, the Sciences will find themselves increasingly taking a meta-position, in which doing Science (observing, experimenting, theorizing, testing, archiving) will involve understanding these information processes, and building systems that do the object-level Science. Then the boundaries between the enterprise of Science as a whole (the acquisition and organization of knowledge of the world) and AI (the understanding of how knowledge is acquired and organized) will become increasingly fuzzy.
Allen Newell, Artif. Intell. 25 (1985) 3.
Here we can see the potential for a new synthetic (holistic) worldview that is about to emerge.
4. Science, Research, Technology
The traditional sharp Aristotelian distinction between Science and Technology seems to fail when applied to contemporary Science, because the underlying concepts have changed over time. Today's Science is much more complex and heterogeneous than the Science of Aristotle's time (which emerged as a part of Philosophy).
Figure 5 Relations between Science, Research, Development and Technology
Figure 5 illustrates the fact that there is an essential overlap between contemporary Science, Research, Development and Technology.
That is one of the reasons why the Philosophy of Science is in vital need of a deeper, more realistic understanding of the contemporary Sciences.
5. Problem with the Traditional View: In What Way Is CS a Science? The AI Example
Let us take as an example Artificial Intelligence (AI), which is a branch of Computer Science according to the Computing Curricula [8].
AI is a discipline with two distinct facets, Science and Engineering, as is the case for CS in general. The scientific facet of AI attempts to understand intelligence in humans, other animals, information-processing machines and robots. The engineering facet attempts to apply such knowledge in designing new kinds of machines.
AI is generally associated with Computer Science, but it has many important links with other fields such as Mathematics, Psychology, Cognition, Biology, Linguistics, Philosophy, and the Behavioral and Brain Sciences, among many others. Our ability to combine knowledge from all these fields will ultimately benefit our progress in the quest to create an intelligent artificial being.
The scientific facet, which has motivated most of the pioneers and leaders in the field, is concerned with two main goals: (a) attempting to understand and model the information-processing capabilities of typical human minds, and (b) attempting to understand the general principles for explaining and modelling intelligent systems, whether human, animal or artificial. This work is often inspired by research in Philosophy, Linguistics, Psychology, Neuroscience or Social Science. It can also lead to new theories and predictions in those fields.
The engineering facet, which motivates most of the funding agencies and (consequently) younger researchers, is concerned with attempting to design new kinds of machines able to do things previously done only by humans and other animals, and also new tasks that lie beyond human intelligence.
There is another engineering application of AI: using the results of the scientific facet to help design machines and environments that can help human beings. This may include the production of intelligent machines.
Table 1 Sub-fields of AI and their related fields

Perception, especially vision, but also auditory and tactile perception, and more recently taste and smell.
Related fields: Philosophy, Cognition, Psychology, Mathematics, Biology, Medicine, Behavioral Sciences, Brain Sciences.

Natural language processing, including the production and interpretation of spoken and written language, whether hand-written, printed, or electronic (e.g. email).
Related fields: Linguistics, Psychology, Philosophy, Logic, Mathematics, Behavioral Sciences, Brain Sciences.

Learning and development, including symbolic learning processes, the use of neural nets, the use of evolutionary algorithms, self-debugging systems, and various kinds of self-organization.
Related fields: Logic, Philosophy, Mathematics, Biology, Medicine, Behavioral Sciences, Brain Sciences.

Planning, problem solving, automatic design: given a complex problem and a collection of resources, constraints and evaluation criteria, create a solution which meets the constraints and does well, or is optimal, according to the criteria, or, if that cannot be done, propose some good alternatives.
Related fields: Logic, Mathematics, Philosophy.

Robotics: studied for the purpose of producing new kinds of machines, and because designing complete working robots provides a test bed for integrating theories and techniques from various sub-areas of AI (e.g. perception, learning, memory, motor control, planning, etc.), i.e. it is a context for exploring ideas about complete systems.
Related fields: Philosophy, Cognition, Psychology, Mathematics, Biology, Medicine, Behavioral Sciences, Brain Sciences.
Table 1 suggests how complex the field of AI is, and how many connections it has to other scientific and further cultural phenomena. For a more comprehensive survey see [20].
6. Conclusions
Computer Science is a new field, and its object of investigation (its "Universe") is the computer, an ever-developing artifact, the materialization of ideas that try to structure knowledge and information about the world, including computers themselves. The very subject of investigation of CS already suggests that the traditional Science paradigm may not apply to CS.
However, in spite of all the characteristics that distinguish the young field of Computer Science from Sciences several thousand years old, such as Mathematics, Logic, and the Natural Sciences, we can conclude that Computer Science contains a critical mass of scientific features to qualify as a Science. CS has a traditional core of "hard" (exact) Sciences.
From the point of view of principle, it is important to point out that all modern Sciences are very strongly connected to Technology. This is very much the case for Biology, Chemistry and Physics, and even more so for Computer Science.
The engineering parts of Computer Science are connected both to the hardware (physical) aspects of the computer and to software.
The important difference is that the computer (the physical object that is directly related to the theory) is not the focus of investigation (not even in the sense of being the cause of a certain algorithm proceeding in a certain way); it is rather theory materialized, a tool always capable of changing in order to accommodate ever more powerful theoretical concepts.
Computer Science in general, and especially its field of Intelligent Systems, shows methodological and thematic features that are essentially different from those of Physics and other traditional Sciences. There are two alternatives at present: (i) deny Computer Science scientific status, or (ii) accept CS as a Science of a special, eclectic kind that incorporates both "hard" and "soft" scientific traditions and even inherits common questions, themes and methods from fields such as Linguistics, Psychology, Anthropology, Philosophy, or even the Arts.
Actually, taking into account the present development within different scientific fields, the above dilemma appears rhetorical. Science is simply not the same thing it was in the last century.
For Computer Science students to be able to perceive a holistic view of their field, it is essential that they be educated in a Theory of Science that takes into account the reality of contemporary Science. The time is ripe for a paradigm shift in the Philosophy of Science!
References
[1] Popper, K.R., The Logic of Scientific Discovery. NY: Routledge, 1999.
[2] Carnap, R., An Introduction to the Philosophy of Science. NY: Basic Books, 1994.
[3] Kuhn, T., The Structure of Scientific Revolutions. Chicago: Univ. of Chicago Press, 1962.
[4] Chalmers, A., What Is This Thing Called Science? Hackett Publishing Co., 1990.
[5] Mahoney, M.S., Computer Science: The Search for the Mathematical Theory, in Science in the Twentieth Century.
[6] Rheingold, H., Tools for Thought: The History and Future of Mind-Expanding Technology. Simon & Schuster, 1985. http://www.well.com/user/hlr/texts/tftindex.html
[7] Savage, J.E., Exploring the Power of Computing. Addison-Wesley, 1998.
[8] Computing Curricula 2001. http://www.computer.org/education/cc2001/index.htm
[9] Hartmanis, J., Observations About the Development of Theoretical Computer Science, in Foundations of Computer Science, 20th Annual Symposium Papers, 1979.
[10] Academic Careers for Experimental Computer Scientists and Engineers, National Research Council, Washington, D.C. http://books.nap.edu/html/acesc/
[11] Denning, P.J., ACM President's Letter: Performance Analysis: Experimental Computer Science at its Best, Communications of the ACM, Vol. 24, Issue 11, November 1981.
[12] Tichy, W.F., Should Computer Scientists Experiment More?, Computer, Vol. 31, Issue 5, May 1998.
[13] Knight, J.C. and Leveson, N.G., An Experimental Evaluation of the Assumption of Independence in Multiversion Programming, IEEE Trans. Software Eng., Jan. 1986, pp. 96-109.
[14] Universal Decimal Classification (UDC). http://www.chem.ualberta.ca/~plambeck/udc/
[15] Dewey Decimal Classification. http://www.tnrdlib.bc.ca/dewey.html
[16] Feyerabend, P., Against Method. London, U.K.: Verso, 2000.
[17] Denning, P.J. et al., Computing as a Discipline, Commun. ACM 32, 1 (January 1989), 9.
[18] Wegner, P., Research Paradigms in Computer Science, Proc. 2nd Int. Conference on Software Engineering, 1976, San Francisco, California.
[19] Aho, A.V. and Ullman, J.D., Foundations of Computer Science. W.H. Freeman, New York, 1992.
[20] Sloman, A., AI: An Illustrative Overview, School of Computer Science, The University of Birmingham. http://www.cs.bham.ac.uk/~axs/courses/ai.html
¹ This is obviously a gross simplification. In computational biology and bioinformatics, for example, mathematics is the very essence of the field!
² Impartial is used here as a synonym for objective, unbiased, unprejudiced, and dispassionate. Note, however, that this is a statement about science, not about individual scientists, whose attitude to their pursuit is, on the contrary, as a rule indeed passionate. The fact that science is shared by the whole scientific community results in theories that are to a great extent free from individual bias. On the other hand, the whole scientific community tends to share common paradigms, which are very broad concepts deeply rooted in the culture. A paradigm shift is a process that occurs in a very dramatic way, partly because of the cultural (not strictly rational) nature of paradigms (Kuhn).
Computer Science; The Search for the Mathematical Theory, Michael S. Mahoney, in Science in the Twentieth Century [6] http://www.well.com/user/hlr/texts/tftindex.html, Tools For Thought: The History and Future of Mind-Expanding Technology, Howard Rheingold, Simon & Schuster, 1985