Expert Systems

Published by Wiley

Online ISSN: 1468-0394

·

Print ISSN: 0266-4720

Articles


Device-independent color correction for multimedia applications using neural networks and abductive modeling approaches
  • Conference Paper

July 1996

·

38 Reads

·

L.C. Rabel

·

E. Onjeyekwe
Although color has appeal for developers and consumers alike, color reproduction poses a major problem in many computer-based applications, including multimedia and desktop publishing. The problem arises from the device dependence of color, i.e. the way each device processes color. Different traditional techniques have been applied to control the error introduced when porting color between devices. This paper introduces the use of artificial neural networks as well as abductive modelling approaches to color error reduction from an RGB (red/green/blue) color model perspective. Analysis of the results and ongoing research issues are discussed.
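
The paper's own models are not reproduced above, but the general idea of neural color correction can be sketched: learn a mapping between one device's RGB values and reference RGB values from calibration pairs. The following Python sketch uses scikit-learn and a simulated gamma-like device distortion; the network size and the data are illustrative assumptions, not the authors' setup.

    # Minimal sketch (not the paper's model): learn a device-to-reference
    # RGB mapping with a small neural network, using scikit-learn.
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)

    # Hypothetical calibration pairs: (device RGB, measured reference RGB).
    # A gamma-like distortion stands in for the real device response.
    device_rgb = rng.uniform(0.0, 1.0, size=(2000, 3))
    reference_rgb = device_rgb ** 1.8

    net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
    net.fit(device_rgb, reference_rgb)

    # Correct a new reading: map it from device space toward reference space.
    print(net.predict(np.array([[0.2, 0.5, 0.8]])))  # approx. [0.2, 0.5, 0.8] ** 1.8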

Acquisition of hierarchy-structured probabilistic decision tables and rules from data

February 2002

·

55 Reads

The paper is concerned with the creation of predictive models from data within the framework of the variable precision rough set model. The article focuses on two aspects of model derivation: the computation of rules, which are in general uncertain, from information contained in probabilistic decision tables, and the forming of hierarchies of decision tables with the objective of reducing or eliminating the decision boundary in the resulting classifiers. A new technique for creating a linearly structured hierarchy of decision tables is introduced and compared to a tree-structured hierarchy. It is argued that the linearly structured hierarchy has significant advantages over the tree-structured one.
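
As a hedged illustration of the underlying machinery (not the paper's algorithm), a probabilistic decision table can be computed by grouping objects into equivalence classes on their condition attributes, estimating the probability of each decision, and assigning each class to a positive, negative or boundary region given a precision threshold; boundary classes are the ones a lower-level table in a hierarchy would take up. The data and threshold below are invented:

    # Toy probabilistic decision table (invented data and threshold):
    # equivalence classes on the condition attributes, empirical decision
    # probabilities, and region assignment with precision threshold beta.
    from collections import defaultdict

    records = [  # ((condition attribute values), decision)
        (("high", "yes"), 1), (("high", "yes"), 1), (("high", "yes"), 0),
        (("low", "yes"), 0), (("low", "yes"), 0),
        (("low", "no"), 1), (("low", "no"), 0),
    ]
    beta = 0.75

    groups = defaultdict(list)
    for cond, dec in records:
        groups[cond].append(dec)

    for cond, decs in groups.items():
        p = sum(decs) / len(decs)  # P(decision = 1 | equivalence class)
        if p >= beta:
            region = "positive"
        elif p <= 1 - beta:
            region = "negative"
        else:
            region = "boundary"  # would be passed to the next table in a hierarchy
        print(cond, f"P(d=1)={p:.2f}", region)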

REDSIEX: A Cooperative Network of Expert Systems with Blackboard Architectures

February 1996

·

19 Reads

The blackboard architecture, originally developed for the HEARSAY-II speech understanding system, has since been used in a great variety of domains and in various environments for the construction of systems. From the classic HEARSAY-II architecture, many applications, generalizations, extensions and refinements have been developed. In this paper we present REDSIEX (RED de SIstemas EXpertos), a network of expert systems within a blackboard architecture for the cooperative solution of distributed problems. The REDSIEX system inherits several of the elements defined by the HEARSAY-II architecture and incorporates new components and organization. These produce a very characteristic and distinctive global working style in the solution of problems, within a conceptual framework of emergent control. The main structural and functional characteristics of REDSIEX are discussed.
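
A blackboard system can be sketched in a few lines: a shared data structure, knowledge sources that fire when their preconditions hold, and a scheduler that simply keeps firing applicable sources, so that control emerges from the state of the blackboard. The toy problem and source names below are invented and far simpler than REDSIEX's components:

    # Schematic blackboard loop (invented toy problem, not REDSIEX itself):
    # knowledge sources inspect a shared blackboard and post partial results.
    blackboard = {"raw": "3 4", "operands": None, "sum": None}

    def parser(bb):
        if bb["raw"] is not None and bb["operands"] is None:
            bb["operands"] = [int(tok) for tok in bb["raw"].split()]
            return True
        return False

    def adder(bb):
        if bb["operands"] is not None and bb["sum"] is None:
            bb["sum"] = sum(bb["operands"])
            return True
        return False

    knowledge_sources = [parser, adder]

    # Trivial scheduler: fire any applicable source until none can contribute,
    # so control emerges from the data on the blackboard rather than a script.
    progress = True
    while progress:
        progress = any(ks(blackboard) for ks in knowledge_sources)

    print(blackboard)  # {'raw': '3 4', 'operands': [3, 4], 'sum': 7}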

Neural network-based real-time robot tracking controller using position sensitive detectors

May 1995

·

31 Reads

A real-time visual servo tracking system for an industrial robot has been developed. A position sensitive detector (PSD), instead of a CCD, is used as a real-time vision sensor because of its fast response (the position is converted directly to an analog current). A neural network learns the complex association between the object position and the sensor reading and uses it to track the object. It also turns out that this scheme lends itself to a convenient way of teaching a workpath to the robot. Furthermore, for real-time use of the neural net, a novel architecture has been developed based on the concept of input space partitioning and local learning. It exhibits fast processing and learning as well as optimal usage of hidden neurons.
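
The input-space-partitioning idea can be illustrated with local linear fits standing in for the paper's local neural networks: split the sensor's input range into cells, fit a tiny model per cell, and dispatch each query to its cell's model in constant time. The 1-D input and the data below are illustrative simplifications:

    # Input space partitioning with local learning (1-D simplification;
    # local linear fits stand in for the paper's local neural networks).
    import numpy as np

    rng = np.random.default_rng(1)
    x = rng.uniform(0.0, 1.0, 500)                                # sensor reading
    y = np.sin(2 * np.pi * x) + 0.05 * rng.standard_normal(500)  # target position

    n_cells = 8
    edges = np.linspace(0.0, 1.0, n_cells + 1)
    models = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (x >= lo) & (x < hi)
        models.append(np.polyfit(x[mask], y[mask], 1))  # fast local fit

    def predict(xq):
        cell = min(int(xq * n_cells), n_cells - 1)      # O(1) model lookup
        return np.polyval(models[cell], xq)

    print(predict(0.3), np.sin(2 * np.pi * 0.3))        # local model vs. truth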

A Query Language for Multi-version Data Web Archives

April 2015

·

85 Reads

Marios Meimaris

·

Stratis Viglas

·

[...]

The Data Web refers to the vast and rapidly increasing quantity of scientific, corporate, government and crowd-sourced data published in the form of Linked Open Data, which encourages the uniform representation of heterogeneous data items on the web and the creation of links between them. The growing availability of open linked datasets has brought forth significant new challenges regarding their proper preservation and the management of evolving information within them. In this paper, we focus on the evolution and preservation challenges related to publishing and preserving evolving linked data across time. We discuss the main problems regarding their proper modelling and querying, and provide a conceptual model and a query language for modelling and retrieving evolving data along with the changes affecting them. We present the syntax of the query language in detail and demonstrate its functionality over a real-world use case of an evolving linked dataset from the biological domain.
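
The abstract does not reproduce the query language's syntax, so the sketch below only illustrates the underlying requirement, namely retrieving data together with the changes between dataset versions, via a toy versioned triple store in Python with an invented API:

    # Toy versioned triple store (invented API; the paper's actual query
    # language is not reproduced in the abstract above).
    store = {
        "v1": {("geneA", "locatedOn", "chr1"), ("geneA", "encodes", "protX")},
        "v2": {("geneA", "locatedOn", "chr2"), ("geneA", "encodes", "protX")},
    }

    def query(version, subject=None, predicate=None):
        return {t for t in store[version]
                if (subject is None or t[0] == subject)
                and (predicate is None or t[1] == predicate)}

    def diff(v_old, v_new):
        """Changes between two versions: (added, deleted) triples."""
        return store[v_new] - store[v_old], store[v_old] - store[v_new]

    print(query("v2", subject="geneA"))
    added, deleted = diff("v1", "v2")
    print("added:", added, "deleted:", deleted)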

Belief-Rule-Based Expert Systems for Evaluation of E- Government: A Case Study

March 2014

·

138 Reads

Little knowledge exists on the impact and results associated with e-government projects in many specific use domains. Therefore, it is necessary to evaluate the efficiency and effectiveness of e-government systems. Since the development of e-government is a continuous process of improvement, it requires continuous evaluation of the overall e-government system as well as of its various dimensions, such as determinants, characteristics and results. E-government development is often complex, with multiple stakeholders, large user bases and complex goals. Consequently, even experts have difficulties in evaluating these systems, especially in an integrated and comprehensive way and on an aggregate level. Expert systems are a candidate solution for evaluating such complex e-government systems. However, it is difficult for expert systems to cope with uncertain evaluation data that are vague, inconsistent, highly subjective or in other ways challenging to formalize. This paper presents an approach that can handle uncertainty in e-government evaluation: the combination of Belief Rule Base (BRB) knowledge representation and Evidential Reasoning (ER). This approach is illustrated with a concrete prototype, known as the Belief Rule Based Expert System (BRBES), put to use in the local e-government of Bangladesh. The results have been compared with a recently developed method of evaluating e-government, and it is shown that the results of BRBES are more accurate and reliable. BRBES can be used to identify the factors that need to be improved to achieve the overall aim of an e-government project. In addition, various "what if" scenarios can be generated, giving developers and managers a forecast of the outcomes. In this way, the system can be used to facilitate decision-making processes under uncertainty.
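
A heavily simplified sketch of belief-rule activation may help: each rule carries belief degrees over evaluation grades, rules are weighted by how closely the input matches their antecedents, and the weighted beliefs are aggregated. The actual BRBES combines rules with the evidential reasoning algorithm, which is more involved than the normalised weighted sum used here; all rule values below are invented:

    # Simplified belief-rule activation (all numbers invented; the actual
    # BRBES combines rules with the evidential reasoning algorithm rather
    # than the normalised weighted sum used here).
    rules = [  # antecedent reference value in [0, 1], beliefs over grades
        {"ref": 0.0, "beliefs": {"poor": 0.8, "average": 0.2, "good": 0.0}},
        {"ref": 0.5, "beliefs": {"poor": 0.1, "average": 0.7, "good": 0.2}},
        {"ref": 1.0, "beliefs": {"poor": 0.0, "average": 0.2, "good": 0.8}},
    ]

    def evaluate(x):
        # Activation weight: closeness of the input to each rule's reference.
        weights = [max(0.0, 1.0 - 2.0 * abs(x - r["ref"])) for r in rules]
        total = sum(weights) or 1.0
        combined = {grade: 0.0 for grade in rules[0]["beliefs"]}
        for w, r in zip(weights, rules):
            for grade, b in r["beliefs"].items():
                combined[grade] += (w / total) * b
        return combined

    print(evaluate(0.7))  # belief distribution over the grades for input 0.7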

Object/rule integration in CLIPS

February 1993

·

15 Reads

This paper gives a brief overview of the C Language Integrated Production System (CLIPS) with a focus on the object-oriented features. The advantages of an object data representation over the traditional working memory element (WME), i.e., facts, are discussed, and the implementation of the Rete inference algorithm in CLIPS is presented in detail. A few methods for achieving pattern-matching on objects with the current inference engine are given, and finally, the paper examines the modifications necessary to the Rete algorithm to allow direct object pattern-matching.
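
Not CLIPS syntax, but a Python sketch of what object pattern-matching asks of a Rete-style engine: a pattern is a set of slot constraints, and an alpha node tests each object against them as it enters working memory. The class and slot names are invented:

    # Python sketch of a Rete-style alpha-node test on objects (not CLIPS
    # syntax; class and slot names are invented for illustration).
    class Valve:
        def __init__(self, state, pressure):
            self.state = state
            self.pressure = pressure

    # A pattern is a set of slot constraints; an alpha node applies them to
    # every object of the matching class as it enters working memory.
    pattern = {"state": lambda v: v == "open",
               "pressure": lambda v: v > 100}

    def alpha_match(obj, slot_tests):
        return all(test(getattr(obj, slot)) for slot, test in slot_tests.items())

    working_memory = [Valve("open", 120), Valve("closed", 150), Valve("open", 80)]
    print(sum(alpha_match(v, pattern) for v in working_memory))  # 1 match

A full Rete network would also share such tests across rules and propagate slot modifications incrementally, which is where the modifications examined in the paper come in.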

[Figures from the article below. Figure 1 illustrates the EigenSpot algorithm on an example whose goal is to identify a shaded area in the cases matrix; the values in the baseline and cases matrices are counts for spatiotemporal windows, and the process has four steps: (1) matrix decomposition; (2) subtraction of paired singular-vector elements; (3) application of a z-score control chart to the difference vector; (4) combination of the spatial and temporal hotspot components. Figures 2 and 3 and Table 3 report the mean accuracy of STScan and EigenSpot over simulated data sets, averaged over values of α from 0.20 to 0.01. Figure 4 shows the hotspot detected via STScan (left) and EigenSpot (right).]
Eigenspace Method for Spatiotemporal Hotspot Detection
  • Article
  • Full-text available

June 2015

·

424 Reads

Hotspot detection aims at identifying subgroups in the observations that are unexpected with respect to some baseline information. For instance, in disease surveillance, the purpose is to detect sub-regions in spatiotemporal space where the count of reported diseases (e.g. cancer) is higher than expected with respect to the population. The state-of-the-art method for this kind of problem is the space-time scan statistic (STScan), which exhaustively searches the whole space through a sliding window looking for significant spatiotemporal clusters. STScan makes some restrictive assumptions about the distribution of the data, the shape of the hotspots and the quality of the data, which can be unrealistic for some nontraditional data sources. A novel methodology called EigenSpot is proposed which, instead of exhaustively searching the space, tracks changes in a space-time correlation structure. Not only is the new approach much more computationally efficient, but it also makes no assumptions about the data distribution, hotspot shape or data quality. The principal idea is that, by jointly combining abnormal elements in the principal spatial and temporal singular vectors, the location of hotspots in the spatiotemporal space can be approximated. A comprehensive experimental evaluation, on both simulated and real data sets, reveals the effectiveness of the proposed method.
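
The four steps in the Figure 1 caption can be followed with numpy on synthetic data, as a hedged sketch; the planted hotspot, the use of only the leading singular vectors and the control-chart threshold are illustrative choices, not necessarily the authors' exact settings:

    # Sketch of EigenSpot's four steps (Figure 1) with numpy on synthetic
    # data; the planted hotspot and the 2-sigma threshold are illustrative.
    import numpy as np

    rng = np.random.default_rng(2)
    baseline = rng.poisson(20.0, size=(30, 52)).astype(float)  # regions x weeks
    cases = rng.poisson(20.0, size=(30, 52)).astype(float)
    cases[10:14, 40:45] += 100        # planted hotspot: rows 10-13, weeks 40-44

    # Step 1: decompose both matrices; keep the leading singular vectors.
    Ub, _, Vbt = np.linalg.svd(baseline, full_matrices=False)
    Uc, _, Vct = np.linalg.svd(cases, full_matrices=False)

    def align(ref, v):
        """Flip v's sign to match ref (singular vectors are sign-ambiguous)."""
        return v if ref @ v >= 0 else -v

    # Step 2: subtract the paired singular-vector elements.
    u_diff = np.abs(Uc[:, 0] - align(Uc[:, 0], Ub[:, 0]))  # spatial component
    v_diff = np.abs(Vct[0] - align(Vct[0], Vbt[0]))        # temporal component

    # Step 3: z-score control chart on each difference vector.
    def flags(d, k=2.0):
        return np.abs(d - d.mean()) > k * d.std()

    # Step 4: combine the flagged spatial and temporal indices; this should
    # approximately recover the planted rows and weeks.
    print("regions:", np.where(flags(u_diff))[0])
    print("weeks:  ", np.where(flags(v_diff))[0])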

A Bibliography of Genetic Algorithm Business Application Research: 1988–June 1996

October 2008

·

13 Reads

The purpose of this research is to present a comprehensive bibliography of genetic algorithm application research in business. Ninety-seven genetic algorithm papers (98 applications) are identified through exhaustive literature searches. A classification of these articles by application area reveals that genetic algorithms are being used for a diverse range of corporate functional activities, particularly in the areas of production/operations and information systems. Information on the genetic algorithm development tool/language and the computer operating environment as reported in each article is included. Those journals which have published the most genetic algorithm business applications are also presented.

Bibliography of neural network business applications research: 1988-September 1994

April 2007

·

33 Reads

The purpose of this research is to present a comprehensive bibliography of neural network application research in business. One hundred and twenty-seven neural network application papers and reports are identified through exhaustive literature searches. A classification of these articles by application area reveals that neural networks are being used for a diverse range of corporate functional activities, particularly in the areas of production/operations and finance. Information on the neural network development language/tool, the learning paradigm and the computer operating environment as reported in each article is included. Those journals which have published the most neural network business applications are also presented.

1st Class

April 2007

·

49 Reads

1st-Class is an inductive expert system development tool written by William Hapgood and supplied by 1st-Class Expert Systems Inc. The product runs on IBM PC, XT and AT computers and close compatibles running MS-DOS 2.0 or higher. It is sold in two versions, 1st-Class (at $495 U.S.) and 1st-Class Fusion (at $1295). This is a review of the plain version, but mention of ‘Fusion’ facilities is made where appropriate. 1st-Class Expert Systems Inc. can be contacted at 286 Boston Post Road, Wayland, Massachusetts 01778, USA ((617) 358-7722).

A fuzzy approach to active usage parameter control in IEEE 802.11b wireless networks

November 2004

·

14 Reads

Usage parameter control (UPC) provides support for quality of service across heterogeneous networks. For the network operator, UPC assists in limiting network usage through traffic shaping, to prevent unacceptable delay. Traditional methods of applying UPC involve the generic cell rate algorithm or ‘leaky bucket’ algorithm, now commonly implemented in asynchronous transfer mode networks. This paper proposes a novel form of UPC for 802.11b wireless networks. The proposed method measures the rate of individual network flows to actively manage link utilization using a fuzzy logic controller (FLC). The FLC monitors the flow rate and adjusts the sending transmissions to stabilize flows as close to the optimum desired rate as possible. Imposing UPC and using the FLC within a packet-switched TCP network enforces cooperation between competing streams of traffic. Experiments within a wireless network show that the results obtained significantly improve upon a ‘best effort’ service.
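
A toy fuzzy rate controller in plain Python conveys the flavour: fuzzify the error between the measured and target rates, apply a small rule table, and defuzzify into a rate adjustment. The membership functions and rules below are invented, not those of the paper's FLC:

    # Toy fuzzy rate controller (invented membership functions and rules,
    # not the paper's FLC): nudge the sending rate toward a target rate.
    def memberships(error):
        """Fuzzify the rate error (measured - target, in Mbit/s)."""
        neg = max(0.0, min(1.0, -error / 2.0))   # flow is under the target
        zero = max(0.0, 1.0 - abs(error) / 2.0)  # flow is near the target
        pos = max(0.0, min(1.0, error / 2.0))    # flow is over the target
        return neg, zero, pos

    def adjust(rate, target):
        neg, zero, pos = memberships(rate - target)
        # Rule table: under the target -> speed up; over it -> slow down.
        actions = {0.5: neg, 0.0: zero, -0.5: pos}
        total = sum(actions.values()) or 1.0
        # Defuzzify: weighted average of the singleton actions.
        return rate + sum(a * w for a, w in actions.items()) / total

    rate, target = 6.0, 4.0  # Mbit/s
    for _ in range(10):
        rate = adjust(rate, target)
    print(round(rate, 2))    # settles toward the 4.0 Mbit/s target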

An Expert System Approach to ISO 9000 Requirements for Foundry Operations

December 2002

·

33 Reads

ISO 9000 certification is becoming an important competitive factor for foundries. Considerable organizational effort and expertise are needed to achieve this certification, which may be too time-consuming and expensive for many small to medium-sized companies. This paper presents the development of an object-oriented system to support the ISO certification process, with particular emphasis on clause 4.20 (use of statistical techniques). Recommendations for the design and implementation of each statistical tool are provided by the system, as well as training examples of suitable process monitoring and improvement procedures. The system runs on a microcomputer platform and was constructed using Visual Basic.

A function‐centered framework for reasoning about system failure at multiple levels of abstraction

December 2002

·

10 Reads

This paper presents the knowledge organization for a simulation subsystem that is a component of a comprehensive expert system for failure modes and effects analysis. Organizing the simulation subsystem’s knowledge base around a function-centered ontology produces an architecture that facilitates reasoning about an engineering design at multiple levels of abstraction and throughout the life-cycle of the design. Moreover, the resulting architecture provides the capability to incorporate computer-aided analysis and design tools early in the conceptual design of an engineering system, before a commitment is made to a specific technology to implement the system’s behavior. The result is an expert system simulation knowledge source that can be used to reason about the effects of system failures based on conceptual designs, i.e. designs in which commitments to an underlying technology to achieve the system’s function have not yet been made, but for which computer-aided assistance in reasoning about the system’s potential failure modes and effects is useful.

Expert system for nuclear power plant accident diagnosis using a fuzzy inference method

December 2002

·

26 Reads

Huge and complex systems such as nuclear power generating stations are likely to cause operators to make operational mistakes for a variety of reasons, and to produce ambiguous and complicated symptoms in the case of an emergency. Therefore, a safety protection system that assists the operators in making proper decisions within a limited time is required. In this paper, we develop a reliable and improved diagnosis system using the fuzzy inference method, so that the system can classify accident symptoms and identify the most probable causes of accidents in order for appropriate actions to be taken to mitigate the consequences. In computer simulation, the proposed system proved able to classify accident types within only 20–30 s. Therefore, the corresponding operation guidelines can be determined in a very short time, putting the nuclear power plant in a safe state immediately after the accident.
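
A minimal max-membership classifier over fuzzified symptoms illustrates the flavour of fuzzy accident classification; the symptom names, membership functions and accident signatures below are all invented, not the paper's:

    # Minimal fuzzy accident classification (symptom names, membership
    # functions and accident signatures are all invented for illustration).
    def low(x):
        return max(0.0, min(1.0, (0.5 - x) / 0.5))

    def high(x):
        return max(0.0, min(1.0, (x - 0.5) / 0.5))

    # Fuzzy signatures: which symptom pattern suggests which accident type.
    signatures = {
        "loss_of_coolant": lambda s: min(low(s["pressure"]), high(s["sump_level"])),
        "steam_line_break": lambda s: min(low(s["pressure"]), low(s["sump_level"])),
    }

    symptoms = {"pressure": 0.2, "sump_level": 0.9}  # normalised plant readings
    scores = {acc: rule(symptoms) for acc, rule in signatures.items()}
    print(max(scores, key=scores.get), scores)       # most probable accident type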

Barriers to adopting management expert systems: Case studies of management accounting applications which failed

April 2007

·

35 Reads

The purpose of this paper is to draw attention to some important barriers to the practical application of expert systems to management problems. By drawing attention to these barriers, we show two findings: (a) that certain types of knowledge found in management situations are more likely to lead to the successful adoption of management expert systems (MESs) than others; (b) that a potential exists for an alternative form of MES application. The first finding has implications for knowledge elicitation. We propose that MESs developed from academic knowledge, which we define in specific terms, are more likely to succeed than those developed from working with managers in the field. Alternatively, feasibility studies should include an analysis of the type of knowledge which a manager is capable of providing to ensure that suitable knowledge is available for knowledge elicitation purposes. The second finding has implications for the developers of MESs at both research and commercial levels. We suggest the need for a check-list MES application, capable of converting responses to strategic and operational questions into managerial actions.

Using the CARMEN'S framework to model and acquire knowledge

December 2002

·

123 Reads

The manner in which CARMEN (Constraints And Rules Management ENgine), an operational expert system generator developed as part of the ESPRIT project ITHACA, models and acquires knowledge is discussed. Central to CARMEN are three types of control entities (TASKs, MKSs and BKSs) which are used to describe problem-solving at the knowledge-use level. Modeling of domain knowledge is done both at the deep and surface levels. The Integrity Checking Task (or ICT) is used to illustrate the manner in which CARMEN works. We conclude the paper with a comparison of CARMEN with other approaches in knowledge modeling.

Acquiring and representing strategic knowledge in the diagnosis domain

April 2007

·

12 Reads

Techniques for acquiring and representing strategic knowledge for guiding diagnostic processes are presented. In a diagnostic expert system, strategic knowledge can be represented either in a specific knowledge base or ‘embedded’ in the inference engine. We opted for the former, so that knowledge can be acquired or modified without affecting the problem-solving paradigm. Strategic knowledge is acquired by expert interview in a straightforward way: on the basis of simple information provided by the expert, a sophisticated internal representation is automatically generated. The techniques are not restricted to a particular problem-solving paradigm or application. However, in order to prove the effectiveness of our approach, a problem-solving paradigm is also presented. The paradigms adopted in diagnosis must face two problems: the selection of the ‘right’ hypothesis (fault) to pursue and the selection of the ‘right’ observation (measurement) to be executed. We present some criteria for selecting hypotheses and observations. Our proposal is suitable for domains where the measurements to localise the fault do not always provide certainty but only a ‘degree of belief’ about the presence of the fault. As a consequence, the problem of selecting the right measurement is solved by appropriate criteria and heuristic reasoning. Moreover, we do not consider ‘right’ a predefined concept: it is based on the information provided by the expert, who can thus define it on the basis of his own judgment.
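
The paper leaves the definition of ‘right’ to the expert, but one plausible (invented) criterion can be sketched: among the candidate observations, pick the one with the best ratio of discriminating power to cost:

    # Invented illustration of observation selection: among the candidate
    # measurements, pick the one with the best discrimination-per-cost ratio.
    measurements = {
        "check_voltage":  {"cost": 1.0, "discrimination": 0.4},
        "thermal_scan":   {"cost": 5.0, "discrimination": 0.9},
        "visual_inspect": {"cost": 0.5, "discrimination": 0.1},
    }

    def score(m):
        # discrimination: how strongly the outcome separates the competing
        # fault hypotheses (0 = useless, 1 = fully discriminating).
        return m["discrimination"] / m["cost"]

    best = max(measurements, key=lambda name: score(measurements[name]))
    print(best)  # check_voltage: 0.4 per unit cost beats 0.18 and 0.2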

Extracting Expertise From Experts: Methods for Knowledge Acquisition

August 1987

·

48 Reads

Knowledge acquisition is the biggest bottleneck in the development of expert systems. Fortunately, the process of translating expert knowledge to a form suitable for expert system development can benefit from methods developed by cognitive science to reveal human knowledge structures. There are two classes of these investigative methods, direct and indirect. We provide reviews, criteria for use, and literature sources for all principal methods. Direct methods discussed are: interviews, questionnaires, observation of task performance, protocol analysis, interruption analysis, closed curves, and inferential flow analysis. Indirect methods include: multidimensional scaling, hierarchical clustering, general weighted networks, ordered trees, and repertory grid analysis.

Process factors in knowledge acquisition

April 2007

·

16 Reads

Knowledge acquisition (KA) is often characterised as a crucial bottleneck in the development of expert system applications. General definitions of KA view it as a collection of subprocesses such as elicitation, analysis and representation, but in actual practice, each KA occurrence may involve one or all of these subprocesses in varying sequences and combinations. No model or framework currently exists for describing the many possible variations in KA processes as they actually occur. This article presents a new way of characterising knowledge acquisition processes that is not tied to one particular technique or approach to KA. Three nested levels are proposed to characterise the many possible variations and combinations of KA dynamics: the process, episode and transaction levels of analysis. Each of these is further delineated in a top-down manner. Not only does this scheme provide a fairly comprehensive means for viewing KA dynamics as a whole, but it suggests several factors that have heretofore drawn little research attention. The suggested constructs capture more accurately what actually occurs in the practice of KA. In so doing they also provide the foundation for structuring future research into how KA processes may unfold.

The Knowledge Acquisition Bottleneck: Time for Reassessment?

August 1988

·

182 Reads

Knowledge acquisition has long been considered to be the major constraint in the development of expert systems. Conventional wisdom also maintains that the major problem encountered in knowledge acquisition is in identifying the varying structures and characteristics of domain knowledge and matching these to suitable acquisition techniques. With the aid of the first substantial systematic analysis of a sample of expert systems applications developed in the real world, the authors describe what is actually going on in terms of knowledge acquisition. In the light of the evidence, it is argued that a reappraisal of the conventional approach to knowledge acquisition is necessary.

D-KAT: A deep knowledge acquisition tool

April 2007

·

34 Reads

This paper describes a system of shallow and deep knowledge acquisition and representation for diagnostic expert systems. The acquisition system is integrated into a diagnostic expert system shell. Shallow knowledge is represented in a failure model as a set of cause-effect relations among the possible faults, while deep knowledge is represented in three deep models: a functional, a deep causal and a taxonomic model. The acquisition and the representation of all the models are fully integrated. The deep knowledge is used by the final expert system in order to provide the user with deep explanations of the cause-effect relations of the failure model.
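
The shallow failure model, a set of cause-effect relations among possible faults, can be pictured as a small directed graph traversed backwards from an observed effect to candidate root causes. The fault names below are invented, and the deep models that D-KAT uses for explanation are omitted:

    # Sketch of a shallow failure model: cause-effect relations among faults,
    # traversed backwards from an observed effect to candidate root causes.
    # Fault names are invented; the deep models D-KAT uses for explanation
    # are omitted here.
    causes_of = {
        "no_output": ["blown_fuse", "motor_stall"],
        "motor_stall": ["bearing_seizure", "overload"],
        "blown_fuse": ["overload"],
    }

    def root_causes(effect, seen=None):
        seen = set() if seen is None else seen
        if effect in seen:
            return set()
        seen.add(effect)
        direct = causes_of.get(effect, [])
        if not direct:
            return {effect}  # no known cause: a candidate root fault
        roots = set()
        for cause in direct:
            roots |= root_causes(cause, seen)
        return roots

    print(root_causes("no_output"))  # {'overload', 'bearing_seizure'}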

Support for knowledge acquisition in the Knowledge Engineer's Assistant (KEATS)

February 1988

·

24 Reads

The ‘Knowledge Engineer's Assistant’ (KEATS) is a software environment suitable for constructing knowledge-based systems. In this paper, we discuss its role in supporting the knowledge engineer in the tasks of knowledge elicitation and domain understanding. KEATS is based upon our own investigations of the behaviour and needs of knowledge engineers and provides two enhancements over other modern ‘shells’, ‘toolkits’ and ‘environments’ for knowledge engineering: (i) transcript analysis facilities, and (ii) a sketchpad on which the knowledge engineer may draw a freehand representation of the domain, from which code is automatically generated. KEATS uses a hybrid representation formalism that includes a frame-based language and a rule interpreter. We describe the novel components of KEATS in detail, and present an example of how KEATS was used to build an electronic fault diagnosis system.

Expert knowledge acquisition and the unwilling expert: A knowledge engineering perspective

December 2002

·

42 Reads

Expert systems are an evolving technology with the potential to make human expertise widely and cheaply available. The literature describing the development of expert systems generally assumes that experts willingly give up their knowledge. This is unrealistic and may be a reason why most expert system projects fail. This paper explores the problem of unwilling experts from the perspective of a knowledge engineer building an expert system. The link between knowledge and organizational power is established and human motivation theories are discussed. Finally, a new motivational approach is introduced to help the knowledge engineer deal with unwilling experts.

Learning without case records: a mapping of the repertory grid technique onto knowledge acquisition from examples

April 2007

·

12 Reads

In building a knowledge-based system, it is sometimes possible to save time by applying some machine learning process to a set of historical cases. In some problem domains, however, such cases may not be available. In addition, the classes, attributes and attribute values that comprise the partial domain model in terms of which cases are expressed may also not be available explicitly. In these circumstances, the repertory grid technique offers a single process for both building a partial domain model and generating a training set of examples. Alternatively, examples can be elicited directly. This paper explores the relationship between knowledge acquisition from examples and the repertory grid technique, and discusses the shared need for machine learning. Fragments of business-strategy knowledge are used to illustrate the discussion.
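
The mapping the paper discusses can be pictured as data: a repertory grid rates elements (here, hypothetical business strategies) against elicited constructs, and each rated element then doubles directly as a training example. The names and ratings below are invented:

    # A repertory grid as data (invented names and ratings): elements are
    # rated 1-5 on each elicited construct, and every rated element then
    # doubles as a training example for a machine learning step.
    constructs = ["low risk ... high risk", "cheap ... expensive"]
    elements = {  # hypothetical business strategies
        "enter_new_market": [5, 4],
        "cut_prices": [3, 1],
        "acquire_rival": [4, 5],
    }
    labels = {"enter_new_market": "growth", "cut_prices": "defend",
              "acquire_rival": "growth"}

    # Convert the grid into (attribute vector, class) examples.
    training_set = [(ratings, labels[name]) for name, ratings in elements.items()]
    for features, cls in training_set:
        print(features, "->", cls)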
