Article

Converting an Informal Ontology into Ontolingua: Some Experiences


Abstract

We report our experiences of converting a carefully defined informal ontology, expressed in natural language, into the formal language Ontolingua. The objectives of this paper are 1) to explore some of the nitty-gritty details of formalising ontology definitions and 2) to serve as a basis for clarifying the relationship between this and other approaches to ontology construction (e.g. using competency questions), with the eventual aim of producing a comprehensive methodology.
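As a hypothetical illustration of the kind of step the paper discusses (the term name `Activity`, the slot layout, and the `formalise` helper below are illustrative assumptions, not taken from the paper), converting an informal, natural-language term definition into a formal, slot-based structure might be sketched as:

```python
# Illustrative sketch only: a natural-language term definition is wrapped in
# an explicit, slot-based structure (modelled here as a Python dict; real
# Ontolingua definitions are written in KIF, not Python).
informal = {
    "Activity": "Something done over a particular interval of time.",
}

def formalise(term, nl_definition):
    """Wrap an informal definition in an explicit formal structure."""
    return {
        "name": term,
        "documentation": nl_definition,  # the natural-language text is retained
        "type": "Class",
        "axioms": [],                    # formal axioms are added during conversion
    }

formal = {term: formalise(term, text) for term, text in informal.items()}
```

In a real conversion most of the effort goes into filling the empty `axioms` slot with formal constraints, which is exactly where the nitty-gritty details arise.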


... According to Pinto and Martins [13], the three most representative methodologies for building ontologies are: TOVE [14], ENTERPRISE [15] and METHONTOLOGY [16]. The first two were proposed at the same time and both have been used for the construction of ontologies to model companies' processes. ...
Conference Paper
Full-text available
People make decisions every day. If the wrong decisions are made in governments and companies, this may cause huge financial losses or other negative consequences. There are several methods to help decision-makers with these hard decisions. Delphi is one of those methodologies. It relies on the wisdom of crowds to extract the best solution from a multidisciplinary group of experts. Long-term decisions may take more than a decade to be proven right or wrong, and as the time passes, the reasons for the decision may be forgotten. There is a lack of methods which enable storage of the reasons for a decision made using the Delphi methodology. We propose an ontology which stores the whole process, thereby allowing the decision-maker to validate the reasons for the decision. If the experts’ opinions are still valid, it is likely that the decision is still valid. We also prove that our ontology is minimal and sufficient to satisfy our objective.
... Uschold and King [9] and Gomez and Perez [10] gave skeleton methodologies for building ontologies. These techniques were then extended and refined in multiple ways [11][12][13]. The Suggested Upper Merged Ontology [14] is the largest formal ontology in existence today. ...
Conference Paper
In a text, two concepts can hold either direct or higher order relationship where function of some concepts is considered as another concept. Essentially, we require a mechanism to capture complex associations between concepts. Keeping this in view, we propose a knowledge representation scheme which is flexible enough to capture any order of associations between concepts in factual as well as non-factual sentences. We utilize a five-tuple representation scheme to capture associations between concepts and based on our evaluation strategy we found that by this we are able to represent 90.7 % of the concept associations correctly. This is superior to existing pattern based methods. A use case in the domain of content retrieval has also been evaluated which has shown to retrieve more accurate content using our knowledge representation scheme thereby proving the effectiveness of our approach.
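A minimal sketch of how such a five-tuple scheme could capture both direct and higher-order associations follows; the field names are illustrative assumptions, since the authors' actual tuple structure is not given here:

```python
from dataclasses import dataclass
from typing import Optional, Union

# Hypothetical five-tuple association record. The fields are assumptions made
# for illustration; only "five-tuple" and "higher-order association" come
# from the cited abstract.
@dataclass(frozen=True)
class Association:
    subject: Union[str, "Association"]   # a concept, or a nested association (higher order)
    relation: str
    obj: Union[str, "Association"]
    polarity: bool                       # factual vs negated statement
    context: Optional[str]               # e.g. an identifier of the source sentence

# Direct (first-order) association between two concepts.
a1 = Association("aspirin", "treats", "headache", True, "s1")

# Higher order: the association a1 itself plays the role of a concept,
# which is what lets the scheme represent "function of some concepts
# considered as another concept".
a2 = Association(a1, "reported-by", "clinical-trial", True, "s2")
```

Allowing `subject` and `obj` to be associations themselves is the design choice that makes the scheme capture any order of association with a single record type.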
... This ontology will be built using all the processes of ontology building: there are some methodologies for building ontologies from scratch, the most representative being [18,16], [9] and [5,6]. There are very few methodologies for building ontologies by merging [8,7] or by integration [14,13], although some reuse experiences are described in the literature, such as [10,15,1]. ...
... An important issue in heterogeneous information integration consists in establishing correlations between the various elements of the schemas of heterogeneous information sources. Such correlations can be established by relying on semantic references from those elements to ontological concepts and on reasoning in a formal ontology [12,19]. Regrettably, ontological modelling and reasoning are usually considered separately from the mediation context. ...
Article
Full-text available
The presented research is intended for the ontological identification of relevant specifications for the semantic context integration of heterogeneous semistructured sources. A metainformation model is defined which includes uniform features for ontology, thesaurus and classifier modeling. A special technique for the integration and mapping of different ontologies in this model is defined. The method for identifying correlations between specification elements in different contexts is considered.
... The syntax and semantics of Ontolingua definitions are based on a notation and semantics for an extended version of first-order predicate calculus called the Knowledge Interchange Format (KIF) [7]. Their experiences during the conversion were recorded in [18]. A complete list of terms and their definitions expressed in natural language can be found in [19]. ...
Conference Paper
Full-text available
An information system distinguishes itself from other software in that it is developed to facilitate the operation of an organization; hence it reflects the organization's strategies, plans, processes, marketing, etc. We believe that the requirements, in the form of domain knowledge acquired in the early stage of system development, can be organized and modeled in an Enterprise Architecture. However, research in Enterprise Architecture focuses on issues other than requirements engineering. Hence we propose several new ideas on the extension of the Zachman Framework, one of the most widely used Enterprise Architectures. We first partition the Zachman Framework from two different perspectives: domain engineering phases and requirements engineering techniques. Then we propose a meta-model for the Zachman Framework which is adapted and integrated from the Bunge-Wand-Weber ontology and the Enterprise Ontology.
... Each concept is represented in Ontolingua with some twenty slots, many of which are not obvious to people who do not understand frames. For example, anybody who wants to understand the definition of Corporation in the Enterprise Ontology as it is represented in Ontolingua (see [18] and [17]) has to work out the meaning of slots like Set-Cardinality and Relation-Universe. ...
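The readability problem the excerpt describes can be sketched as follows. Only the slot names Set-Cardinality and Relation-Universe come from the excerpt; the slot values, the remaining slots, and the `domain_view` helper are hypothetical:

```python
# Hypothetical sketch (not the actual Enterprise Ontology source) of how a
# frame-based concept definition carries system-level bookkeeping slots
# alongside the domain definition a reader actually cares about.
corporation_frame = {
    "name": "Corporation",
    "documentation": "A Legal-Entity recognised in law as a corporate body.",
    # Frame-system slots that a reader unfamiliar with frames must also interpret:
    "Set-Cardinality": None,        # how many instances the class may have
    "Relation-Universe": None,      # the universe of discourse for the relation
    "Arity": 1,
    "Instance-Of": ["Class"],
}

def domain_view(frame):
    """Hide frame-system bookkeeping, keeping only the domain-relevant slots."""
    keep = {"name", "documentation"}
    return {k: v for k, v in frame.items() if k in keep}
```

Filtering out the bookkeeping slots, as `domain_view` does, is one way a presentation layer could make such definitions approachable for non-frame users.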
Article
Full-text available
Upper-level ontologies are theories that capture the most common concepts relevant to many of the tasks involving knowledge extraction, representation, and reasoning. These ontologies represent the skeleton of human common sense in a formal way that covers as many aspects (or "dimensions") of knowledge as possible. The result is often a relatively complex and abstract philosophical theory. Currently the evaluation of the feasibility of such ontologies is expensive, mostly because of technical problems such as the different representations and terminologies used. Additionally, there are no formal mappings between the upper-level ontologies that could ease their understanding, study, and comparison. As a result, the upper-level models are not widely used. We present OntoMap, a project with the pragmatic goal of facilitating easy access to, understanding of, and reuse of such resources, in order to make them useful for experts outside the ontology/knowledge engineering community. A semantic framework on the so-called conceptual level, small and easy enough to be learned on the fly, is designed and used for representation. Technically, OntoMap is a web site (http://www.ontomap.org) that provides access to upper-level ontologies and hand-crafted mappings between them. Currently it supports only online browsing and DAML+OIL export. The next step will be to provide the resources in various formats, including an application server giving uniform access to the resources via OKBC. This way OntoMap will become part of the semantic web, i.e. machine-understandable rather than just human-readable.
Chapter
IoT devices now come in all shapes and forms. IoT is everywhere, from our mobile devices to our cars. These devices help to perform various tasks, from providing locations for navigation to detecting a heartbeat inside a locked car so that parents or pet owners can be notified that they have left their child or pet inside. The latter example is not possible with IoT devices alone; they need algorithms and systems to detect these heartbeats, and this is facilitated by Artificial Intelligence (AI). We will be working with the Artificial Intelligence of Things (AIoT), a combination of Artificial Intelligence and IoT.
Thesis
Full-text available
In the software testing process a large amount of information is required and generated. This information can be stored as knowledge that needs to be managed and maintained using the principles of knowledge management. Ontologies can act as a bridge by representing this testing knowledge in an accessible and understandable way. The purpose of this master's thesis is to develop a top domain ontology (TDO) which represents general software testing knowledge. This can be achieved by unifying the domain vocabularies used in software testing. This top domain ontology can be used to link existing software testing ontologies; it can act as an interface between top-level and domain ontologies and guide the development of new software testing ontologies. After careful consideration, the ISTQB standards were used as the main source of knowledge; other sources, such as existing software testing ontologies, were also used to develop the ontology. The available ontologies for software testing were collected and evaluated against a list of evaluation criteria. The study shows that the available software testing ontologies do not fulfill the purpose of a TDO. In this work, we developed a TDO by using a combination of two ontology development methods: Ontology 101 and Methontology. The resources used for gaining knowledge and reusing concepts from available ontologies made it possible for this TDO to have better coverage of the field of software testing. The ontology was evaluated using two methods: competency questions and ontology experts' evaluation. The evaluation based on competency questions focuses on the structure of the ontology and shows that the ontology is well formed and delivers the expected results. The evaluation by ontology experts was done against a set of quality criteria representing the quality and coverage of the ontology. The results show that the developed ontology can be used as a TDO after addressing some comments from the evaluators.
The evaluators agree that the ontology can be adapted to different applications of software testing and that it fulfils the main purpose of a top domain ontology. The developed ontology could be improved by evaluating and reusing the ontologies that have not been published (e.g. STOWS). Ontology maintenance is an ongoing process: the ontology needs to be updated with the new knowledge of software testing that emerges from research.
Chapter
Characterizing, structuring and systematizing all the knowledge assets of an organization represents a major challenge nowadays. At the same time, rapid social, economic and technological changes require organizations to act and adapt quickly. In order to meet all these expectations, organizations must implement comprehensive knowledge management solutions that enable multiple ways of using and reusing organizational knowledge. This chapter provides a complete description of how the STUDIO knowledge-based system can support organizations in applying and evaluating knowledge, in learning, in adapting changes to their own context quickly, and in translating learning into action. STUDIO is an extensible and domain-independent knowledge-based system that captures the relevant domain concepts and their relations in ontological entities, around which a set of knowledge- and human resource management-related tasks are carried out. To evaluate the proposed architecture we have applied it to the challenge of managing knowledge in both business and educational contexts.
Article
The focus of traditional evaluations of ontologies is largely performance-based. A comparison of a new ontology with well-established ones, the testing of ontologies in different applications, as well as any judgment of an ontology's appropriateness and relatedness to source data, rely heavily on the results that ontology appears to produce. This study, on the other hand, is an attempt to evaluate the quality of a particular ontology as manifested by its structure, representation, and interoperability. To that end, major categories of quality evaluations were first identified through an extensive survey of the literature. Evaluation questions were formulated from these categories using the Delphi method and were validated by ontology experts. The entire process produced a set of 53 evaluation questions, which was then employed to test the quality of a newly developed smartphone ontology.
Article
Full-text available
There are many challenges associated with the design and realization of fast-changing, highly customized products. One promising approach is to implement design for manufacturing (DFM) strategies aimed at reducing production costs without compromising product quality. For manufacturers doing business in a globally distributed marketplace, effective reuse and sharing of DFM knowledge in a collaborative environment is essential. In recent years, ontologies have increasingly been used for knowledge management in engineering. Here, an ontology is defined as a formal specification of domain knowledge that can be used to define a set of data and structure that enables experts to share information in a domain of interest, to aid information reasoning, and to manage and reuse data. The primary goal of this paper is to put forward the process of ontology development and utilization for DFM and to study the most important phases in the process, including concept categorization and class hierarchy development, slot categorization and development, identification and realization of relations among slots, and methods to support knowledge capture and reuse. Four cases are presented to illustrate the promising use of a DFM ontology. These cases show that the DFM ontology and the process of ontology development and utilization for DFM can facilitate the reuse of existing data, find inconsistencies and errors in data, reduce the work associated with populating the knowledge base of the ontology, and help designers make decisions by considering complex technical and economic criteria.
Article
Ontologies are an important component in many areas, such as knowledge management and organization, electronic commerce and information retrieval and extraction. Several methodologies for ontology building have been proposed. In this article, we provide an overview of ontology building. We start by characterizing the ontology building process and its life cycle. We present the most representative methodologies for building ontologies from scratch, and the proposed techniques, guidelines and methods to help in the construction task. We analyze and compare these methodologies. We describe current research issues in ontology reuse. Finally, we discuss the current trends in ontology building and its future challenges, namely, the new issues for building ontologies for the Semantic Web.
Conference Paper
Full-text available
Currently the evaluation of the feasibility of general-purpose ontologies and upper-level models is expensive mostly because of technical problems such as different representation formalisms and terminologies used. Additionally, there are no formal mappings between the upper-level ontologies that could ease any kind of studies and comparisons. We present the OntoMap Project (http://www.OntoMap.org), a project with the pragmatic goal to facilitate the access, understanding, and reuse of such resources. A semantic framework on the conceptual level is implemented that is small and easy enough to be learned on-the-fly. We tried to design the framework so that it captures most of the semantics usually encoded in upper-level models. Technically, OntoMap is a web-site providing access to several upper-level ontologies and manual mapping between them.
Article
Ontologies and ontology-based information systems are becoming more commonplace in knowledge management. For engineering applications such as product design, ontologies can be utilised for knowledge capture/reuse and for frameworks that allow for the integration and collaboration of a wide variety of tools and methods, as well as participants in design (marketing/sales, engineers, customers, suppliers, distributors, manufacturing, etc.) who may be distributed globally across time, location, and culture. With this growth in the use of ontologies, it is critical to recognise and address errors that may occur in their representation, maintenance and utilisation. Passing undetected and unresolved errors downstream can cause an error avalanche and could diminish the acceptance, further development and promise of significant impact that ontologies hold for product design, manufacturing, or any knowledge management environment within an organisation. This paper categorises errors and their causal factors, summarises possible solutions in ontology and ontology-based utilisation, and puts forward an ontology-based Root Cause Analysis (RCA) method to help find the root cause of errors. Error identification and collection methods are described first, followed by an error taxonomy with associated causal factors. Finally, an error ontology and associated SWRL (Semantic Web Rule Language) rules are built to facilitate the error taxonomy, the root cause analysis and solution analysis for these errors. Ultimately, this work should reduce errors in the development, maintenance and utilisation of ontologies and facilitate their further development and use in knowledge management.
Article
Full-text available
This paper describes the development and testing of the Medical Concept Mapper, a tool designed to facilitate access to online medical information sources by providing users with appropriate medical search terms for their personal queries. Our system is valuable for patients whose knowledge of medical vocabularies is inadequate to find the desired information, and for medical experts who search for information outside their field of expertise. The Medical Concept Mapper maps synonyms and semantically related concepts to a user's query. The system is unique because it integrates our natural language processing tool, i.e., the Arizona (AZ) Noun Phraser, with human-created ontologies, the Unified Medical Language System (UMLS) and WordNet, and our computer generated Concept Space, into one system. Our unique contribution results from combining the UMLS Semantic Net with Concept Space in our deep semantic parsing (DSP) algorithm. This algorithm establishes a medical query context based on the UMLS Semantic Net, which allows Concept Space terms to be filtered so as to isolate related terms relevant to the query. We performed two user studies in which Medical Concept Mapper terms were compared against human experts' terms. We conclude that the AZ Noun Phraser is well suited to extract medical phrases from user queries, that WordNet is not well suited to provide strictly medical synonyms, that the UMLS Metathesaurus is well suited to provide medical synonyms, and that Concept Space is well suited to provide related medical terms, especially when these terms are limited by our DSP algorithm.
Article
Full-text available
In this paper we present a logical framework for the TOVE Enterprise Model. We first review our process for engineering an ontology. We then describe our logical framework, which is based on Reiter's solution of the frame problem [Reiter 91] and Pinto's formalization of occurrence and the incorporation of time within the situation calculus [Pinto & Reiter 93]. We then provide an ontology for activity and extensions for resource spoilage. This is followed by example queries that the ontology supports.
Article
Full-text available
This paper describes the methodology used in the Enterprise Integration Laboratory for the design and evaluation of integrated ontologies, including the proposal of new ontologies and the extension of existing ontologies (see Figure 1). We illustrate these ideas with examples from our activity and organisation ontologies.
Article
We address the task of enabling naive users in a practical context to define, comprehend and use knowledge bases for representing part-whole information. This work is part of a larger effort whose target users were ecologists who had little experience in mathematics, computing, and artificial intelligence, but who wished to build computer simulation models of ecological systems. The ecological domain has a rich variety of part-whole information. This includes individuals, populations and sub-populations, as well as composite entities. We note the special requirements deriving from the need to satisfy naive users and show how various existing approaches are insufficient. We describe a novel representation, based on the typed lambda calculus which covers the above range of part-whole relationships in a flexible, uniform framework. We emphasise the role of the typed lambda calculus in particular, and more generally, how a careful description of the ontology founding our representation can be used to guide users in creating accurate, transparent knowledge bases, which in turn facilitates reuse and sharing.
Article
Philosophers have spent 25 centuries debating ontological categories. Their insights are directly applicable to the analysis, design, and specification of the ontologies used in knowledge-based systems. This paper surveys some of the ontological questions that arise in artificial intelligence, some answers that have been proposed by various philosophers, and an application of the philosophical analysis to the clarification of some current issues in AI. Two philosophers who have developed the most complete systems of categories are Charles Sanders Peirce and Alfred North Whitehead. Their analyses suggest a basic structure of categories that can provide some guidelines for the design of AI systems.
Article
This paper is intended to serve as a comprehensive introduction to the emerging field concerned with the design and use of ontologies. We observe that disparate backgrounds, languages, tools, and techniques are a major barrier to effective communication among people, organisations, and/or software systems. We show how the development and implementation of an explicit account of a shared understanding (i.e. an `ontology') in a given subject area can improve such communication, which in turn can give rise to greater reuse and sharing, inter-operability, and more reliable software. After motivating their need, we clarify just what ontologies are and what purposes they serve. We outline a methodology for developing and evaluating ontologies, first discussing informal techniques concerning such issues as scoping, handling ambiguity, reaching agreement and producing definitions. We then consider the benefits of, and describe, a more formal approach. We re-visit the scoping phase, and discuss the role of formal languages and techniques in the specification, implementation and evaluation of ontologies. Finally, we review the state of the art and practice in this emerging field, considering various case studies, software tools for ontology development, key research issues and future prospects.
The enterprise toolset - an open enterprise architecture
  • J Fraser
  • A Tate
  • M Uschold
Fraser, J., A. Tate, and M. Uschold. The enterprise toolset - an open enterprise architecture. In The Impact of Ontologies on Reuse, Interoperability and Distributed Processing, pages 42-50. Unicom Seminars, London, 1995. Further information about the Enterprise Project and Ontology is available on the World Wide Web from: http://www.aiai.ed.ac.uk/entprise/enterprise/.
Towards a methodology for building ontologies
  • M Uschold
  • M King
M. Uschold and M. King. Towards a methodology for building ontologies. In Workshop on Basic Ontological Issues in Knowledge Sharing, International Joint Conference on Artificial Intelligence, 1995. Also available as AIAI-TR-183 from AIAI, The University of Edinburgh.
Methodology for the design and evaluation of ontologies
  • M Gruninger
  • M S Fox
M. Gruninger and M.S. Fox. Methodology for the design and evaluation of ontologies. In Workshop on Basic Ontological Issues in Knowledge Sharing, International Joint Conference on Artificial Intelligence, 1995.