Article

Ontological Theory for Ontological Engineering


Abstract

Software application ontologies have the potential to become the keystone in state-of-the-art information management techniques. It is expected that these ontologies will support the sort of reasoning power required to navigate large and complex terminologies correctly and efficiently. Yet, there is one problem in particular that continues to stand in our way. As these terminological structures increase in size and complexity, and the drive to integrate them inevitably swells, it is clear that the level of consistency required for such navigation will become correspondingly difficult to maintain. While descriptive semantic representations are certainly a necessary component to any adequate ontology-based system, so long as ontology engineers rely solely on semantic information, without a sound ontological theory informing their modeling decisions, this goal will surely remain out of reach. In this paper we describe how Language and Computing nv (L&C), along with The Institute for Formal Ontology and Medical Information Sciences (IFOMIS), are working towards developing and implementing just such a theory, combining the open software architecture of L&C's LinkSuite™ with the philosophical rigor of IFOMIS's Basic Formal Ontology. In this way we aim to move beyond the more or less simple controlled vocabularies that have dominated the industry to date.


... Formal Representation of Knowledge is about building real-world models of a certain domain or problem, and it enables reasoning and automatic interpretation (Fielding et al., 2004). These formal models, called ontologies, can be used to offer formal semantics (i.e. machine-interpretable concepts) for every kind of information: databases, catalogs, documents and web pages. ...
... In general, we refer to an ontology as a graph which, according to Fielding et al. (2004), consists of:
- a set of concepts (graph vertices)
- a set of relations that connect concepts (graph edges)
- a set of instances associated with the concepts
Formally speaking, an ontology is determined by the tuple O(C, T, R, A, I, V): the sets of Concepts (C), Types (T), Relations (R), Attributes (A), Instances (I) and Values (V). ...
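To make the tuple view above concrete, the following Python sketch shows one possible in-memory encoding of O(C, T, R, A, I, V), read as a graph with concepts as vertices and relations as labeled edges. It is only an illustration under the definitions quoted above; the class and field names are invented here and do not come from the cited works.

```python
from dataclasses import dataclass, field

@dataclass
class Ontology:
    """Minimal sketch of the tuple O(C, T, R, A, I, V):
    concepts, types, relations, attributes, instances, values."""
    concepts: set = field(default_factory=set)     # C: graph vertices
    types: dict = field(default_factory=dict)      # T: concept -> type label
    relations: set = field(default_factory=set)    # R: (concept, relation_name, concept) edges
    attributes: dict = field(default_factory=dict) # A: concept -> list of attribute names
    instances: dict = field(default_factory=dict)  # I: instance -> concept it belongs to
    values: dict = field(default_factory=dict)     # V: (instance, attribute) -> value

# A toy ontology read as a graph: concepts are vertices, relations are labeled edges.
onto = Ontology()
onto.concepts.update({"Person", "Company"})
onto.relations.add(("Person", "works_for", "Company"))
onto.attributes["Person"] = ["name"]
onto.instances["alice"] = "Person"
onto.values[("alice", "name")] = "Alice"

# Neighbours of a concept in the relation graph.
neighbours = {c2 for (c1, rel, c2) in onto.relations if c1 == "Person"}
print(neighbours)  # {'Company'}
```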
Article
Full-text available
Formal Representation of Knowledge deals with the construction of real-world models of a certain domain, which enables automatic reasoning and interpretation. These formal models, also called ontologies, are used to offer formal semantics (forms interpretable by machines) for all kinds of information. Ontology building in computer science is tightly connected to its philosophical and logical concepts. Organizing objects into categories is a very important part of knowledge representation. Even though we interact with the world through individual objects, most reasoning is done over categories of objects. Category formation is an intellectual matter even though its objects are not pieces of the intellectual world; in fact, these objects stand at the top of the perceptual world in such a way that they appear detached from their real structures. At this practical level it is important to note that the moment of object categorization is a purely formal process. For a business company these categories must reflect the business's concepts and rules, its logic, and the conventions between the business itself and the organizations it cooperates with. In this paper we build a basis for the formalization of the categorical system. These formalizations are contemporary tools which Albanian businesses must embrace in order to function properly. DOI: 10.5901/ajis.2014.v3n1p427
... Although recent research initiatives such as [Fielding et al. 2004] and [Guizzardi 2006] have elaborated on why domain ontologies must be represented with the support of a foundational theory, such an approach has still not been broadly adopted. As reported by Jones et al. (1998) and Wache et al. (2001), most existing methodologies do not emphasize, or even completely ignore, this aspect. We believe that ...
... For instance, [Guizzardi 2006] illustrates examples of semantic interoperability problems that can pass undetected when interoperating lightweight ontologies. Likewise, Fielding et al. (2004) discuss how a principled foundational ontology can be used to spot inconsistencies and provide solutions for problems in lightweight biomedical ontologies. As a final example, the need for methodological support in establishing precise meaning agreements is recognized in the Harvard Business Review report of October 2001, which claims that "one of the main reasons that so many online market makers have foundered [is that] the transactions they had viewed as simple and routine actually involved many subtle distinctions in terminology and meaning". ...
Article
Full-text available
Despite the fact that many authors in the literature defend the need of ontologically well-founded languages for ontology representation, this approach has not yet been broadly adopted. We present in this paper a codification of a well-founded heart-ECG domain ontology in OWL+SWRL. The lightweight ontology produced is then applied to a web environment for heart electrophysiology reasoning and visualization. We also reflect on this codification process to argue in favor of the view that two classes of languages are needed for ontology engineering: (i) a theoretically well-founded representation language for creating conceptual domain ontologies; (ii) a lightweight representation language for codifying these conceptual ontologies.
... Such a language must explicitly commit to fundamental ontological distinctions in its metamodel, given by a foundational ontology [6, 7]. Although recent research initiatives, such as [8, 9], have elaborated on the importance of foundational ontologies in the development of domain ontologies, such an approach has still not been broadly adopted. Concerning the domain of software measurement, there are quite a few initiatives committed to ontology-based modeling and formalization of this domain, among them [10, 11]. ...
... Although several researchers argue in favor of using a foundational ontology as a basis for developing domain ontologies, among them Guarino [19], Guizzardi and colleagues [9] and Fielding and colleagues [8], few works have explored this use. This is the case in the software measurement domain, in which the proposed ontologies are, in general, lightweight ontologies that are not grounded in foundational ontologies. ...
Conference Paper
Full-text available
Software measurement is a relatively young discipline. As a consequence, it is not yet well defined, and the terminology used is diverse. In order to establish a basic conceptualization of this domain, in this paper we present a Software Measurement Ontology, developed grounded in the Unified Foundational Ontology.
... As pointed out by several works, such as [5,6], domain ontologies should ideally be developed grounded in foundational (top-level) ontologies. Concepts and relations in a domain ontology must first be analyzed in the light of a foundational ontology. ...
Article
Full-text available
This paper presents the new version of SABiO - A Systematic Approach for Building Ontologies. SABiO focuses on the development of domain ontologies and also proposes support processes. SABiO distinguishes between reference and operational ontologies, providing activities that apply to the development of both types of domain ontologies.
... For instance, [27] illustrates examples of semantic interoperability problems that can pass undetected when interoperating lightweight ontologies. Likewise, [28] discusses how a principled foundational ontology can be used to spot inconsistencies and provide solutions for problems in lightweight biomedical ontologies. As a final example, the need for methodological support in establishing precise meaning agreements is recognized in the Harvard Business Review report of October 2001, which claims that "one of the main reasons that so many online market makers have foundered [is that] the transactions they had viewed as simple and routine actually involved many subtle distinctions in terminology and meaning". ...
Conference Paper
Full-text available
In philosophy, the term ontology has been used since the 17th century to refer both to a philosophical discipline (Ontology with a capital "O") and to a domain-independent system of categories that can be used in the conceptualization of domain-specific scientific theories. In the past decades there has been a growing interest in the subject of ontology in computer and information sciences. In the last few years, this interest has expanded considerably in the context of the Semantic Web and MDA (Model-Driven Architecture) research efforts, and due to the role ontologies are perceived to play in these initiatives. In this paper, we explore the relations between Ontology and ontologies in the philosophical sense with domain ontologies in computer science. Moreover, we elaborate on formal characterizations for the notions of ontology, conceptualization and metamodel, as well as on the relations between these notions. Additionally, we discuss a set of criteria that a modeling language should meet in order to be considered a suitable language to model phenomena in a given domain, and present a systematic framework for language evaluation and design. Furthermore, we argue for the importance of ontology, in both philosophical senses aforementioned, for designing and evaluating a suitable general ontology representation language, and we address the question whether the so-called Ontology Web languages can be considered suitable general ontology representation languages. Finally, we motivate the need for two complementary classes of modeling languages in Ontology Engineering addressing two separate sets of concerns.
... A part of the Measurement Goals sub-ontology was also presented. Although several researchers argue in favor of using a foundational ontology as a basis for developing domain ontologies [12, 22, 23], few works have explored this use. This is the case in the software measurement domain, in which the proposed ontologies are, in general, lightweight ontologies. ...
Conference Paper
Full-text available
Organizations define strategies and establish business goals aiming to be competitive. Process performance analysis supports the monitoring of goals, allowing threats to goal achievement to be detected and treated. In this context, measurement is essential. The data collected for measures are used to analyze process performance and to guide informed decisions that lead to the achievement of business and technical goals. For software organizations, process performance analysis is a high-maturity practice. Although there are several standards that address the importance of software measurement and its use in process performance analysis, the vocabulary these standards use concerning software measurement is diverse. In order to establish a conceptualization of this domain, we developed a Software Measurement Ontology (SMO), grounded in the Unified Foundational Ontology. In this paper, we present a fragment of SMO focusing on software process behavior analysis.
... The use of foundational concepts that take truly ontological issues seriously is becoming more and more accepted in the ontological engineering literature, i.e., in order to represent a complex domain, one should rely on engineering tools such as design patterns, computational environments, modeling languages and methodologies that are based on well-founded ontological theories in the philosophical sense (e.g., (Burek, 2006;Fielding, 2004)). Especially in a domain with complex concepts, relations and constraints, and with potentially serious risks which could be caused by interoperability problems, a supporting ontology engineering approach should be able to: (a) allow the conceptual modelers and domain experts to be explicit regarding their ontological commitments, which in turn enables them to expose subtle distinctions between models to be integrated and to minimize the chances of running into a False Agreement Problem (Guarino, 1998); (b) support the user in justifying their modeling choices and providing a sound design rationale for choosing how the elements in the universe of discourse should be modeled in terms of language elements. ...
Article
Full-text available
Ontologies are commonly used in computer science either as a reference model to support semantic interoperability, or as an artifact that should be efficiently represented to support tractable automated reasoning. This duality poses a tradeoff between expressivity and computational tractability that should be addressed in different phases of an ontology engineering process. The inadequate choice of a modeling language, disregarding the goal of each ontology engineering phase, can lead to serious problems in the deployment of the resulting model. This article discusses these issues by making use of an industrial case study in the domain of Oil and Gas. The authors make the differences between two different representations in this domain explicit, and highlight a number of concepts and ideas that were implicit in an original OWL-DL model and that became explicit by applying the methodological directives underlying an ontologically well-founded modeling language.
... Ontological engineering shall remain outside our scope. The popularity of ontological engineering has led to an abundance of 'ontologies', some of which were not measured against any particular criterion of acceptance [7]. In contrast, Smith [18] demands that not just any collection of 'objects and relations' is by itself an adequate answer to Q2 ("the ontologist's credo"). ...
Article
Full-text available
As a first step in the larger project of charting the ontology of computer programs, we pose three central questions: (1) Can programs, hardware, and metaprograms be organized into a meaningful taxonomy? (2) To what ontology are computer programs committed? (3) What explains the proliferation of programming languages, and how do they come about? Taking the complementary perspectives of software engineering and mathematical logic, we take inventory of programs and related objects and conclude that the notions of abstraction and concretization play a central role in this investigation.
Conference Paper
The field of information systems analysis and design includes numerous information modeling methods and notations (e.g. ER, ORM, UML, DFDs, Petri Nets) that are typically evolving. Even with some attempts to standardize (e.g. UML for object-oriented design), new modeling methods are constantly being introduced, many of which differ only marginally from existing approaches. These ongoing changes significantly impact the way information systems are analyzed and designed in practice. This workshop focuses on exploring, evaluating, and enhancing current information modeling methods and methodologies. Though the need for such studies is well recognized, there is a paucity of such research in the literature. The objective of EMMSAD'05 is to provide a forum for researchers and practitioners interested in modeling methods in systems analysis and design to meet and exchange research ideas and results. EMMSAD'05 is the tenth in a very successful series of EMMSAD workshops, previously held in Crete, Barcelona, Pisa, Heidelberg, Stockholm, Interlaken, Toronto, Velden, and Riga. To mark the tenth anniversary of the workshop, this year the workshop includes an invited keynote address by Prof. Janis Bubenko Jr. that reflects on historical trends in information modeling. EMMSAD'05 is jointly sponsored by the Conference on Advanced Information Systems Engineering (CAiSE), the International Federation for Information Processing Working Group 8.1 (IFIP WG 8.1), the Network of Excellence for Interoperability Research for Networked Enterprises Applications and Software (INTEROP), and the Association for Information Systems Special Interest Group on Systems Analysis and Design (AIS-SIGSAND). This year we had 36 submissions from all over the globe. After an extensive review process by a distinguished international program committee, with each paper receiving three or more reviews, we accepted the 21 papers that appear, together with an abstract of the keynote address, in these proceedings. Congratulations to the successful authors!
Article
Full-text available
In recent years, there has been a growing interest in the development and use of domain ontologies, strongly motivated by the Semantic Web initiative. However, as we demonstrate in this paper, an approach for ontology representation uniquely based on the modeling languages adopted in the Semantic Web is insufficient to address a number of semantic interoperability problems that arise in concrete application scenarios. The main objective of this paper is to advocate in favor of an approach for conceptual modeling, in general, and domain ontology representation, in particular, in which lightweight modeling languages such as OWL and standard UML are complemented by modeling languages and methodologies based on theoretically principled Foundational Ontologies.
Article
Full-text available
The paper describes an exemplary ontological engineering process based on the conceptualization of a spatial domain. A pragmatic view of ontologies is briefly introduced, whereas the focus is on the stepwise engineering process. An example is chosen that results from experiences gained during a lecture, including exercises, on semantics of geoinformation. The aim is to describe the main steps during the engineering process. The diffusion of methods that support ontological engineering is important due to the fact that ontologies will be used increasingly to support the access to spatial data sources and information sharing in future, especially within the framework of the development of spatial data infrastructures.
Chapter
Full-text available
Ontologies are commonly used in computer science either as a reference model to support semantic interoperability, or as an artifact that should be efficiently represented to support tractable automated reasoning. This duality poses a tradeoff between expressivity and computational tractability that should be addressed in different phases of an ontology engineering process. The inadequate choice of a modeling language, disregarding the goal of each ontology engineering phase, can lead to serious problems in the deployment of the resulting model. This article discusses these issues by making use of an industrial case study in the domain of Oil and Gas. We make explicit the differences between two different representations in this domain, and highlight a number of concepts and ideas that were implicit in an original OWL-DL model and that became explicit by applying the methodological directives underlying an ontologically well-founded modeling language.
Conference Paper
The integration of information resources in the life sciences is one of the most challenging problems facing bioinformatics today. We describe how Language and Computing nv, originally a developer of ontology-based natural language understanding systems for the healthcare domain, is developing a framework for the integration of structured data with unstructured information contained in natural language texts. L&C's LinkSuite™ combines the flexibility of a modular software architecture with an ontology based on rigorous philosophical and logical principles that is designed to comprehend the basic formal relationships that structure both reality and the ways humans perceive and communicate about reality.
Conference Paper
Providing eGovernment solutions is becoming a matter of great importance for governments all over the world. In order to meet the special requirements of this sort of project, several attempts have been and are currently being developed. This paper proposes its own approach, which takes advantage of resources derived from the use of semantics and of an artifact, discussed in depth in the paper, defined as a LifeEvent. On the basis of these premises, an entire software platform is described and a prototype developed, as shown in the paper. Some conclusions and hints for future projects in this scope are also provided.
Conference Paper
Full-text available
Development of web-based applications needs tools that make application development more effective by using and re-using business rules as well as web services. These tools should incorporate facilities for capturing different semantic aspects of an application. This paper presents a new conceptual and technological framework for using a rule language and rule engine to capture semantics in modern web-based systems. The framework covers two aspects of semantics in web-based systems: business rules and web service composition logic. The technology consists of two main parts: the application server Xstone for creating 3-layered systems and the RqlGandalf rule solver. The middleware server Xstone connects to Oracle and PostgreSQL databases and to the RqlGandalf rule system. The RqlGandalf rule system is targeted at two different tasks: defining and using business logic rules, and rule-based synthesis of complex queries over web services. The presented rule-based system development technology is implemented for the Linux platform as open source software.
Conference Paper
Full-text available
The paper describes a concept-map-based multiagent system that has been developed for learners' knowledge assessment and self-assessment in process-oriented learning. The architecture of the system in terms of modules, their functions and their interaction is presented. Special attention is given to the intelligent assessment agent, which at the moment is composed of communication, knowledge evaluation, interaction registering, and expert agents. The paper also discusses a novel approach to adaptive knowledge assessment using concept maps.
Conference Paper
Full-text available
The analysis of events ordered over time and the discovery of significant hidden relationships in this temporal data is becoming a concern of the information society. Using temporal data as raw temporal sequences, without any preprocessing, fails to find key features of these data. Therefore, before applying mining techniques, an appropriate representation of temporal sequences is needed. Our representation of time series can be used in different fields, such as aviation science and earth science, and can also be applied, for instance, to Temporal Web Mining (TWM) [1], [2], [3], [4]. Our representation of time series aims at improving the ability to specify and find important occurrences. In our new concept, we use data band ranges and areas in order to determine the importance, or weight, of a segment. Based on the closeness of a segment to a data band range, this representation of time series can help to find significant events. This paper focuses on our representation of time series.
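The weighting scheme itself is not spelled out in the abstract above; as a loose sketch of the general idea, one simple reading of "closeness to a data band range" is the fraction of a segment's values that fall inside the band. The function below is an invented illustration under that assumption, not the paper's actual representation.

```python
def segment_weight(segment, band_low, band_high):
    """Toy weight for a segment: fraction of its values that fall inside
    the data band [band_low, band_high]. This is only one plausible
    reading of 'closeness to a data band range'."""
    inside = sum(1 for v in segment if band_low <= v <= band_high)
    return inside / len(segment) if segment else 0.0

series = [1.2, 1.9, 2.4, 2.6, 3.8, 4.1, 2.2, 2.5]
# Split the series into fixed-length segments and weight each one.
segments = [series[i:i + 4] for i in range(0, len(series), 4)]
weights = [segment_weight(s, band_low=2.0, band_high=3.0) for s in segments]
print(weights)  # [0.5, 0.5]
```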
Conference Paper
Full-text available
This paper explores the challenges of constructing an architecture for inter-organisational collaborative interactions based on Service Oriented Architecture (SOA), Web services choreographies and software agents. We present an approach to harmonising the "global" or neutral definition of business collaborations with partner-specific implementations, which can differ in terms of platform, environment, implementation technology, etc. By introducing the concept of pluggable business service handlers into our architecture we draw on the work carried out by the ebXML initiative, business service interfaces in particular. Due to the increasing need for better management of collaborative interactions, Virtual Organisations (VO) become an important tool for the creation and maintenance of federated trust domains among collaboration partners. We look into the ability of software agents to serve as the background support mechanism for the automation and management of the Virtual Organisation lifecycle.
Conference Paper
Full-text available
Aspect-oriented development has become one of the most intensively investigated themes in software development. In this paper, a method is proposed for reconfigurable modeling of aspect-oriented information systems, in which < > and < > concerns may be represented separately and combined in different ways without changing their models or implementation. < > concerns are consistently represented throughout the development process, starting from < > use cases through to crosscutting interfaces and templates for tailoring aspects to specific contexts. Examples from the IT-Europe project are given, where aspect-oriented concepts were used for modeling the behavior of software agents performing self-management functionality of the IT Knowledge Portal. The work is supported by the Lithuanian State Science and Studies Foundation under the Eureka programme project "IT-Europe" (Reg. No 3473).
Conference Paper
Full-text available
The two-hemisphere model driven (2HMD) approach assumes modeling and use of procedural and conceptual knowledge on an equal and related basis, according to the principles of Model Driven Architecture (MDA), which separates different aspects of system modeling. This differentiates the 2HMD approach from purely procedural, purely conceptual, and object-oriented approaches. The approach may be applied in the context of modeling a particular business domain as well as in the context of modeling knowledge about the domain. Therefore, the principles of MDA via the 2HMD approach may be applied not only in the context of software development but also in the context of study course and program development. Knowledge modeling by the 2HMD approach gives an opportunity to transparently analyze and compare the knowledge to be provided and the knowledge actually provided by courses belonging to a particular study program, and thus to identify and fill gaps between the desirable and actual knowledge content of the study program.
Conference Paper
Full-text available
Fragmentation and allocation are database distribution design techniques used to improve system performance by increasing data localisation and reducing data transportation costs between different network sites. Fragmentation and allocation are often considered separately, disregarding the fact that they use the same input information to achieve the same objective. Vertical fragmentation is often considered a complicated problem, because the huge number of alternatives makes it nearly impossible to obtain an optimal solution. Therefore, many researchers seek heuristic solutions, among which affinity-based vertical fragmentation approaches form a main stream in the literature. However, using attribute affinities to perform fragmentation cannot really reflect the local needs for data at each site, so it is not guaranteed that remote data transportation costs will be reduced. This paper addresses vertical fragmentation and allocation simultaneously in the context of the relational data model. The core of the paper is a heuristic approach to vertical fragmentation, which uses a cost model and is targeted at globally minimising these costs. Further, based on the proposed vertical fragmentation, an integrated methodology is proposed that applies vertical and horizontal fragmentation simultaneously to produce mixed fragmentation schemata.
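To illustrate the locality idea behind cost-driven vertical fragmentation, the sketch below uses a deliberately simplified stand-in for a cost model: each non-key attribute is placed at the site that accesses it most often, and every fragment keeps the key so the relation can be reconstructed by joins. This is an invented toy heuristic, not the cost model proposed in the paper.

```python
def fragment_vertically(access_freq, key_attrs):
    """Toy vertical fragmentation: access_freq[site][attr] gives how often a
    site reads an attribute. Each non-key attribute goes to the site that
    uses it most; every fragment keeps the key attributes so the relation
    can be reconstructed by joins. (Simplified stand-in for a cost model.)"""
    fragments = {site: set(key_attrs) for site in access_freq}
    attrs = {a for freqs in access_freq.values() for a in freqs}
    for attr in attrs - set(key_attrs):
        best_site = max(access_freq, key=lambda s: access_freq[s].get(attr, 0))
        fragments[best_site].add(attr)
    return fragments

access_freq = {
    "site1": {"name": 80, "salary": 5, "dept": 40},
    "site2": {"name": 10, "salary": 90, "dept": 15},
}
print(fragment_vertically(access_freq, key_attrs=["emp_id"]))
# e.g. {'site1': {'emp_id', 'name', 'dept'}, 'site2': {'emp_id', 'salary'}} (set order may vary)
```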
Conference Paper
Full-text available
Web services are one of the most popular ways of building distributed Web Information Systems (WIS). In this paper we propose a distributed implementation of the Hera methodology, a methodology based on models that specify the different aspects of a WIS. The Web services used in the implementation are responsible for capturing the required data transformations built around specific Hera models. A service orchestrator coordinates the different Web services so that the required WIS presentation is built. Based on the degree of support for user interaction with the system, two architectures are identified: one for the construction of static applications, and another for the building of dynamic applications.
Conference Paper
Full-text available
Workflow time management provides predictive features to forecast potential deadline violations and proactive strategies to speed up late processes. Existing time management approaches assume that communication with external processes or services is conducted synchronously. This is not the case with inter-organizational processes, which very frequently communicate in an asynchronous manner. Therefore we examine diverse asynchronous communication patterns, show how to map them onto an interval-based time model, and describe their application to inter-organizational workflow environments.
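As a rough illustration of what an interval-based time model buys in the asynchronous setting, the sketch below forecasts deadline violations when the reply to an asynchronous request may arrive anywhere within a duration interval. The function and its parameters are invented for illustration; they do not reproduce the communication patterns or the model of the paper.

```python
def forecast_violation(now, remaining_work, response_interval, deadline):
    """Interval-based check for an asynchronous interaction:
    remaining_work    - time the local process still needs after the reply,
    response_interval - (earliest, latest) delay until the async reply arrives,
    deadline          - absolute deadline for the whole process.
    Returns 'violation' if even the best case misses the deadline,
    'at_risk' if only the worst case misses it, otherwise 'ok'."""
    earliest_finish = now + response_interval[0] + remaining_work
    latest_finish = now + response_interval[1] + remaining_work
    if earliest_finish > deadline:
        return "violation"
    if latest_finish > deadline:
        return "at_risk"
    return "ok"

print(forecast_violation(now=0, remaining_work=5, response_interval=(2, 10), deadline=12))
# 'at_risk': best case finishes at 7, worst case at 15
```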
Thesis
Full-text available
In this thesis, we aim at contributing to the theory of conceptual modeling and ontology representation. Our main objective is to provide ontological foundations for the most fundamental concepts in conceptual modeling. These foundations comprise a number of ontological theories, which build on established work in philosophical ontology, cognitive psychology, philosophy of language and linguistics. Together these theories amount to a system of categories and formal relations known as a foundational ontology.
Article
Full-text available
There are increasing efforts directed at providing formal frameworks to consolidate the widening net of terms and relations used in medical practice. While there are many reasons for this, the need for standardisation of protocol and terminology is critical, not only for the provision of uniform levels of health care, but also to facilitate medical science research. In the domain of breast cancer pathology, a summary of current practice by the World Health Organisation states that the variability of the evidence archive (inconsistencies in describing microscopic appearances of phenomena, different diagnostic thresholds for working pathologists) is chief among the barriers to the medical understanding of the symptoms and development of early cancers. Such variability is acknowledged across specialist fields of medicine, motivating standardisation of terminologies for reporting medical practice. The desideratum of making these standards machine-readable has led to their formalisation as ontologies. Ontologies are computational artefacts designed to provide representations of a domain of interest. Thus, the representation must be a formal description so that it can be encoded, and reused, allowing navigation of the key concepts recorded and retrieval of information indexed against it. This brings the required standardisation by offering a set of labelling options to record observations and events encountered by medical professionals. Given the twin goals of ontologies -- representation and standardisation -- this paper will consider the key question of their design in the context of the use by experts, of information handling applications built around them. We build on our experience in developing ontologies for decision support software in the area of breast cancer diagnosis and treatment. We will also examine, from this perspective, the suggestion offered in the literature that a set of metaphysically motivated questions should form the basis of ontology building as guarantors of fidelity to reality. We find that ontologies intended to support medical practice can only be understood within the context of their intended use. The declarative framework within which they are encoded generates the hope that their meaning transcends the specific application context. We show, however, that these declarative statements are to be understood as end products of chains of procedural engagements between humans, materials and communitarian norms. It is only when this scaffolding that brings this representation into existence becomes routine and consensual (within the community that exchanges information indexed against it) that the concepts stand in for physiological states with independent dynamics. However, as the state of biomedical knowledge is always in a state of flux, and different institutions and practitioners may be out of sync with respect to such modifications, the concepts embedded in the ontologies are constantly subject to reinterpretation within the context of specific institutional practices. Given the fragmentation of the patient’s body when viewed through various specialised lenses, ontologies can provide placeholders for co-ordinating disparate viewpoints to provide suitable medical interventions. The extent to which such interventions reflect any underlying reality, as manifest in measures of their efficacy, is closely wrapped up in the regulatory apparatus of protocol-guided consensus making. 
The value of ontologies lies in their reflection of, and support for, the sense-making activities that constitute expertise, not in their transparent access to a metaphysical reality.
Article
Full-text available
Successful biomedical data mining and information extraction require a complete picture of biological phenomena such as genes, biological processes and diseases as these exist on different levels of granularity. To realize this goal, several freely available heterogeneous databases as well as proprietary structured datasets have to be integrated into a single global customizable scheme. We will present a tool to integrate different biological data sources by mapping them to a proprietary biomedical ontology that has been developed for the purposes of making computers understand medical natural language.
Article
As the use of geographical information systems develops apace, a significant strand of research activity is being directed to the fundamental nature of geographic information. This volume contains a collection of essays and discussions on this theme. What is geographic information? What fundamental principles are associated with it? How can it be represented? How does it represent the world? How can geographic information be quantified? How can it be communicated and related to the other information sciences? How does HCI tie in with it? A number of other more specific but relevant issues are considered, such as Spatio-temporal relationships, boundaries, granularity and taxonomy.
Article
We propose a modular ontology of the dynamic features of reality. This amounts, on the one hand, to a purely spatial ontology supporting snapshot views of the world at successive instants of time and, on the other hand, to a purely spatiotemporal ontology of change and process. We argue that dynamic spatial ontology must combine these two distinct types of inventory of the entities and relationships in reality, and we provide characterizations of spatiotemporal reasoning in the light of the interconnections between them.
Conference Paper
We describe the ontology engineering processes and their supporting technologies at L&C, a company developing intelligent medical applications based on ontologies. We describe the principal tasks that the modellers of our ontology have to execute, how they are supported and guided by some specifically ontology-focused management practices, and how (semi-)automated technology can also aid in their support and guidance, so as to produce a higher quality and quantity of ontology product. The ontology processes include the development of new structures of concepts and relations, the integration of other ontologies and terminologies, the integration of the ontology to natural language applications, and the reforming of the current ontology’s formal structure. The automated supports we talk about include OntoClean, a principled methodology for analyzing ontological properties and their constraints. We finally note how far we think our ontology technology comes to some proposed desiderata recently given for “enterprise standard” ontology environments.
Conference Paper
Ontology mapping is seen as a solution provider in today's landscape of ontology research. As the number of ontologies that are made publicly available and accessible on the Web increases steadily, so does the need for applications to use them. A single ontology is no longer enough to support the tasks envisaged by a distributed environment like the Semantic Web. Multiple ontologies need to be accessed from several applications. Mapping could provide a common layer from which several ontologies could be accessed and hence could exchange information in semantically sound manners. Developing such mappings has been the focus of a variety of works originating from diverse communities over a number of years. In this article we comprehensively review and present these works. We also provide insights on the pragmatics of ontology mapping and elaborate on a theoretical approach for defining ontology mapping.
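One minimal, commonly used reading of an ontology mapping is a set of correspondences between the entities of two ontologies that should respect their subsumption hierarchies. The Python sketch below encodes that generic adequacy check; it is an illustration of the general notion, not the theoretical approach elaborated in the article, and all names are invented.

```python
def is_order_preserving(mapping, subsumes1, subsumes2):
    """Check that a concept mapping between two ontologies respects
    subsumption: whenever a is-a b holds in ontology 1, the images should
    satisfy mapping[a] is-a mapping[b] in ontology 2. A common, minimal
    adequacy condition discussed in the mapping literature."""
    for a, b in subsumes1:                       # pairs (child, parent) in ontology 1
        if a in mapping and b in mapping:
            if (mapping[a], mapping[b]) not in subsumes2:
                return False
    return True

subsumes1 = {("Car", "Vehicle")}
subsumes2 = {("Automobile", "MeansOfTransport")}
mapping = {"Car": "Automobile", "Vehicle": "MeansOfTransport"}
print(is_order_preserving(mapping, subsumes1, subsumes2))  # True
```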
Conference Paper
This report discusses a suitable and working solution for the semantic integration of dispersed medical relational databases by coupling the databases to a medical ontology. We will demonstrate how this can be done by means of a case study, and how the coupling result can be deployed to query a relational database at the ontology level. In addition, we will introduce a coupling language and discuss how to integrate the language into two ontology models. Keywords: medical ontology, medical relational databases, database integration, semantic integration, ontology tools, coupling language
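The coupling language itself is not reproduced here; purely to illustrate what "querying a relational database at the ontology level" can mean, the sketch below maps ontology concepts and properties to tables and columns and rewrites a concept-level query into SQL. All names (tables, concepts, properties) are invented.

```python
# Hypothetical coupling: ontology concepts/properties -> relational schema.
concept_to_table = {"Patient": "patients"}
property_to_column = {("Patient", "hasDiagnosis"): "diagnosis_code"}

def ontology_query_to_sql(concept, prop, value):
    """Rewrite a concept-level query 'all instances of <concept> whose
    <prop> equals <value>' into SQL against the coupled database.
    A toy rewriter, not the coupling language described in the report."""
    table = concept_to_table[concept]
    column = property_to_column[(concept, prop)]
    return f"SELECT * FROM {table} WHERE {column} = '{value}'"

print(ontology_query_to_sql("Patient", "hasDiagnosis", "C50.9"))
# SELECT * FROM patients WHERE diagnosis_code = 'C50.9'
```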
Article
The paper is a contribution to formal ontology. It seeks to use topological means in order to derive ontological laws pertaining to the boundaries and interiors of wholes, to relations of contact and connectedness, to the concepts of surface, point, neighbourhood, and so on. The basis of the theory is mereology, the formal theory of part and whole, a theory which is shown to have a number of advantages, for ontological purposes, over standard treatments of topology in set-theoretic terms. One central goal of the paper is to provide a rigorous formulation of Brentano's thesis to the effect that a boundary can exist as a matter of necessity only as part of a whole of higher dimension of which it is the boundary. It concludes with a brief survey of current applications of mereotopology in areas such as natural-language analysis, geographic information systems, machine vision, naive physics, and database and knowledge engineering.
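For orientation, a textbook-style core of the mereological basis mentioned above can be written as follows. This is a generic sketch, not the paper's axiom system, and the last formula is only an informal rendering of the Brentano thesis, which the paper states in terms of wholes of higher dimension.

```latex
% Parthood P as a partial order (core mereology):
\forall x\, P(x,x)
\forall x \forall y\,\bigl(P(x,y) \wedge P(y,x) \rightarrow x = y\bigr)
\forall x \forall y \forall z\,\bigl(P(x,y) \wedge P(y,z) \rightarrow P(x,z)\bigr)

% Overlap defined from parthood:
O(x,y) \;\equiv\; \exists z\,\bigl(P(z,x) \wedge P(z,y)\bigr)

% Informal rendering of the Brentano thesis on boundaries:
% a boundary exists only as part of some whole that is not itself a boundary.
\forall x\,\bigl(\mathrm{Bd}(x) \rightarrow \exists y\,(P(x,y) \wedge \neg\mathrm{Bd}(y))\bigr)
```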
Article
The rapidly increasing wealth of genomic data has driven the development of tools to assist in the task of representing and processing information about genes, their products and their functions. One of the most important of these tools is the Gene Ontology (GO), which is being developed in tandem with work on a variety of bioinformatics databases. An examination of the structure of GO, however, reveals a number of problems, which we believe can be resolved by taking account of certain organizing principles drawn from philosophical ontology. We shall explore the results of applying such principles to GO with a view to improving GO's consistency and coherence and thus its future applicability in the automated processing of biological data.
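The Gene Ontology organizes its terms in parent-child hierarchies in which a term may have several parents, so the structure is a directed acyclic graph rather than a tree. The sketch below shows such a term graph and an ancestor query; the term names and parent links are simplified for illustration and are not taken verbatim from GO.

```python
# Toy fragment of a GO-style term hierarchy: child term -> list of parent terms.
parents = {
    "mitochondrial membrane": ["membrane", "mitochondrial part"],
    "mitochondrial part": ["cell part"],
    "membrane": ["cell part"],
    "cell part": [],
}

def ancestors(term):
    """All terms reachable by following parent links upward; because a term
    may have several parents, the hierarchy is a DAG, not a tree."""
    seen = set()
    stack = list(parents.get(term, []))
    while stack:
        t = stack.pop()
        if t not in seen:
            seen.add(t)
            stack.extend(parents.get(t, []))
    return seen

print(ancestors("mitochondrial membrane"))
# {'membrane', 'mitochondrial part', 'cell part'}
```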
Article
INTRODUCTION: One of the most important tools for the representation and processing of information about gene products and functions is the Gene Ontology (GO). GO is being developed in tandem with work on a variety of biological databases within the framework of the umbrella project OBO (for: open biological ontologies). It provides a controlled vocabulary for the description of cellular components, molecular functions, and biological processes. Representatives from a number of groups working on model organism databases, including FlyBase (Drosophila), the Saccharomyces Genome Database (SGD) and the Mouse Genome Database (MGD), initiated the Gene Ontology project in 1998 in order to provide a common reference framework for the associated controlled vocabularies. As of June 19, 2003, GO contains 1297 component, 5396 function and 7290 process terms. The total number of GO informal term definitions is 11020. Terms are organized in parent-child hierarchies, indicating either that ...
Article
A granular partition is a way of dividing up, or classifying, or mapping a certain portion of reality. We characterize partitions at two levels: as systems of cells and subcells, and in terms of their relation to reality. We define a notion of well-formedness for partitions, and we give an account of what it means for a partition to project onto objects in reality. We continue by classifying partitions along three axes: (a) in terms of the degree of correspondence between partition cells and objects in reality; (b) in terms of the degree to which a partition represents the mereological structure of the domain it is projected onto; and (c) in terms of the degree of completeness with which a partition represents this domain. On this basis we define a notion of identity for partitions.
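As a loose computational reading of the cell/subcell structure and the projection relation described above (not the formal theory itself), a partition can be pictured as a cell tree plus a partial projection map from cells to objects; the cell names and objects below are invented.

```python
# A toy granular partition: cells arranged in a subcell tree, plus a
# partial projection from cells to objects in the modelled domain.
subcell_of = {                 # cell -> its immediate parent cell
    "Germany": "Europe",
    "France": "Europe",
    "Europe": "World",
}
projects_onto = {              # cell -> object in reality it targets
    "Germany": "the country Germany",
    "France": "the country France",
    "Europe": "the continent Europe",
    "World": "the planet Earth",
}

def transparent_cells():
    """Cells whose projection is defined; the degrees of correspondence and
    completeness discussed in the abstract can be read as how many cells
    project successfully and how much of the domain they jointly cover."""
    return [c for c in set(subcell_of) | set(subcell_of.values())
            if c in projects_onto]

print(sorted(transparent_cells()))  # ['Europe', 'France', 'Germany', 'World']
```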
Bittner T, Smith B. A Theory of Granular Partitions. In: Duckham M, Goodchild M, Worboys M, eds. Foundations of Geographic Information Science. London: Taylor & Francis Books, 2003: 117-151.

Smith B, Williams J, Schulze-Kremer S. The Ontology of the Gene Ontology. Proceedings of the AMIA Symposium 2003, forthcoming.

Smith B. Mereotopology: a theory of parts and boundaries. Data & Knowledge Engineering 1996; 20: 287-303.

Grenon P, Smith B. SNAP and SPAN: Towards dynamic spatial ontology. Forthcoming.

Kalfoglou Y, Schorlemmer M. Ontology Mapping: The State of the Art. The Knowledge Engineering Review 2003; 18(1).

Flett A, Dos Santos M, Ceusters W. Some Ontology Engineering Processes and their Supporting Technologies. EKAW 2002, Sigüenza, Spain, October 2002.