Fakultät für Informatik's research while affiliated with Karlsruhe Institute of Technology and other places

What is this page?


This page lists works of an author who doesn't have a ResearchGate profile or hasn't added the works to their profile yet. It is automatically generated from public (personal) data to further our legitimate goal of comprehensive and accurate scientific recordkeeping. If you are this author and want this page removed, please let us know.

Publications (16)


Universität Karlsruhe
  • Article

December 2001 · 9 Reads

A Poor Man's · Interner Bericht · Fakultät für Informatik · [...]

The Prolog program

    term_expansion((define C as A with B), (C =? A :- B, !)).
    term_expansion((transition E if C then D),
                   ((transition E) :- C, !, B, A, (transition _))) :-
        serialize(D, B, A).
    serialize((E, F), (C, D), (A, B)) :-
        serialize(E, C, B), serialize(F, D, A).
    serialize(F := G, ([G] =?* [E], F =.. [C|D], D =?* B, A =.. [C|B]),
              asserta(A =? E)).
    [G|H] =?* [E|F] :- (G = E ; G =.. [C|D], D =?* B, A =.. [C|B], A =? E),
        !, H =?* F.
    [] =?* [].
    A =? B :- [A, B] =?* [D, C], D == C.

implements a virtual machine for evolving algebras. It offers an efficient and very flexible framework for their simulation. Computation models and specification methods seem to be worlds apart. The evolving algebra project started as an attempt to bridge the gap by improving on Turing's thesis. (Gurevich, 1994)
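The quoted program realizes the standard evolving-algebra execution model: a state maps locations to values, and all updates of an enabled transition fire simultaneously. As a language-neutral illustration, here is a minimal sketch of that model in Python; the rule representation and names are our own, not the paper's.

```python
# Minimal sketch of an evolving-algebra (abstract state machine) step:
# a state maps locations to values; a transition rule is a guard plus a
# set of updates; all updates of an enabled rule fire simultaneously.
# (Illustrative representation, not the paper's Prolog encoding.)

def step(state, rules):
    """Apply the first enabled rule. Updates are computed against the
    old state and committed together, so they are simultaneous."""
    for guard, updates in rules:
        if guard(state):
            new_values = {loc: f(state) for loc, f in updates.items()}
            next_state = dict(state)
            next_state.update(new_values)
            return next_state
    return state  # no rule enabled: the algebra has reached a fixpoint

# Example: swapping a and b requires exactly this simultaneity.
rules = [(lambda s: s["a"] != s["b"],
          {"a": lambda s: s["b"], "b": lambda s: s["a"]})]
s = step({"a": 1, "b": 2}, rules)
# s == {"a": 2, "b": 1}
```

Note that because updates are evaluated before any of them is committed, the swap works without a temporary location.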


Consistency Driven Planning
  • Article
  • Full-text available

December 2001 · 53 Reads

This paper describes a novel approach to planning. The presented algorithm is based on a consistency-maintaining procedure for computing possible worlds out of given worlds and applications of operators. Worlds are represented by facts, rules, and consistency constraints. In order to avoid the frame and qualification problems we state neither frame axioms nor qualification axioms; instead, general consistency constraints are used. As a result, executing an action that asserts its postconditions to the current state of the world may result in inconsistency. A repair mechanism then generates possible changes (repairs) to the inconsistent world such that the resulting world describes the actual consistent state of affairs, i.e. a possible world after the execution of the action. The generated repairs serve two purposes: since they describe possible worlds in which the action's postconditions hold, they can be used to reason about these worlds and to eliminate those which have undesired properties. Moreover, the repairs are used to guide the linear planner such that the generated plans lead to the selected possible world.
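The assert-then-repair cycle described above can be sketched as follows. The representation is illustrative, not the paper's: a world is a set of ground facts, a consistency constraint is a predicate over worlds, and a repair is a set of old facts to retract.

```python
# Hedged sketch of consistency-driven action execution: assert the
# postconditions, then restore consistency by retracting minimal sets
# of old facts, yielding the possible worlds after the action.

from itertools import combinations

def violated(world, constraints):
    return [c for c in constraints if not c(world)]

def apply_action(world, postconditions, constraints):
    """Assert the action's postconditions; if the result is
    inconsistent, generate all minimal repairs (possible worlds).
    The postconditions themselves are protected from retraction."""
    world = world | postconditions
    if not violated(world, constraints):
        return [world]
    candidates = sorted(world - postconditions)
    for k in range(1, len(candidates) + 1):
        worlds = [world - set(r) for r in combinations(candidates, k)
                  if not violated(world - set(r), constraints)]
        if worlds:
            return worlds      # smallest repairs only
    return []                  # action not executable consistently

# Constraint: a door cannot be open and locked at the same time.
constraints = [lambda w: not {"door_open", "door_locked"} <= w]
possible = apply_action({"door_locked"}, {"door_open"}, constraints)
# one possible world: the lock must have been released
```

Instead of a frame axiom for every unaffected fact, the single constraint above drives the repair that retracts `door_locked`.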


An Experimental Study on the Complexity of

May 2001 · 5 Reads

Not only in deductive databases, logic programming, and constraint satisfaction problems but also in object bases where each single dot in a path expression corresponds to a join, the optimizer is faced with the problem of ordering large numbers of joins. This might explain the renewed interest in the join ordering problem. Although many join ordering techniques have been invented and benchmarked over the last years, little is known on the actual effectiveness of the developed methods and the cases where they are bound to fail. The problem attacked is the discovery of parameters and their qualitative influence on the complexity of single problem instances and on the effectiveness of join ordering techniques including search procedures, heuristics, and probabilistic algorithms. Thus an extensive analysis of the search space is carried out, with particular emphasis on the existence of phase transitions in this space and on the influence the parameters have on these transitions.
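To see why the join ordering problem has such a large and uneven search space, consider a toy cost computation under the simple C_out model (cost = sum of intermediate result sizes). All cardinalities and selectivities below are made up for illustration.

```python
# Toy illustration: two orders of the same three joins differ by an
# order of magnitude under C_out. Figures are invented, not from the
# paper's experiments.

card = {"A": 10, "B": 1000, "C": 1000}
sel = {frozenset("AB"): 0.01,          # very selective join predicate
       frozenset("BC"): 0.5,
       frozenset("AC"): 1.0}           # no predicate: cross product

def cost(order):
    """Cost of a left-deep plan over 'order' under C_out."""
    joined, size, total = {order[0]}, card[order[0]], 0.0
    for r in order[1:]:
        s = 1.0
        for j in joined:               # apply all connecting predicates
            s *= sel[frozenset({j, r})]
        size *= card[r] * s
        joined.add(r)
        total += size
    return total

good = cost(("A", "B", "C"))           # selective join first
bad = cost(("B", "C", "A"))            # unselective join first
```

With n relations there are n! left-deep orders alone, and as the sketch shows, their costs are far from uniform, which is exactly what makes the parameters governing this distribution worth studying.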


Optimizing Boolean Expressions in Object Bases

December 1999 · 124 Reads · 13 Citations

In this paper we address the problem of optimizing the evaluation of boolean expressions in the context of object-oriented data modelling. We develop a new heuristic for optimizing the evaluation sequence of a boolean expression based on selectivity and cost estimates of the terms constituting the expression. The quality and efficiency of the heuristic are evaluated in a quantitative analysis which compares it with the optimal, but infeasible, algorithm and with other known methods. Deriving the inputs of the heuristic, i.e., the selectivity and cost estimates, is then addressed. We use an adaptation of well-known sampling techniques for estimating the selectivity of terms. Cost estimation is much more complex than in the relational context due to the possibility of invoking functions within a boolean expression. We develop the decapsulation met...
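Ordering conjuncts by selectivity and cost can be sketched with the classical rank metric (selectivity − 1) / cost, which puts cheap, selective terms first so the conjunction short-circuits early. Whether this is exactly the paper's heuristic is not claimed; the estimate figures below are illustrative.

```python
# Sketch: order the conjuncts of a boolean expression by the rank
# metric (selectivity - 1) / cost, ascending. Cheap and selective
# terms come first. Estimates are invented for illustration.

def order_conjuncts(terms):
    """terms: list of (name, selectivity, cost) estimate triples."""
    return sorted(terms, key=lambda t: (t[1] - 1.0) / t[2])

terms = [
    ("calls_method", 0.50, 40.0),   # expensive: invokes a function
    ("attr_equals",  0.10,  1.0),   # cheap attribute comparison
    ("range_check",  0.30,  2.0),
]
plan = [name for name, _, _ in order_conjuncts(terms)]
# plan: attr_equals, range_check, calls_method
```

The expensive function call ends up last even though it is fairly selective, which is the object-base-specific effect the abstract alludes to: per-term cost can dominate once methods may be invoked inside a predicate.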


A Blackboard Architecture for Query Optimization in Object Bases

December 1999 · 46 Reads · 3 Citations

Adopting the blackboard architecture from the area of Artificial Intelligence, we propose a novel kind of optimizer that enables two desirable ideas. Firstly, with such a well-structured approach, backpropagation of the optimized queries allows an evolutionary improvement of (crucial) parts of the optimizer. Secondly, an A* strategy can be applied to harmonize two contrary properties: alternatives are generated whenever necessary, while straightforward optimization is performed whenever possible.
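The control loop of such an optimizer can be sketched as follows: knowledge sources post alternative rewritings onto a shared blackboard, and a best-first scheduler always expands the most promising alternative, so alternatives proliferate only where the estimate is ambiguous. Everything here, including the toy instance, is illustrative rather than the paper's design.

```python
# Minimal blackboard control loop with best-first (A*-style)
# scheduling over posted alternatives. Illustrative sketch only.

import heapq

def optimize(initial, sources, is_final, estimate, budget=1000):
    blackboard = [(estimate(initial), 0, initial)]
    tie = 1                                 # tie-breaker for the heap
    while blackboard and budget > 0:
        budget -= 1
        _, _, query = heapq.heappop(blackboard)
        if is_final(query):
            return query
        for source in sources:              # each knowledge source may fire
            for alt in source(query):       # ...and post its alternatives
                heapq.heappush(blackboard, (estimate(alt), tie, alt))
                tie += 1
    return None

# Toy instance: "queries" are integer cost figures; knowledge sources
# halve or decrement them, and cost 1 counts as fully optimized.
sources = [lambda q: [q // 2] if q % 2 == 0 else [],
           lambda q: [q - 1] if q > 1 else []]
best = optimize(10, sources, lambda q: q == 1, lambda q: q)
```

Because the scheduler always pops the cheapest candidate, straightforward improvement chains are followed directly, and the blackboard fills with alternatives only when several rewritings look equally promising.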


Figure 1: Architecture of a CORBA-integrated federated information system
The Network as a Global Database: Challenges of Interoperability, Proactivity, Interactiveness, Legacy

July 1997 · 29 Reads · 5 Citations

The current integrated developments in networking and computing give rise to a technical infrastructure for the information society which one may variously circumscribe by terms such as ubiquitous computing, telepresence, and the network as one giant global database. The paper applies to the network the metaphor of a global database, and subsumes the aspects of ubiquity and telepresence under it. It should then be possible to preserve many existing database techniques and to concentrate on adjusting them to the network information infrastructure. The paper explores four challenges for adjustment: interoperability due to heterogeneous data repositories, proactivity due to the autonomy of data sources, interactiveness due to the need for short-term and task-specific interaction and cooperation, and legacy due to the fitting of old systems to the networked environment. Based on several application projects and exemplary solutions, the paper reports as its experience that object-orientation pr...


Efficient Consistency Control in Deductive Databases

June 1997 · 3 Reads · 1 Citation

In this paper a theoretical framework for efficiently checking the consistency of deductive databases is provided and proven correct. Our method focusses on the relevant parts of the database by reasoning forwards from the updates of a transaction, and uses this knowledge about real or merely possible implicit updates to simplify the consistency constraints in question. In contrast to the algorithms by Kowalski/Sadri and Lloyd/Topor, we are neither committed to determining the exact set of implicit updates nor to determining a fairly large superset of it by considering only the head literals of deductive rule clauses. Rather, our algorithm unifies these two approaches by allowing any of the above, or even intermediate, strategies to be chosen at each step of forward reasoning. This flexibility makes it possible to integrate statistical data and knowledge about access paths into the checking process. Second, deductive rules are organized into a graph to avoid searching f...
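The forward-reasoning idea can be sketched at the predicate level: from the predicates touched by a transaction, follow deductive rules (body → head) to the set of possibly affected derived predicates, and check only the constraints that mention one of them. The representation and predicate names below are illustrative, not the paper's.

```python
# Sketch: propagate updates forward through deductive rules to find
# which constraints can possibly be violated. Names are illustrative.

def affected(updated, rules):
    """rules: list of (body_predicates, head_predicate) pairs."""
    reached = set(updated)
    changed = True
    while changed:                       # fixpoint over the rule graph
        changed = False
        for body, head in rules:
            if head not in reached and reached & set(body):
                reached.add(head)
                changed = True
    return reached

def constraints_to_check(updated, rules, constraints):
    """constraints: list of (name, predicates_mentioned) pairs."""
    aff = affected(updated, rules)
    return [name for name, preds in constraints if aff & set(preds)]

rules = [({"employee"}, "manager_of"), ({"manager_of"}, "overloaded")]
constraints = [("c1", {"overloaded"}), ("c2", {"department"})]
to_check = constraints_to_check({"employee"}, rules, constraints)
# updating 'employee' can only affect c1, so c2 is skipped
```

This coarse predicate-level propagation corresponds to the "large superset" end of the spectrum; the paper's contribution is precisely that each forward step may instead be refined toward the exact implicit updates when that pays off.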


On the Notion of Concept

June 1997 · 394 Reads · 2 Citations

The notion of concept is central to the notion of data model in databases. Its purpose is to provide construction mechanisms for organizing a universe of data into manageable segments with a well-defined structure. However, so far the notion lacks rigor with a concomitant danger of inconsistent, overlapping or redundant use of concepts. The paper takes the approach that rigor is achieved by expressing the intuitive semantics within a formal logic. In doing so, the notion of concept may also provide a mechanism for structuring sets of facts, rules, and constraints in general deductive databases. In the paper a formal semantics for concept definitions is given by showing that each concept definition can be interpreted as a mapping from one database state to another. A number of examples show the usefulness of the concept definition. Especially, the basics of object-orientation are modeled as concepts. The paper also examines some difficult issues arising in connection with the concepts i...


Cooperation in Object Bases through Alliances

June 1997 · 6 Reads

Interaction of objects in today's object-oriented database systems is based on four premises: First, messages are procedures, which are executed instantaneously. Second, interaction is limited to two objects, a client and a server. Third, communication contexts are defined algorithmically by procedures --- or at top level --- by transactions. Fourth, all objects are globally accessible no matter where they physically reside in a computer network (location transparency). We argue in this paper that this paradigm of object interaction does not suffice for large-scale cooperative and distributed applications such as computer-integrated manufacturing (CIM), distributed artificial intelligence (DAI), or office automation which are characterized, on the one hand, by autonomously acting agents which allow for the necessary high degree of specialization to meet local computing requirements and, on the other hand, by the need for coordination among these agents. As a remedy we propose a strict ...


A Framework for Predictions: Driving Forces

June 1997 · 12 Reads

The thesis of this paper is that one such framework could be provided by the forces that drive the current efforts in database research, product management, and application, and that will do so in the foreseeable future. We claim that there are three main driving forces: applications that have only recently discovered the benefits of database technology; new or evolving base technologies upon which database technology rests, or from which it can profit in order to overcome current weaknesses; and the growing number of standards that constrain developments to certain directions but at the same time promote the integration of heterogeneous technologies and products. The remainder of this paper is organized as follows. Next, in Chapter 2, the three main driving forces are identified in more detail. Then, in Chapters 3 and 4 we examine past and present database developments with the intention of identifying their respective driving forces. Projecting the results of our analyses into the future is the subject of Chapter 5. Chapter 6 concludes this investigation with a critical assessment of our forecasts, in particular by addressing the potential impact of some other, less obvious forces aside from the three main driving forces identified earlier.


Citations (1)


... Theia's query design draws upon results from research in relational databases [6,10,11]. However, unlike queries in relational databases that are textual, queries in Theia are XML data structures and photo-processing code objects. ...

Reference:

Opportunistic Content Search of Smartphone Photos
Optimizing Boolean Expressions in Object Bases