Article

Partial Inductive Definitions.

Authors: Lars Hallnäs

Abstract

An attempt to consider partial definitions of semantically oriented data types will be described. We will in a certain sense think of such data types as inductively defined. A class of inductive definitions will be interpreted as partial definitions: partial inductive definitions. The presentation of such a definition is in itself elementary, and the true complexity of the definition will show itself in questions concerning the isolation of totally defined objects. It is the same situation as in the case of partial recursive functions. The basic aim is to investigate the possibility of giving direct inductive definitions of semantical notions, exploring, so to speak, the structure of the given notion rather than thinking of such notions as indirectly presented by a formal system or given by a definition, together with a proof of its correctness, in terms of recursion on some well-founded structure.
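
A minimal illustrative sketch of what such a definition looks like, in generic clausal notation (the notation is assumed here for exposition and is not quoted from the paper): a partial inductive definition is a set of clauses whose defining conditions may themselves contain implications and need not be well-founded, and whether a defined object is totally defined is a further question, not something guaranteed by the form of the definition.

```latex
% A definition D given by clauses (illustrative notation):
%   a <= A_1, ..., A_n    "the atom a may be concluded from A_1, ..., A_n"
% where the conditions A_i may themselves contain implications (B -> c).

% A non-wellfounded clause, in analogy with a non-terminating recursive program:
p \Leftarrow (p \rightarrow \bot)
% Here p is defined, but arguably not *totally* defined: isolating a stable
% verdict on p is the kind of problem the abstract compares to deciding
% totality of a partial recursive function.
```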

... 5 At this point, an obvious question arises: why 1 Though a Tarskian consequence relation is usually understood as single-conclusioned, as we will be working in a multi-conclusion framework, we prefer this more general definition, which should more accurately be called "Scottian" than "Tarskian", due to the multi-conclusion schema. 2 Non-transitive approaches to logical consequence were previously discussed in many works, to which the authors refer in their papers. Some of these are due to Strawson (as referred to in [7,8,9,13,16,17,27,29,32,33,34,35]). Non-reflexive logics are discussed, for example, in [7,14,25,30]. ...
... A transparent truth predicate, plus a suitable mechanism to achieve self-reference, can be safely added to all of these logics, at least if there is no problematic logical constant around, such as a consistency/classicality operator. 16 Though we will not say more about this issue here, this happens because they are defined through Strong Kleene valuations (or models, or some other way to interpret predicates and names, as the addition of a truth predicate forces us to do). ...
... So far, then, we have provided a way to achieve philosophical inter- 16 As an anonymous referee has pointed out, this is not immediate. First, because these are propositional, and quantified versions of these logics haven't been discussed. ...
Article
We will present all the mixed and impure disjoint three-valued logics based on the Strong Kleene schema. Some, but not all of them, are (inferentially) empty logics, while one of them is trivial. We will compare them regarding their relative strength. We will also provide a recipe for building philosophical interpretations for each of these logics, and show why the kind of permeability that characterises them is not such a bad feature. Finally, we will present a three-side sequent system for most of these logics.
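
For orientation, the mixed schema referred to above can be rendered as follows; this is a schematic reconstruction using the usual strict and tolerant standards, not the paper's own definitions (in particular, its "impure" and "disjoint" variants are not shown).

```latex
% Strong Kleene valuations v : Form -> {1, 1/2, 0}.
% A formula is strictly satisfied if v(A) = 1, tolerantly satisfied if v(A) >= 1/2.
\Gamma \vDash_{XY} \Delta \;\iff\;
  \text{every valuation that } X\text{-satisfies all of } \Gamma
  \text{ also } Y\text{-satisfies some member of } \Delta,
  \qquad X, Y \in \{\mathrm{s}, \mathrm{t}\}.
% E.g. ts-consequence fails reflexivity (A does not ts-entail A when A can take
% value 1/2), while st-consequence validates every classically valid inference.
```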
... 11 Now it is only natural to extend this proof-theoretic procedure and equip the sequent system with left-introduction rules. This leads to the theory of definitional reflection, which was first proposed by Hallnäs and has been further developed and investigated by Lars Hallnäs and the author (Hallnäs 1991). More precisely, such generalized clauses can be given a declarative sense. ...
... For computational purposes they would have to be restricted to a language based on conjunction, implication and universal quantification. See Hallnäs & Schroeder-Heister (1990, 1991). Suppose a program of the form ...
... where, as in the previous section, the bar indicates that the major premiss aᵢ must occur in top position. For more details on the theory of definitional reflection I refer to Hallnäs (1991), and Schroeder-Heister (1992, 1994b). Here, I should only mention that one of the advantages of dealing with arbitrary atoms rather than logically compound formulas is that we can now treat non-wellfounded systems of clauses (programs). ...
Article
Logical calculi exhibit a certain asymmetry between assumptions and assertions. Whereas for assertions there are many specific axioms and/or rules available, for assumptions there is normally only the very unspecific rule which allows the introduction of an arbitrary formula as an assumption. I propose taking Gentzen's cut-free sequent calculus as a model, which, given a certain reading of its left-introduction rules, permits the specific introduction of an assumption according to its meaning. This reading is extended with definitional reflection as a principle for the introduction of atomic assumptions. Although cut does not necessarily hold, especially when non-wellfounded (e.g. circular) constructions are used, it can be shown that cut and therefore transitivity of deduction can be re-established, when specific and unspecific assumptions are kept apart. This entire paper intends to demonstrate that the common concept of assumption, though lying at the heart of logic, has not yet received a fully satisfactory
... The idea of definitional reflection and the terms "definitional closure" and "definitional reflection" have been proposed by Hallnäs, who investigated this framework in the context of infinitary clauses [3, 4]. The handling of finitary clauses with variables is due to Hallnäs and Schroeder-Heister [5]. ...
... says that correctness of inference establishes consequence. For reasons to be explained later, we call, following Hallnäs [3], a definition D satisfying (11) total, and a definition D satisfying (12) complete (the latter in a sense somewhat different from Hallnäs's). A definition D, which is both total and complete, is called standard, as it underlies standard semantics. ...
... Consequence 14 Sometimes it is, for example in the case of wellfoundedness. 15 So we associate with "completeness" a slightly different meaning from that given to it by Hallnäs [3], who associated with it the fact that for every object a, either ⊨_D a or a ⊨_D ⊥ holds (with ⊥ being undefined). This fact is classically related to completeness in our sense. ...
Article
The hypothetical notion of consequence is normally understood as the transmission of a categorical notion from premisses to conclusion. In model-theoretic semantics this categorical notion is ‘truth’, in standard proof-theoretic semantics it is ‘canonical provability’. Three underlying dogmas, (I) the priority of the categorical over the hypothetical, (II) the transmission view of consequence, and (III) the identification of consequence and correctness of inference are criticized from an alternative view of proof-theoretic semantics. It is argued that consequence is a basic semantical concept which is directly governed by elementary reasoning principles such as definitional closure and definitional reflection, and not reduced to a categorical concept. This understanding of consequence makes it possible, in particular, to deal with non-wellfounded phenomena as they arise from circular definitions. Keywords: Consequence, Inference, Proof-theoretic semantics, Definitional reflection
... We initiated and partly organized five conferences with corresponding proceedings: ELP1989 in Tübingen (C1991b), ELP1991 in Stockholm (C1992), ELP1992 in Bologna (C1993), ELP1993 in St. Andrews (C1994) and ELP1996 in Leipzig (C1996). From the beginning of the 1990s Hallnäs and I have put more emphasis on definitional reasoning as a foundational approach that goes way beyond logic programming and is a general reasoning principle, as originally intended by Hallnäs (1991, 2006) (see my A1993, A1994b). Since it can be applied to any system of definitional rules, thus also to a rule defining an atom in terms of its own negation, it has applications to paradoxical reasoning, and to any kind of non-wellfounded definition. ...
... This had implications for my later work on paradoxes. Hallnäs had done his Ph.D. with Prawitz in 1983 on normalization in set theory (Hallnäs, 1983), in which he had been strongly involved in the proof-theoretic treatment of paradoxes. Our work on definitional reflection quickly grew into a friendship between us and our families, with many short and long visits to each other's homes. ...
Chapter
Full-text available
In this autobiographical sketch, which is followed by a bibliography of my writings, I try to relate my intellectual development to problems, ideas and results in proof-theoretic semantics on which I have worked and to which I have contributed.
... This is not at all to say, however, that equivocation is the only possible source of nontransitivity in consequence. Nontransitive consequence relations have been argued to arise from at least three phenomena other than conflation (and other than tonk; see footnote 11): relevance (see Lewy [15], Schroeder-Heister [27], Weir [32]). For each of these cases, it would be at least contentious to understand it as a form of conflation. ...
... At least if disjunctive syllogism is valid in an unconflated language, Camp's recommended approach fails to secure validity preservation. 15 ...
Article
I consider the phenomenon of conflation (treating distinct things as one) and develop logical tools for modeling it. These tools involve a purely consequence-theoretic treatment, independent of any proof or model theory, as well as a four-valued valuational treatment.
... Common to all these studies is that they focus on formal expressivity results, formal accounts of what classes of objects can be defined. In the words of Hallnäs (1991), these studies were primarily concerned with inductive definability, more than with inductive definitions. ...
... Most inductive definitions of functions are definitions over a well-founded order. Hallnäs (1991) defines and investigates a logic of inductive definitions of (partial) functions. In contrast, the logic that we define in this paper is intended to define sets. ...
Article
Full-text available
The definition is a common form of human expert knowledge, a building block of formal science and mathematics, and a foundation for database theory, and it is supported in various forms in many knowledge representation and formal specification languages and systems. This paper is a formal study of some of the most common forms of inductive definitions found in scientific text: monotone inductive definitions, definitions by induction over a well-founded order and iterated inductive definitions. We define a logic of definitions offering a uniform formal syntax to express definitions of the different sorts, and we define its semantics by a faithful formalization of the induction process. Several fundamental properties of definition by induction emerge: the non-determinism of the induction process, the confluence of induction processes, the role of the induction order and its relation to the inductive rules, how the induction order constrains the induction process and, ultimately, that the induction order is irrelevant: the defined set does not depend on the induction order. We propose an inductive construction capable of constructing the defined set without using the induction order. We investigate borderline definitions of the sort that appears in definitional paradoxes.
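
The induction process described here can be pictured, for the monotone case, as iterating a rule operator up to its least fixed point; the following is a standard textbook rendering given only as a sketch (the paper's own semantics is more general, covering induction orders and iterated definitions).

```latex
% A monotone definition D is a set of rules  a <- B  (B a finite set of facts).
% Its immediate-consequence operator and the set it defines:
\Gamma_D(S) = \{\, a \mid (a \leftarrow B) \in D,\; B \subseteq S \,\},
\qquad
\mathrm{Def}(D) = \mathrm{lfp}(\Gamma_D) = \bigcup_{n \ge 0} \Gamma_D^{\,n}(\emptyset).
% For monotone definitions, any fair order of applying the rules saturates to
% this same least fixed point, which is one way of reading the claim that the
% defined set does not depend on the induction order.
```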
... This yields a general bias towards positive forward reasoning, which is reflected in the primacy of forward-directed introductions (for a criticism of this approach see Schroeder-Heister, 2012a). To some extent this view is also implicit in the clause-based theory of definitional reflection (Hallnäs, 1991; Schroeder-Heister, 1993), as clauses are directed from bodies to heads, that is, from defining conditions towards defined atoms. The non-determinism in clauses, i.e., the fact that several clauses may define the same atom (which in logic we have, for example, with the introduction rules for disjunction) emphasizes this directedness. ...
... Definitional reflection adapts basic ideas concerning harmony and inversion from the logical realm to the realm of clausal definitions of atoms, inspired by a proof-theoretic interpretation of logic programming (Hallnäs, 1991; Schroeder-Heister, 1993, 2012b). In the simplest case, a definition is a finite list of clauses of the form b₁, . . . ...
Book
Full-text available
This volume includes fifteen research papers to celebrate Luiz Carlos Pereira's 60th birthday. Among the authors contributing to the volume we find colleagues, friends - including his PhD advisor - and admirers. Similar to Luiz Carlos Pereira's intellectual interests and work, the contributions range from Philosophy to Mathematics, from Mathematics to Logic, and from Logic to Philosophy, passing through Computer Science. They are the result of current research by well-known scholars in these fields. Proof Theory is, maybe, the Ariadne's thread that unites the different subjects treated. Questions around the nature of proofs are often present in Luiz Carlos' formal and informal talks and publications. Of course, he was not the first to dedicate himself to these questions, but he always raises and deals with them enthusiastically. This enthusiasm, together with his intellectual perspicacity, has allowed us to enjoy a wonderful journey into the world of proofs. Although not all contributions focus directly on proof theory, we can feel its echoes in all of them.
... Proof systems for logics with definitions or fixed points typically impose a syntactic stratification condition on the body of a recursive definition to make sure that its corresponding fixed point operator has a fixed point. Often such a restriction is too strong; for example, the logics of definitions in [9,4,20,12] all impose a strict stratification where no negative occurrence of a recursive predicate is allowed in the body of its definition. This strict stratification rules out the obvious kinds of inconsistent definitions, such as one that defines an atom to be its negation, e.g., p ≜ p ⊃ ⊥. ...
... There is a long series of works on logics of definitions, often extended with induction/co-induction proof rules [9,4,20,12,17,23,6,1]. All these works enforce a strict stratification that forbids negative occurrences of a recursive predicate in its definition. ...
Conference Paper
Proof systems for logics with recursive definitions typically impose a strict syntactic stratification on the body of a definition to ensure cut elimination and consistency of the logics, i.e., by forbidding any negative occurrences of the predicate being defined. Often such a restriction is too strong, as there are cases where such negative occurrences do not lead to inconsistency. Several logical frameworks based on logics of definitions have been used to mechanise reasoning about properties of operational semantics and type systems. However, some of the uses of these frameworks actually go beyond what is justified by their logical foundations, as they admit definitions which are not strictly stratified, e.g., in the formalisation of logical-relation type of arguments in typed λ-calculi. We consider here a more general notion of stratification, which allows one to admit some definitions that are not strictly stratified. We outline a novel technique to prove consistency and a partial cut elimination result, showing that every derivation can be transformed into a certain head normal form, by simulating its cut reductions in an infinitary proof system. We demonstrate this technique for a specific logic, but it can be extended to other richer logics.
... So far there is one definitional programming language, GCLA 3 [6,7,25]. In GCLA programs are regarded as belonging to a special class of definitions, the partial inductive definitions (PID) [15,26]. Apart from being a definitional language there is one feature that sets GCLA apart from the rest of the declarative languages discussed in this note, namely its approach to control. ...
... • Instead of the syntactically oriented approach in GCLA and in [15,26], a more abstract description of the notion of a definition should be used. A definition D is simply given by the following data: ...
Article
Full-text available
We discuss some approaches to declarative programming including functional programming, various logic programming languages and extensions, and definitional programming. In particular we discuss the programmer's need for, and possibilities of, influencing the control part of programs. We also discuss some problems of the definitional programming language GCLA and try to find directions for future research into definitional programming.
... It is then used as a general principle of definitional reasoning (cf. Hallnäs (1991), Hallnäs and Schroeder-Heister (1990/91) and Schroeder-Heister (1993)). Nonetheless, here we present it taking assertions or propositions as atoms, since our main interest is in definitions of logical constants, and each definitional clause can be interpreted as relating assertions. ...
... More general interpretations include partial inductive definitions, which allow for implications in the bodies of clauses and where definitions are not necessarily well-founded (cf. Hallnäs (1991)). Finally, the derivations of the sequents representing disjunction introduction rules are ...
Article
Full-text available
The inversion principle for logical rules expresses a relationship between introduction and elimination rules for logical constants. Hallnäs and Schroeder-Heister (1990/91) proposed the principle of definitional reflection, which embodies basic ideas of inversion in the more general context of clausal definitions. For the context of admissibility statements, this has been further elaborated by Schroeder-Heister (2007). Using the framework of definitional reflection and its admissibility interpretation, we show that, in the sequent calculus of minimal propositional logic, the left introduction rules are admissible when the right introduction rules are taken as the definitions of the logical constants and vice versa. This generalises the well-known relationship between introduction and elimination rules in natural deduction to the framework of the sequent calculus. 1. Inversion Principle. The idea of inverting logical rules can be found in a well-known remark by Gentzen: "The introductions are so to say the 'definitions' of the symbols concerned, and the eliminations are ultimately only consequences hereof, what can approximately be expressed as follows: In eliminating a symbol, the formula concerned – of which the outermost symbol is in question – may only 'be used as that what it means on the ground of the introduction of that symbol'." 1 The inversion principle itself was formulated by Lorenzen (1955) in the general context of rule-based systems and is thus not restricted to logical rules. It is based on the idea that if we have certain defining rules for some α, e.
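
A standard illustration of this inversion reading, sketched here for disjunction (the example is generic and not quoted from the paper): reading the right-introduction rules as the defining clauses of A ∨ B, definitional reflection over those clauses yields exactly the familiar left-introduction rule.

```latex
% Defining clauses (the right rules read as a definition of A v B):
A \lor B \Leftarrow A
\qquad
A \lor B \Leftarrow B

% Definitional reflection on these clauses: whatever follows from each defining
% condition follows from the defined formula itself, i.e. the usual left rule:
\frac{\Gamma, A \vdash C \qquad \Gamma, B \vdash C}{\Gamma, A \lor B \vdash C}
```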
... for any C such that B ⊆ C and any p ∈ A, if ϕ ⊩_C p and ψ ⊩_C p, then ⊩_C p. One justification for the clauses is the principle of definitional reflection (DR) (see Hallnäs [20,21] and Schroeder-Heister [51]): ...
Article
Full-text available
Proof-theoretic semantics (P-tS) is an innovative approach to grounding logical meaning in terms of proofs rather than traditional truth-conditional semantics. The point is not that one provides a proof system, but rather that one articulates meaning in terms of proofs and provability. To elucidate this paradigm shift, we commence with an introduction that contrasts the fundamental tenets of P-tS with the more prevalent model-theoretic approach to semantics. The contribution of this paper is a P-tS for a substructural logic, intuitionistic multiplicative linear logic (IMLL). Specifically, we meticulously examine and refine the established P-tS for intuitionistic propositional logic. Subsequently, we present two novel and comprehensive forms of P-tS for IMLL. Notably, the semantics for IMLL in this paper embodies its resource interpretation through its number-of-uses reading (restricted to atoms). This stands in contrast to the conventional model-theoretic semantics of the logic, underscoring the value that P-tS brings to substructural logics.
... for any C such that B ⊆ C and any p ∈ A, if ϕ ⊩_C p and ψ ⊩_C p, then ⊩_C p. One justification for the clauses is the principle of definitional reflection (DR) (see Hallnäs [14,15] and Schroeder-Heister [31]): ...
Chapter
Full-text available
This work is the first exploration of proof-theoretic semantics for a substructural logic. It focuses on the base-extension semantics (B-eS) for intuitionistic multiplicative linear logic (IMLL). The starting point is a review of Sandqvist's B-eS for intuitionistic propositional logic (IPL), for which we propose an alternative treatment of conjunction that takes the form of the generalized elimination rule for the connective. The resulting semantics is shown to be sound and complete. This motivates our main contribution, a B-eS for IMLL, in which the definitions of the logical constants all take the form of their elimination rule and for which soundness and completeness are established.
... Further investigation is also needed to clarify the exact relationship between the Prawitz-Tennant analysis of paradoxes based on normalization failure and the solution to paradoxes consisting in restricting the use of the cut rule in sequent calculus, a solution which goes back at least to Hallnäs (1991) and that has been recently brought up again by several authors, notably Ripley (2013). ...
Article
Full-text available
Developing early results of Prawitz, Tennant proposed a criterion for an expression to count as a paradox in the framework of Gentzen’s natural deduction: paradoxical expressions give rise to non-normalizing derivations. Two distinct kinds of cases, going back to Crabbé and Tennant, show that the criterion overgenerates, that is, there are derivations which are intuitively non-paradoxical but which fail to normalize. Tennant’s proposed solution consists in reformulating natural deduction elimination rules in general (or parallelized) form. Developing intuitions of Ekman we show that the adoption of general rules has the consequence of hiding redundancies within derivations. Once reductions to get rid of the hidden redundancies are devised, it is clear that the adoption of general elimination rules offers no remedy to the overgeneration of the Prawitz–Tennant analysis. In this way, we indirectly provide further support for a solution to one of the two overgeneration cases developed in previous work.
... Many consequence relations obey this condition (indeed, sometimes this is required as part of how 'consequence relation' is understood!); but in recent years, there has been increasing interest in systems that do not. (For examples, see Frankowski 2004; Hallnäs 1991; Hallnäs and Schroeder-Heister 1991; Tennant 1987; Weir 1998; Zardini 2008. Of these, only Zardini 2008 focuses on vagueness.) ...
Article
Full-text available
The principle of tolerance characteristic of vague predicates is sometimes presented as a soft rule, namely as a default which we can use in ordinary reasoning, but which requires care in order to avoid paradoxes. We focus on two ways in which the tolerance principle can be modeled in that spirit, using special consequence relations. The first approach relates tolerant reasoning to nontransitive reasoning; the second relates tolerant reasoning to nonmonotonic reasoning. We compare the two approaches and examine three specific consequence relations in relation to those, which we call: strict-to-tolerant entailment, pragmatic-to-tolerant entailment, and pragmatic-to-pragmatic entailment. The first two are nontransitive, whereas the latter two are nonmonotonic.
... Even if we note that there are no definitions having the closure properties stated in Sect. 2 above, there is still the possibility to read these definitions from a more strict intensional point of view. We then look at the closure conditions as clauses in two partial inductive definitions ([6, 7, 13]). The idea is basically to look at if . . ...
Chapter
Full-text available
In this paper we discuss a proof-theoretic foundation of set theory that focusses on set definitions in an open type free framework. The idea to make Cantor’s informal definition of the notion of a set more precise by saying that any given property defines a set seems to be in conflict with ordinary modes of reasoning. There is to some extent a confusion here between extensional perspectives (sets as collections of objects) and intensional perspectives (set theoretic definitions) that the central paradoxes build on. The solutions offered by Zermelo-Fraenkel set theories, von Neumann-Bernays set-class theories and type theories follow the strategy of retirement behind more or less safe boundaries. What if we revisit the original idea without making strong assumptions on closure properties of the theoretical notion of a set? That is, take the basic definitions for what they are without confusing the borders between intensional and extensional perspectives.
... This connects the proof theory of clausal definitions with theories of paradoxes, which conceive paradoxes as based on locally correct reasoning (Prawitz [44] (Appendix B), Tennant [68], Schroeder-Heister [62], Tranchini [71]). For the situation that obtains here, Hallnäs [28] proposed the terms 'total' vs. 'partial' in analogy with the terminology used in recursive function theory. That a computable (i.e., partial recursive) function is total is not something required by definition, but is a matter of (mathematical) fact, actually an undecidable matter. ...
Chapter
Full-text available
I present three open problems the discussion and solution of which I consider relevant for the further development of proof-theoretic semantics: (1) The nature of hypotheses and the problem of the appropriate format of proofs, (2) the problem of a satisfactory notion of proof-theoretic harmony, and (3) the problem of extending methods of proof-theoretic semantics beyond logic.
... The transitive closure TC_E of a graph E is defined as the set of pairs of nodes (x, y) such that there is a path from x to y in E. It is well-known that this relation cannot be expressed in general in first-order logic. In order to overcome this limitation, several authors have studied the concept of inductive definitions (Post, 1943; Spector, 1961; Kreisel, 1963; Feferman, 1970; Martin-Löf, 1971; Moschovakis, 1974a,b; Aczel, 1977; Buchholz et al., 1981; Hallnäs, 1991). Inductive definitions and related fixpoint constructs have been integrated with first-order logic in many languages such as µ-calculus (Kozen, 1983; Streett and Emerson, 1989), database query languages (Afanasiev et al., 2008; Bidoit and Ykhlef, 1998), description logics (Calvanese et al., 1999) and in the logic FO(ID) (Denecker and Ternovska, 2008). ...
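
As a small concrete illustration of such an inductive definition (an illustrative sketch only, not code from any of the systems cited above), the transitive closure of an edge relation is the least relation closed under the rules TC(x,y) ← E(x,y) and TC(x,y) ← E(x,z), TC(z,y), and it can be computed by saturating those rules:

```python
# Illustrative sketch: computing transitive closure as the least fixed point
# of its defining rules, by applying them until no new pairs are produced.

def transitive_closure(edges):
    tc = set(edges)                     # base rule: every edge is in TC
    changed = True
    while changed:                      # inductive rule, applied to saturation
        changed = False
        for (x, z) in edges:
            for (z2, y) in list(tc):
                if z == z2 and (x, y) not in tc:
                    tc.add((x, y))
                    changed = True
    return tc

print(transitive_closure({("a", "b"), ("b", "c"), ("c", "d")}))
# -> {('a','b'), ('b','c'), ('c','d'), ('a','c'), ('b','d'), ('a','d')}
```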
Thesis
Full-text available
In the field of knowledge representation and reasoning, many different logics are developed. Often, these logics exhibit striking similarities, either because they emerged from related ideas, or because they use similar underlying fundamental principles. Approximation fixpoint theory (AFT) is an abstract algebraical unifying framework that aims at exposing these principles by formalising them in lattice theory. It has been successfully applied to unify all common semantics of logic programs, autoepistemic logic, default logic, and more recently Dung's argumentation frameworks and abstract dialectical frameworks. In this dissertation, we extend approximation fixpoint theory to expose more underlying principles common to the aforementioned logics. In these domains, researchers have made use of a similar intuition: that facts (or models) can be derived from the ground up. They typically phrase this intuition by saying, e.g., that the facts should be grounded, or that they should not be unfounded, or that they should be supported by cycle-free arguments. In different domains, semantics that allow ungrounded models have received a lot of criticism. In logic programming for example, this was the case for Clark's completion semantics, which was later improved by perfect model semantics, stable semantics and well-founded semantics. In autoepistemic logic, a similar evolution happened: Moore's expansion semantics turned out to allow self-supporting models; this resulted in the development of many different semantics in attempts to get rid of this erroneous behaviour. In the first part of this dissertation, we formalise groundedness in approximation fixpoint theory. We study how groundedness relates to other concepts and fixpoints studied in AFT. We apply our abstract theory to the aforementioned domains: we show that our notion of groundedness indeed captures the intuitions that existed in these domains and study complexity of reasoning with grounded models. We study which existing semantics are grounded and which are not. For example, for logic programming, we find that Clark's completion semantics (indeed) is not grounded, while stable and well-founded semantics are grounded. We show that the well-founded model is not just any grounded model: it is the least precise partial grounded model. In the second part of this thesis we define a class of autoepistemic theories for which it is informally clear how to construct the intended model. Unfortunately, existing constructive semantics for autoepistemic logic, in particular, the well-founded semantics, fail to identify this model. In order to overcome this limitation, we propose, algebraically, a new constructive semantics based on the notion of groundedness. Our new construction refines the well-founded model construction and succeeds in identifying the intended model for the class of motivating examples. Furthermore, we show that for this class of examples, our novel construction constructs the unique grounded fixpoint. Summarised, in this dissertation, we continue the work on approximation fixpoint theory by identifying novel concepts occurring in all of the application domains and by refining existing semantics to better capture the intended meaning of a class of theories.
... Different simplified versions of paradoxes such as the one given above have been discussed on the background of certain extensions of both natural deduction and sequent calculus. Schroeder-Heister [31], developing ideas of Hallnäs [13], considers extensions of both sequent calculi and natural deduction by means of clausal definitions. In this context, paradoxes are typical examples of nonwell-founded definitions such as the following one, in which an atom R (only a notational variant of the λ used in this article) is defined through its own negation: ...
Article
Full-text available
In this paper we show how Dummett-Prawitz-style proof-theoretic semantics has to be modified in order to cope with paradoxical phenomena. It will turn out that one of its basic tenets has to be given up, namely the definition of the correctness of an inference as validity preservation. As a result, the notions of an argument being valid and of an argument being constituted by correct inference rules will no more coincide. The gap between the two notions is accounted for by introducing the distinction between sense and denotation in the proof-theoretic-semantic setting.
... As described in Falkman and Torgersson [40], the approach of MedView is to combine formal knowledge representation of clinical concepts with the design of flexible and user-friendly tools that are quickly brought into everyday practice. The knowledge representation is a declarative model based on the assumption that definitions are central tools in all attempts to provide a precise and formalized representation of knowledge [60]. The clinical knowledge used in MedView is divided into examination templates, value lists, and aggregates (also referred to as value classes). ...
... Therefore logically compound expressions, for which a definition is given, would be atoms in our sense as well. (For further details see Hallnäs 1991; Hallnäs and Schroeder-Heister 1990/1991.) In the present context it is more instructive and useful to present the inference rules as a type system ( ...
Article
From the point of view of proof-theoretic semantics, it is argued that the sequent calculus with introduction rules on the assertion and on the assumption side represents deductive reasoning more appropriately than natural deduction. In taking consequence to be conceptually prior to truth, it can cope with non-well-founded phenomena such as contradictory reasoning. The fact that, in its typed variant, the sequent calculus has an explicit and separable substitution schema in form of the cut rule, is seen as a crucial advantage over natural deduction, where substitution is built into the general framework.
... The whole approach is called 'definitional reasoning', with the right and left rules being called 'definitional closure' and 'definitional reflection', respectively. (This terminology as well as the basic idea of definitional reflection is due to Hallnäs 1991, 2006.) Definitional closure is reasoning along the definitional clauses according to the principle ...
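
The principle left truncated above can be stated schematically as follows; this is a standard sequent-style formulation given as an assumed reconstruction, not a quotation: definitional closure reasons forward along a single clause, while definitional reflection reasons on the assumption side over all clauses for the defined atom.

```latex
% Given a definition D with clauses  a <= B_1, ..., a <= B_n  for the atom a:

% Definitional closure (right rule): conclude a from any one defining condition.
\frac{\Gamma \vdash B_i}{\Gamma \vdash a}
  \quad (a \Leftarrow B_i) \in D

% Definitional reflection (left rule): whatever follows from every defining
% condition of a follows from a itself.
\frac{\Gamma, B_1 \vdash C \quad \cdots \quad \Gamma, B_n \vdash C}{\Gamma, a \vdash C}
```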
Article
1. The model-theoretic view. By 'model-theoretic consequence' we mean logical consequence explained in terms of models. According to the standard reading given to it by Tarski, which is related to ideas already developed by Bolzano, a sentence A follows logically from a set of sentences M, iff every model of M is a model of A; symbolically, M ⊨ A ⇔_df (∀𝔐)((∀B ∈ M)(𝔐 ⊨ B) ⇒ 𝔐 ⊨ A). Thus consequence is transmission of truth from the premisses to the conclusion, where 'transmission' is understood in the simple declarative sense of classical implication: if the premisses are true (in a model-theoretic structure), then so is the conclusion. This means in particular that truth is conceptually prior to consequence, as the latter is explained in terms of the former. Using a more traditional terminology, we might say that the categorical concept of truth is conceptually prior to the hypothetical concept of consequence. Proof-theoretic consequence is normally understood as derivability in a formal system. A sentence A is derivable from a set of sentences M in a formal system K if it can be generated from elements of M using the axioms and inference rules of K, formally M ⊢_K A.
... MedView has a declarative model which is based on the assumption that definitions are central tools in all attempts to provide a precise and formalized representation of knowledge [6]. We begin by explaining how this definitional approach has been realized in MedView, followed by a description of how templates and value lists are stored. ...
Article
This report describes the remodeling of the representation of clinical examinations in oral medicine, from the previous proprietary format used by the MedView project, to using the World Wide Web Consortium's recommendations Web Ontology Language (OWL) and Resource Description Framework (RDF). This includes the representation of (1) examination templates, (2) lists of values that can be included in individual examination records, and (3) aggregates of such values used for e.g., analyzing and visualizing data. It also includes the representation of (4) individual examination records. We describe how OWL and RDF are used to represent these different knowledge components of MedView, along with the design decisions made in the remodeling process. These design decisions are related to, among other things, whether or not to use the constructs of domain and range, appropriate naming in URIs, the level of detail to initially aim for, and appropriate use of classes and individuals. A description of how these new representations are used in the previous applications and code base is also given, as well as their use in the Swedish Oral Medicine Web (SOMWeb) online community. We found that OWL and RDF can be used to address most, but not all, of the requirements we compiled based on the limitations of the MedView knowledge model. Our experience in using OWL and RDF is that, while there is much useful support material available, there is some lack of support for important design decisions and best practice guidelines are still under development. At the same time, using OWL gives us access to a potentially beneficial array of externally developed tools and the ability to come back and refine the knowledge model after initial deployment.
... For BL we rely on Sambin et al. [10] and Sambin [9]. DR is explained, e.g., in Hallnäs [6], Hallnäs & Schroeder-Heister [7, 8] and Schroeder-Heister [12] (for more philosophical remarks see also Schroeder-Heister [13] and the references therein). We confine ourselves to propositional logic in order to make the conceptual points clear. ...
Article
In their Basic Logic, Sambin, Battilotti and Faggian give a foundation of logical inference rules by reference to certain reflection principles. We investigate the relationship between these principles and the principle of Definitional Reflection proposed by Hallnäs and Schroeder-Heister.
... The definitional programming language GCLA 1 [5,3,16,17] was developed as a tool suitable for the design and implementation of knowledge-based systems. In GCLA, programs are regarded as instances of a specific class of definitions, the partial inductive definitions (PID) [13]. The definitions of GCLA consist of a number of definitional clauses ...
Conference Paper
Full-text available
In 1995, the MedView project, based on a co-operation between computing science and clinical medicine, was initiated. The overall aim of the project is to develop models, methods, and tools to support clinicians in their diagnostic work. Today, the system is in daily use at several clinics and the knowledge base created contains more than 2000 examination records from the involved clinics. Knowledge representation and reasoning within MedView uses a declarative model based on a theory of definitions. In order to be able to model knowledge declaratively and integrate reasoning into applications with GUIs, a framework for definitional programming has been developed. We give an overview of the project and of how declarative programming techniques are integrated with industrial strength object-oriented programming tools to facilitate the development of real-world applications.
... An alternative position, which takes the concept of consequence as primary, would go beyond both Lorenzen's and Prawitz's conception of validity, and also beyond the classical view of truth as the basis of consequence. Such ideas are developed in Hallnäs (1991, 2006), Schroeder-Heister (1993, 2004, 2008a) and Schroeder-Heister & Contu (2005). Ad (iii) As the canonical vs. non-canonical distinction constitutes the fundamental difference between the two approaches, this point is dealt with in a separate section. ...
Chapter
With his Introduction to Operative Logic and Mathematics, which first appeared in 1955, Paul Lorenzen became an exponent of an approach to the foundations of logic and mathematics, which is both formalistic and intuitionistic in spirit. Formalistic because its basis is the purely syntactical handling of symbols (or 'figures', as Lorenzen preferred to say), and intuitionistic because the insight into the validity of admissibility statements justifies the laws of logic. It is also intuitionistic with respect to its result, as Heyting's formalism of intuitionistic logic is legitimatised this way. Along with taking formal calculi as its basis, the notion of an inductive definition becomes fundamental. Together with a theory of abstraction and the idea of transfinitely iterating inductive definitions, Lorenzen devised a novel foundation for mathematics, many aspects of which still deserve attention. When he wrote his Operative Logic, neither a full-fledged theory of inductive definitions nor a proof-theoretic semantics for logical constants was available. A decade later, Lorenzen's inversion principle was used and extended by Prawitz (Prawitz 1965) in his theory of natural deduction, and in the 1970s, the idea of inversion was used for a logical semantics in terms of proofs by Dummett, Martin-Löf, Prawitz and others. Another aspect which makes Lorenzen's theory interesting from a modern point of view is that in his protologic he anticipated certain views of rule-based reasoning and free equality which much later became central to the theory of resolution and logic programming.
... Definitional reflection has been developed in the context of partial inductive definitions (see [4]), where it is considered to be a fundamental principle of reasoning which is dual to the (more common) principle of definitional closure. In the present context, however, where I want to compare it with the inversion principle proposed by Lorenzen [10, 11] in the early 1950s, a narrower interpretation is appropriate: the admissibility interpretation. ...
Article
The term inversion principle goes back to Lorenzen who coined it in the early 1950s. It was later used by Prawitz and others to describe the symmetric relationship between introduction and elimination inferences in natural deduction, sometimes also called harmony. In dealing with the invertibility of rules of an arbitrary atomic production system, Lorenzen’s inversion principle has a much wider range than Prawitz’s adaptation to natural deduction. It is closely related to definitional reflection, which is a principle for reasoning on the basis of rule-based atomic definitions, proposed by Hallnäs and Schroeder-Heister. After presenting definitional reflection and the inversion principle, it is shown that the inversion principle can be formally derived from definitional reflection, when the latter is viewed as a principle to establish admissibility. Furthermore, the relationship between definitional reflection and the inversion principle is investigated on the background of a universalization principle, called the ω-principle, which allows one to pass from the set of all defined substitution instances of a sequent to the sequent itself.
... Our main contribution with respect to McBride's work is that we allow matching against the structure of higher-order terms which poses significant additional challenges. Another related development is the theory of partial inductive definitions [7], especially in its finitary form [6] and the related notion of definitional reflection [22]. This calculus contains a rule schema that, re-interpreted in our context, would allow any (finite) complete set of unifiers between a coverage goal ∆ ⊢ A : type and the heads of the clauses defining A. Because of the additional condition of so-called a-sufficiency for the substitutions this was never fully automated. ...
Conference Paper
Coverage checking is the problem of deciding whether any closed term of a given type is an instance of at least one of a given set of patterns. It can be used to verify if a function defined by pattern matching covers all possible cases. This problem has a straightforward solution for the first-order, simply-typed case, but is in general undecidable in the presence of dependent types. In this paper we present a terminating algorithm for verifying coverage of higher-order, dependently typed patterns. It either succeeds or presents a set of counterexamples with free variables, some of which may not have closed instances (a question which is undecidable). Our algorithm, together with strictness and termination checking, can be used to certify the correctness of numerous proofs of properties of deductive systems encoded in a system for reasoning about LF signatures.
Article
In 1973, Dag Prawitz conjectured that the calculus of intuitionistic logic is complete with respect to his notion of validity of arguments. On the background of the recent disproof of this conjecture by Piecha, de Campos Sanz and Schroeder-Heister, we discuss possible strategies of saving Prawitz's intentions. We argue that Prawitz's original semantics, which is based on the principal frame of all atomic systems, should be replaced with a general semantics, which also takes into account restricted frames of atomic systems. We discard the option of not considering extensions of atomic systems, but acknowledge the need to incorporate definitional atomic bases in the semantic framework. It turns out that ideas and results by Westerståhl on the Carnap categoricity of intuitionistic logic can be applied to Prawitz semantics. This implies that Prawitz semantics has a status of its own as a genuine, though incomplete, semantics of intuitionistic logic. An interesting side result is the fact that every formula satisfiable in general semantics is satisfiable in an axioms-only frame (a frame whose atomic systems do not contain proper rules). We draw a parallel between this seemingly paradoxical result and Skolem's paradox in first-order model theory.
Chapter
Full-text available
Two distinct kinds of cases, going back to Crabbé and Ekman, show that the Tennant-Prawitz criterion for paradoxicality overgenerates, that is, there are derivations which are intuitively non-paradoxical but which fail to normalize. We argue that a solution to “Ekman’s paradox” consists in restricting the set of admissible reduction procedures to those that do not yield a trivial notion of identity of proofs. We then discuss a different kind of solution, due to von Plato, and recently advocated by Tennant, consisting in reformulating natural deduction elimination rules in general (or parallelized) form. Developing intuitions of Ekman we show that the adoption of general rules has the consequence of hiding redundancies within derivations. Once reductions to get rid of the hidden redundancies are devised, it is clear that the adoption of general elimination rules offers no remedy to the overgeneration of the Prawitz-Tennant analysis. In this way, we indirectly provide further support for our own solution to Ekman’s paradox.
Chapter
Full-text available
The chapter introduces the Prawitz-Tennant analysis of paradoxes, according to which paradoxes are derivations of a contradiction which cannot be brought into normal form, due to “loops” arising in the process of reduction. After presenting Prawitz’ original formulation of Russell’s paradox, we introduce a simplified presentation of it, and then discuss the relevance of the difference between intuitionistic and classical logic and of structural properties of derivability for the Prawitz-Tennant analysis.
Chapter
Full-text available
The initial premise of this paper is that the structure of a proof is inherent in the definition of the proof. Side conditions to deal with the discharging of assumptions mean that this does not hold for systems of natural deduction, where proofs are given by monotone inductive definitions. We discuss the idea of using higher order definitions and the notion of a functional closure as a foundation to avoid these problems. In order to focus on structural issues we introduce a more abstract perspective, where a structural proof theory becomes part of a more general theory of functional closures. A notion of proof equations is discussed as a structural classifier and we compare the Russell and Ekman paradoxes to illustrate this.
Chapter
Full-text available
In this note, we review paradoxes like Russell’s, the Liar, and Curry’s in the context of intuitionistic logic. One may observe that one cannot blame the underlying logic for the paradoxes, but has to take into account the particular concept formations. For proof-theoretic semantics, however, this comes with the challenge to block some forms of direct axiomatizations of the Liar. A proper answer to this challenge might be given by Schroeder-Heister’s definitional freedom.
Preprint
Full-text available
We will present 12 different mixed metainferential consequence relations. Each one of them is specified using two different inferential Tarskian or non-Tarskian consequence relations: K3,LP,ST or TS+. We will show that it is possible to obtain a Tarskian logic with non-Tarskian inferential logics, but also a non-Tarskian logic with Tarskian inferential logics. Moreover, we will show how some of these metainferential logics work better than the corresponding inferential rivals. Finally, we will show how these logics prove that it is not enough to work with inferences as pairs of sets of formulas to obtain a contractive logic.
Article
Definitional reflection rules (DRRs) provide a proof-theoretic framework for dealing with a set of clauses. An infinite version of definitional reflection logic (DRL), which has some infinite-premise DRRs, is introduced as a Gentzen-type sequent calculus based on classical propositional logic. A finite version of DRL is obtained from the infinite version of DRL by replacing the infinite-premise DRRs with finite ones. A theorem for embedding the infinite version into infinitary propositional logic is proved, and a theorem for embedding the finite version into classical propositional logic is shown. The cut-elimination theorems for Gentzen-type sequent calculi for these versions are obtained using these embedding theorems. The finite version is shown to be decidable. Some similar results for the infinite and finite versions of generalized definitional reflection logic (GDRL), which has generalized definitional reflection rules (GDRRs), are also obtained. Some paraconsistent and temporal extensions of the above-mentioned classical versions of DRL and GDRL are also investigated.
Chapter
This paper describes the logic programming language GCLA II, its operational semantics and parts of its theoretical foundations. GCLA II is a generalization of the language GCLA (Generalized Horn Clause Language), augmented by a method to guide and constrain proof search. The method is based on specification of strategies in a meta language that is a sublanguage of GCLA itself. A GCLA II program is partitioned into two distinct parts. One is used to express the declarative content of the program, while the other is used to define the possible inferences made from this declarative knowledge. Although the intended uses of the declarative and procedural parts are quite different, they can both be described in the formalism of partial inductive definitions. Thus we preserve a declarative reading of the program as a whole. In particular, given certain syntactical constraints on the program, the heuristics associated with proof search do not affect the declarative reading of the declarative part of the program at all. Experimental interpreters for the language have been implemented and work on a compiler from GCLA II to Prolog is almost complete.
Chapter
In logical sequent calculi, initial sequents expressing the axiom of identity can be required to be atomic, without affecting the deductive strength of the system. When extending the logical system with right- and left-introduction rules for atomic formulas, one can analogously require that initial sequents be restricted to “uratoms”, which are undefined (not even vacuously defined) atoms. Depending on the definitional clauses for atoms, the resulting system is possibly weaker than the unrestricted one. This weaker system may however be preferable to the unrestricted system, as it enjoys cut elimination and blocks unwanted derivations arising from non-wellfounded definitions, for example in the context of paradoxes.
Chapter
Atomic systems are systems of rules containing only atomic formulas. In proof-theoretic semantics for minimal and intuitionistic logic they are used as the base case in an inductive definition of validity. We compare two different approaches to atomic systems. The first approach is compatible with an interpretation of atomic systems as representations of states of knowledge. The second takes atomic systems to be definitions of atomic formulas. The two views lead to different notions of derivability for atomic formulas, and consequently to different notions of proof-theoretic validity. In the first approach, validity is stable in the sense that for atomic formulas logical consequence and derivability coincide for any given atomic system. In the second approach this is not the case. This indicates that atomic systems as definitions, which determine the meaning of atomic sentences, might not be the proper basis for proof-theoretic validity, or conversely, that standard notions of proof-theoretic validity are not appropriate for definitional rule systems.
Conference Paper
We develop an abstract theory of justifications suitable for describing the semantics of a range of logics in knowledge representation, computational and mathematical logic. A theory or program in one of these logics induces a semantical structure called a justification frame. Such a justification frame defines a class of justifications each of which embodies a potential reason why its facts are true. By defining various evaluation functions for these justifications, a range of different semantics are obtained. By allowing nesting of justification frames, various language constructs can be integrated in a seamless way. The theory provides elegant and compact formalisations of existing and new semantics in logics of various areas, showing unexpected commonalities and interrelations, and creating opportunities for new expressive knowledge representation formalisms.
Article
European Summer Meeting of the Association for Symbolic Logic - Volume 57 Issue 1 - E.-J. Thiele
Article
We present our calculus of higher-level rules, extended with propositional quantification within rules. This makes it possible to present general schemas for introduction and elimination rules for arbitrary propositional operators and to define what it means that introductions and eliminations are in harmony with each other. This definition does not presuppose any logical system, but is formulated in terms of rules themselves. We therefore speak of a foundational (rather than reductive) account of proof-theoretic harmony. With every set of introduction rules a canonical elimination rule, and with every set of elimination rules a canonical introduction rule is associated in such a way that the canonical rule is in harmony with the set of rules it is associated with. An example given by Hazen and Pelletier is used to demonstrate that there are significant connectives, which are characterized by their elimination rules, and whose introduction rule is the canonical introduction rule associated with these elimination rules. Due to the availability of higher-level rules and propositional quantification, the means of expression of the framework developed are sufficient to ensure that the construction of canonical elimination or introduction rules is always possible and does not lead out of this framework.
Article
Several proof-theoretic notions of validity have been proposed in the literature, for which completeness of intuitionistic logic has been conjectured. We define validity for intuitionistic propositional logic in a way which is common to many of these notions, emphasizing that an appropriate notion of validity must be closed under substitution. In this definition we consider atomic systems whose rules are not only production rules, but may include rules that allow one to discharge assumptions. Our central result shows that Harrop’s rule is valid under substitution, which refutes the completeness conjecture for intuitionistic logic.
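
For orientation, Harrop's rule, which the abstract refers to, is the well-known rule that is admissible but not derivable in intuitionistic propositional logic:

```latex
% Harrop's rule: admissible in intuitionistic propositional logic, although the
% corresponding implication (the Kreisel-Putnam formula) is not an intuitionistic
% theorem. Its validity under the substitution-closed notion of validity is what
% refutes the completeness conjecture discussed in the abstract.
\frac{\neg A \rightarrow (B \lor C)}{(\neg A \rightarrow B) \lor (\neg A \rightarrow C)}
```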
Article
The aim of this paper is to reconsider several proposals that have been put forward in order to develop a Proof-Theoretical Semantics, from the by now classical neo-verificationist approach provided by D. Prawitz and M. Dummett in the Seventies, to an alternative, more recent approach mainly due to the work of P. Schroeder-Heister and L. Hallnäs, based on clausal definitions. Some other intermediate proposals are very briefly sketched. Particular attention will be given to the role played by the so-called Fundamental Assumption. We claim that whereas, in the neo-verificationist proposal, the condition expressed by that Assumption is necessary to ensure the completeness of the justification procedure (from the outside, so to speak), within the definitional framework it is a built-in feature of the proposal. The latter approach, therefore, appears as an alternative solution to the problem which prompted the neo-verificationists to introduce the Fundamental Assumption.
Article
The Hybrid system (Ambler et al. 2002b), implemented within Isabelle/HOL, allows object logics to be represented using higher order abstract syntax (HOAS), and reasoned about using tactical theorem proving in general, and principles of (co)induction in particular. The form of HOAS provided by Hybrid is essentially a lambda calculus with constants. Of fundamental interest is the form of the lambda abstractions provided by Hybrid. The user has the convenience of writing lambda abstractions using names for the binding variables. However, each abstraction is actually a definition of a de Bruijn expression, and Hybrid can unwind the user's abstractions (written with names) to machine friendly de Bruijn expressions (without names). In this sense the formal system contains a hybrid of named and nameless bound variable notation. In this paper, we present a formal theory in a logical framework, which can be viewed as a model of core Hybrid, and state and prove that the model is representationally adequate for HOAS. In particular, it is the canonical translation function from λ-expressions to Hybrid that witnesses adequacy. We also prove two results that characterise how Hybrid represents certain classes of λ-expression. We provide the first detailed proof to be published that proper locally nameless de Bruijn expressions and α-equivalence classes of λ-expressions are in bijective correspondence. This result is presented as a form of de Bruijn representational adequacy, and is a key component of the proof of Hybrid adequacy. The Hybrid system contains a number of different syntactic classes of expression, and associated abstraction mechanisms. Hence, this paper also aims to provide a self-contained theoretical introduction to both the syntax and key ideas of the system. Although this paper will be of considerable interest to those who wish to work with Hybrid in Isabelle/HOL, a background in automated theorem proving is not essential.
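
The named-to-nameless translation described here can be sketched in a few lines; this is an illustrative sketch in Python, not Hybrid's actual Isabelle/HOL code: binders are written with names, and each bound occurrence is resolved to a de Bruijn index given by its depth in the stack of enclosing binders.

```python
# Illustrative sketch (not Hybrid's implementation): translating named lambda
# terms into locally nameless de Bruijn form, where bound variables become
# indices and free variables keep their names.
from dataclasses import dataclass

@dataclass
class Var:
    name: str

@dataclass
class Lam:
    name: str
    body: object

@dataclass
class App:
    fun: object
    arg: object

def to_debruijn(term, binders=()):
    """binders: names of the enclosing binders, innermost first."""
    if isinstance(term, Var):
        return ("BND", binders.index(term.name)) if term.name in binders \
            else ("FREE", term.name)
    if isinstance(term, Lam):
        return ("ABS", to_debruijn(term.body, (term.name,) + binders))
    if isinstance(term, App):
        return ("APP", to_debruijn(term.fun, binders), to_debruijn(term.arg, binders))

# \x. \y. x y  unwinds to  ABS (ABS (APP (BND 1) (BND 0)))
print(to_debruijn(Lam("x", Lam("y", App(Var("x"), Var("y"))))))
```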
Article
Combining higher-order abstract syntax and (co)-induction in a logical framework is well known to be problematic. We describe the theory and the practice of a tool called Hybrid, within Isabelle/HOL and Coq, which aims to address many of these difficulties. It allows object logics to be represented using higher-order abstract syntax, and reasoned about using tactical theorem proving and principles of (co)induction. Moreover, it is definitional, which guarantees consistency within a classical type theory. The idea is to have a de Bruijn representation of λ-terms providing a definitional layer that allows the user to represent object languages using higher-order abstract syntax, while offering tools for reasoning about them at the higher level. In this paper we describe how to use Hybrid in a multi-level reasoning fashion, similar in spirit to other systems such as Twelf and Abella. By explicitly referencing provability in a middle layer called a specification logic, we solve the problem of reasoning by (co)induction in the presence of non-stratifiable hypothetical judgments, which allow very elegant and succinct specifications of object logic inference rules. We first demonstrate the method on a simple example, formally proving type soundness (subject reduction) for a fragment of a pure functional language, using a minimal intuitionistic logic as the specification logic. We then prove an analogous result for a continuation-machine presentation of the operational semantics of the same language, encoded this time in an ordered linear logic that serves as the specification layer. This example demonstrates the ease with which we can incorporate new specification logics, and also illustrates a significantly more complex object logic whose encoding is elegantly expressed using features of the new specification logic. Keywords: Logical frameworks, Higher-order abstract syntax, Interactive theorem proving, Induction, Variable binding, Isabelle/HOL, Coq
Article
Intuitionistic and linear logics can be used to specify the operational semantics of transition systems in various ways. We consider here two encodings: one uses linear logic and maps states of the transition system into formulas, and the other uses intuitionistic logic and maps states into terms. In both cases, it is possible to relate transition paths to proofs in sequent calculus. In neither encoding, however, does it seem possible to capture properties, such as simulation and bisimulation, that need to consider all possible transitions or all possible computation paths. We consider augmenting both intuitionistic and linear logics with a proof theoretical treatment of definitions. In both cases, this addition allows proving various judgments concerning simulation and bisimulation (especially for noetherian transition systems). We also explore the use of infinite proofs to reason about infinite sequences of transitions. Finally, combining definitions and induction into sequent calculus proofs makes it possible to reason more richly about properties of transition systems completely within the formal setting of sequent calculus.
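To make the kind of property at stake concrete, the following Haskell sketch checks simulation between states of a finite labelled transition system by iterated refinement of the full relation (an illustration only, assuming a finite system; the names are mine and this is not the sequent-calculus encoding discussed in the paper):

-- A finite labelled transition system: (source, label, target) triples.
type LTS s l = [(s, l, s)]

-- All labelled moves available from a given state.
steps :: Eq s => LTS s l -> s -> [(l, s)]
steps lts p = [ (a, q) | (r, a, q) <- lts, r == p ]

-- Keep a pair (p, q) only if every move of p is matched by q with the
-- same label, landing again inside the candidate relation.
refine :: (Eq s, Eq l) => LTS s l -> [(s, s)] -> [(s, s)]
refine lts rel =
  [ (p, q) | (p, q) <- rel
           , all (\(a, p') -> any (\(b, q') -> a == b && (p', q') `elem` rel)
                                  (steps lts q))
                 (steps lts p) ]

-- Greatest simulation: start from all pairs of states and refine to a fixed point.
simulates :: (Eq s, Eq l) => LTS s l -> [s] -> s -> s -> Bool
simulates lts states p q = (p, q) `elem` gfp [ (x, y) | x <- states, y <- states ]
  where gfp rel = let rel' = refine lts rel
                  in if rel' == rel then rel else gfp rel'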
Conference Paper
The goal of the MedView project is to develop models, methods, and tools to support clinicians in their daily work and research. MedView is based on a formal declarative model, which constitutes the main governing principle in MedView, not only in the formalization of knowledge, but also in the visualization models and in the design and implementation of individual tools and of the system as a whole. Tools are provided for modeling, acquiring, and sharing knowledge, and for visualization and analysis of data.
Conference Paper
We give a short introduction to Martin-Löf's Type Theory, seen as a theory of inductive definitions. The first part contains historical remarks that motivate this approach. The second part presents a computational semantics, which explains how proof trees can be represented using the notations of functional programming.
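The idea that proof trees are simply values of an inductively defined datatype can be conveyed by a small Haskell fragment (my own example, added only to fix ideas):

-- The inductive definition of the natural numbers ...
data Nat = Zero | Succ Nat
  deriving Show

-- ... and proof trees for the judgement "n is even", generated by two rules:
--   EvenZero : even Zero
--   EvenSS   : even n  entails  even (Succ (Succ n))
data EvenProof = EvenZero | EvenSS EvenProof

-- The conclusion that a given proof tree establishes.
conclusion :: EvenProof -> Nat
conclusion EvenZero   = Zero
conclusion (EvenSS d) = Succ (Succ (conclusion d))

-- Example: a proof tree showing that 4 is even.
four :: EvenProof
four = EvenSS (EvenSS EvenZero)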
Conference Paper
Full-text available
Combining Higher Order Abstract Syntax (HOAS) and (co)induction is well known to be problematic. In previous work [1] we have described the implementation of a tool called Hybrid, within Isabelle/HOL, which allows object logics to be represented using HOAS, and reasoned about using tactical theorem proving and principles of (co)induction. Moreover, it is definitional, which guarantees consistency within a classical type theory. In this paper we describe how to use it in a multi-level reasoning fashion, similar in spirit to other meta-logics such as FOλΔIN and Twelf. By explicitly referencing provability, we solve the problem of reasoning by (co)induction in the presence of non-stratifiable hypothetical judgments, which allow very elegant and succinct specifications. We demonstrate the method by formally verifying the correctness of a compiler for (a fragment of) Mini-ML, following [10]. To further exhibit the flexibility of our system, we modify the target language with a notion of non-well-founded closure, inspired by Milner & Tofte [16], and formally verify via co-induction a subject reduction theorem for this modified language.
Article
Full-text available
Applications of the Burali-Forti paradox to some formal systems are presented. The first part of this study concerns extensions of Church's higher-order logic. It is explained why this calculus is inconsistent, and this result is applied to the type system ML of P. Martin-Löf (1971, 1975, 1980). Another extension introduces ML polymorphism; it is shown that this calculus is still inconsistent. It is, however, possible to generalize Church's systems with a (weak) notion of polymorphism in a consistent way, and the corresponding type system is presented. The second part of this study is largely a reformulation of the first in a natural deduction framework.
Article
Full-text available
Isabelle [28, 30] is an interactive theorem prover that supports a variety of logics. It represents rules as propositions (not as functions) and builds proofs by combining rules. These operations constitute a meta-logic (or ‘logical framework’) in which the object-logics are formalized. Isabelle is now based on higher-order logic, a precise and well-understood foundation. Examples illustrate the use of this meta-logic to formalize logics and proofs. Axioms for first-order logic are shown to be sound and complete. Backwards proof is formalized by meta-reasoning about object-level entailment. Higher-order logic has several practical advantages over other meta-logics. Many proof techniques are known, such as Huet's higher-order unification procedure.
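For instance, an object-level inference rule is itself represented as a proposition of the meta-logic, built with meta-implication; conjunction introduction then has roughly the following shape (schematic, omitting the coercion from object-level formulas to meta-level propositions):

\[
  A \Longrightarrow (B \Longrightarrow A \wedge B)
\]

Applying the rule amounts to discharging these meta-level premises by combining the proposition with proofs of them.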
Chapter
Inductive definitions of sets are often informally presented by giving some rules for generating elements of the set and then adding that an object is to be in the set only if it has been generated according to the rules. An equivalent formulation is to characterize the set as the smallest set closed under the rules. This chapter discusses monotone induction and its role in extensions of recursion theory. The chapter reviews some of the work on non-monotone induction and outlines the separate motivation that has led to its development. The chapter briefly considers inductive definitions in a more general context.
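Read operationally, the "smallest set closed under the rules" can be computed for finitary rules over a finite universe by iterating rule application starting from the empty set; a minimal Haskell sketch (names and example mine, added only as an illustration of the idea):

-- A rule: from a finite list of premises already in the set, conclude a new element.
data Rule a = Rule [a] a   -- premises, conclusion

-- One round of applying every rule whose premises are currently available.
step :: Eq a => [Rule a] -> [a] -> [a]
step rules s = foldr add s rules
  where add (Rule ps c) acc
          | all (`elem` s) ps && c `notElem` acc = c : acc
          | otherwise                            = acc

-- Least fixed point: iterate from the empty set until nothing new is generated.
inductivelyDefined :: Eq a => [Rule a] -> [a]
inductivelyDefined rules = go []
  where go s = let s' = step rules s
               in if length s' == length s then s else go s'

-- Example: the even numbers below 10, generated by "0 is in" and "n in entails n+2 in".
evens :: [Int]
evens = inductivelyDefined (Rule [] 0 : [ Rule [n] (n + 2) | n <- [0, 2 .. 6] ])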
Article
This chapter discusses how relating constructive mathematics to computer programming is beneficial. Among the benefits to be derived by constructive mathematics from its association with computer programming, one is that you see immediately why you cannot rely upon the law of excluded middle: its uninhibited use would lead to programs that one did not know how to execute. By choosing to program in a formal language for constructive mathematics, like the theory of types, one gets access to the conceptual apparatus of pure mathematics, neglecting those parts that depend critically on the law of excluded middle, whereas even the best high-level programming languages so far designed are wholly inadequate as mathematical languages. The virtue of a machine code is that a program written in it can be directly read and executed by the machine. The distinction between low- and high-level programming languages is of course relative to the available hardware. It may well be possible to turn what is now regarded as a high-level programming language into machine code by the invention of new hardware.
Article
The notion of Frege structure is introduced and shown to give a coherent context for the rigorous development of Frege's logical notion of set and an explanation of Russell's paradox.
Article
If programming is understood not as the writing of instructions for this or that computing machine but as the design of methods of computation that it is the computer’s duty to execute (a difference that Dijkstra has referred to as the difference between computer science and computing science), then it no longer seems possible to distinguish the discipline of programming from constructive mathematics. This explains why the intuitionistic theory of types (Martin-Löf 1975, in Logic Colloquium 1973 (ed. H. E. Rose & J. C. Shepherdson), pp. 73–118. Amsterdam: North-Holland), which was originally developed as a symbolism for the precise codification of constructive mathematics, may equally well be viewed as a programming language. As such it provides a precise notation not only, like other programming languages, for the programs themselves but also for the tasks that the programs are supposed to perform. Moreover, the inference rules of the theory of types, which are again completely formal, appear as rules of correct program synthesis. Thus the correctness of a program written in the theory of types is proved formally at the same time as it is being synthesized.
Article
To every partial inductive definition D, a natural deduction calculus ND(D) is associated. Not every such system will have the normalization property; specifically, there are definitions D′ for which ND(D′) permits non-normalizable deductions. A lambda calculus is formulated where the terms are used as objects realizing deductions in ND(D), and is shown to have the Church-Rosser property. Since ND(D) permits non-normalizable deductions, there will be typed terms which are non-normalizable. It will, for example, be possible to obtain a typed fixed-point operator.
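This phenomenon has a familiar analogue in typed functional programming: a recursive datatype with a negative occurrence of the defined type yields a typed fixed-point operator without any use of general recursion, and the resulting term has no normal form. A small Haskell illustration of the analogy (not taken from the paper):

-- A 'circular' definition: a value of type Rec a is a function that
-- consumes values of type Rec a.  This is the analogue of a clause
-- whose body mentions the defined atom negatively.
newtype Rec a = In { out :: Rec a -> a }

-- Self-application, made type-correct by the recursive type.
selfApply :: Rec a -> a
selfApply x = out x x

-- A typed fixed-point operator, definable without recursion:
-- fix f unfolds to f (fix f), and hence has no normal form.
fix :: (a -> a) -> a
fix f = selfApply (In (\x -> f (selfApply x)))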
Article
This chapter presents an exposition of certain themes in proof theory. It discusses the ideas behind what may be called Gentzen-type analysis in proof theory, where one wants in particular to draw attention to the fact that they constitute the embryo of a general proof theory; extensions of the results obtained by Gentzen to more powerful theories; and the connection between proofs and the terms used in functional interpretations of intuitionistic logic, in particular the connection between Gentzen-type analysis and the Gödel-type analysis that originated with Gödel's so-called Dialectica interpretation. A notion of validity of derivations is presented that may be contemplated as a possible explication of Gentzen's ideas about an operational interpretation of the logical constants. The proofs of the results show how this notion of validity may be used as a convenient tool to establish the main result about strong normalization in first-order logic.
Article
This chapter discusses general proof theory. The name proof theory was originally given by Hilbert to a constructive study of proofs with certain specific aims. By such a study, the consistency of mathematics is established or, more generally, a reduction of mathematics to a certain constructive part is obtained. Hence, the study of proofs was only a tool to obtain this reduction, and thus could not use principles more advanced than those contained in the constructive part of mathematics to which all mathematics was to be reduced. Some of the more important results in reductive proof theory, in particular Gentzen's well-known results, were indeed such by-products. They were obtained from insights about the general structure of proofs, insights which are in their own right at least as interesting as their applications in reductive proof theory. The chapter tries to formulate this method in more general terms by not tying the notion of validity of inferences to any particular deductive system but defining it for inferences in general. The results mentioned above may then be obtained as special cases of this method. The work is thus directed towards a foundation of general proof theory, but it should be noted that there are many open problems in this connection and that the work has a tentative character.
Article
In this paper definite Horn clause programs are investigated within a proof-theoretic framework, with program clauses considered as rules of a formal system. Based on this approach, the soundness and completeness of SLD-resolution is established by purely proof-theoretic methods. Extended Horn clauses are defined as rules of higher levels and related to an approach based on implication formulae in the bodies of clauses. In a further extension, which is treated in Part II of this series, program clauses are viewed as clauses in inductive definitions of atoms, justifying an additional inference schema: a reflection principle that roughly corresponds to interpreting the program clauses as introduction rules in the sense of natural deduction. The evaluation procedures for queries with respect to the defined extensions of definite Horn clause programs are shown to be sound and complete. The sequent calculus with the general elimination schema even permits the introduction of a genuine notion of falsity which is not defined via a meta-rule.
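For the propositional, definite case, reading clauses as rules and query evaluation as backward proof search can be sketched in a few lines of Haskell (a toy illustration only, assuming an acyclic program; the paper itself works with first-order clauses and SLD-resolution proper):

-- A definite clause  head :- body  over propositional atoms.
data Clause = Clause { hd :: String, body :: [String] }

-- Backward chaining: an atom is provable if some clause for it has a
-- provable body; this mirrors reading each clause as an inference rule
-- with the head as conclusion.  (No loop checking: acyclic programs only.)
prove :: [Clause] -> String -> Bool
prove prog goal =
  or [ all (prove prog) (body c) | c <- prog, hd c == goal ]

-- Example program:  p :- q, r.   q.   r :- q.
program :: [Clause]
program = [ Clause "p" ["q", "r"], Clause "q" [], Clause "r" ["q"] ]

-- prove program "p" == True,  prove program "s" == False.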
Article
T₀ will denote Gödel's theory T [3] of functionals of finite type (f.t.) with intuitionistic quantification over each f.t. added. T₁ will denote T₀ together with definition by bar recursion of type 0, the axiom schema of bar induction, and the schema of choice. Precise descriptions of these systems are given below in §4. The main results of this paper are interpretations of T₀ in intuitionistic arithmetic U₀ and of T₁ in intuitionistic analysis U₁. U₁ is U₀ with quantification over functionals of type (0,0) and the axiom schemata AC₀₀ and of bar induction.
Conference Paper
This chapter presents a proof-theoretical analysis of the intuitionistic theory of generalized inductive definitions iterated an arbitrary finite number of times. Like the Hilbert-type systems of first-order predicate logic that are used, the theories of single and iterated generalized inductive definitions, as first formulated, do not lend themselves immediately to a proof-theoretical analysis. Therefore, the chapter reformulates the axioms expressing the principles of definition and proof by generalized induction as rules of inference similar to those introduced by Gentzen in his system of natural deduction for first-order predicate logic. As in Gentzen's case, this reformulation leads to a notable systematization: already in the case of ordinary inductive definitions, the rules corresponding to the axioms expressing the principle of definition by induction appear as introduction rules for the inductively defined predicates, whereas the axioms expressing the principle of proof by induction give rise to the corresponding elimination rules. Moreover, the generalized inductive definitions appear as inductive definitions iterated once, and the iterated generalized inductive definitions as inductive definitions iterated twice or more.
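For an ordinary inductive definition, say of the natural numbers, the reformulation takes the familiar shape in which the defining clauses become introduction rules and proof by induction becomes the elimination rule (stated here schematically, only to fix the pattern):

\[
  \frac{}{0 \in N}
  \qquad
  \frac{a \in N}{s(a) \in N}
  \qquad
  \frac{a \in N \quad A(0) \quad \forall x\,(A(x) \rightarrow A(s(x)))}{A(a)}
\]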
Article
One of the main ideas of calculi of natural deduction, as introduced by Jaśkowski and Gentzen, is that assumptions may be discharged in the course of a derivation. As regards sentential logic, this conception will be extended in so far as not only formulas but also rules may serve as assumptions which can be discharged. The resulting calculi and derivations with rules of any finite level are informally introduced in §1, while §§2 and 3 state formal definitions of the concepts involved and basic lemmata. Within this framework, a standard form for introduction and elimination rules for arbitrary sentential operators is then motivated.
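The characteristic new feature can be illustrated by a generalized elimination rule for implication, where the assumption that is discharged is itself a rule, namely the rule permitting the passage from A to B (a schematic rendering; the exact formulation in the paper may differ in detail):

\[
  \frac{\;A \rightarrow B \qquad
        \begin{array}{c} [A \Rightarrow B] \\ \vdots \\ C \end{array}\;}
       {C}
\]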
Article
The Edinburgh Logical Framework (LF) provides a means to define (or present) logics. It is based on a general treatment of syntax, rules, and proofs by means of a typed λ-calculus with dependent types. Syntax is treated in a style similar to, but more general than, Martin-Löf's system of arities. The treatment of rules and proofs focuses on his notion of a judgement. Logics are represented in LF via a new principle, the judgements-as-types principle, whereby each judgement is identified with the type of its proofs. This allows for a smooth treatment of discharge and variable occurrence conditions and leads to a uniform treatment of rules and proofs whereby rules are viewed as proofs of higher-order judgements and proof checking is reduced to type checking. The practical benefit of our treatment of formal systems is that logic-independent tools such as proof editors and proof checkers can be constructed. Categories and subject descriptors: F.3.1 [Logics and Meanings of Programs]: Specifying and Verifying and Reasoning about Programs; F.4.1 [Mathematical Logic and Formal Languages]: Mathematical Logic. General terms: algorithms, theory, verification. Additional key words and phrases: typed lambda calculus, formal systems, proof checking, interactive theorem proving.
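As a small illustration of the judgements-as-types principle, the implication-introduction rule of minimal logic can be declared as a constant whose dependent type internalizes both the hypothetical judgement and its discharge (a schematic signature, not the exact one from the paper):

\[
  \mathsf{imp\_i} \;:\; \Pi A{:}o.\; \Pi B{:}o.\; (\mathit{true}\,A \to \mathit{true}\,B) \to \mathit{true}\,(A \supset B)
\]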
Einführung in die operative Logik und Mathematik
  • P Lorenzen
P. Lorenzen, Einführung in die operative Logik und Mathematik (Springer, Berlin, 1969).
The Lambda Calculus
  • H Barendregt
H. Barendregt, The Lambda Calculus (North-Holland, Amsterdam, 1984).
A proof theoretic approach to logic programming. I. Generalized Horn clauses
  • L Hallnäs
  • P Schroeder-Heister
L. Hallnäs and P. Schroeder-Heister, A proof theoretic approach to logic programming. I. Generalized Horn clauses, SICS research report R88005, 1988 (a revised version will appear in J. Logic and Computation).
Amendments to intuitionistic type theory, Notes from a lecture given at Chalmers, Göteborg
  • P Martin-Löf
P. Martin-Löf, Amendments to intuitionistic type theory, Notes from a lecture given at Chalmers, Göteborg, 1986.
“Type” is not a type
  • Meyer