Article

Nested General Recursion and Partiality in Type Theory

Authors:
Ana Bove, Venanzio Capretta

Abstract

We extend Bove's technique for formalising simple general recursive algorithms in constructive type theory to nested recursive algorithms. The method consists in defining an inductive special-purpose accessibility predicate that characterises the inputs on which the algorithm terminates. As a result, the type-theoretic version of the algorithm can be defined by structural recursion on the proof that the input values satisfy this predicate. This technique results in definitions in which the computational and logical parts are clearly separated; hence, the type-theoretic version of the algorithm is given by its purely functional content, similarly to the corresponding program in a functional programming language. In the case of nested recursion, the special predicate and the type-theoretic algorithm must be defined simultaneously, because they depend on each other. This kind of definition is not allowed in ordinary type theory, but it is provided in type theories extended with Dybjer's schema for simultaneous inductive-recursive definitions.
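The simple (non-nested) version of this method can be sketched concretely. The following is a minimal illustration in Lean 4 (our choice of vehicle, with our names `DomDiv` and `divBC`), for division by repeated subtraction; the predicate is placed in `Type` rather than `Prop` because, as in Coq, computational recursion over `Prop` proofs is restricted. The nested case discussed in the abstract would additionally require defining the predicate and the function simultaneously, which this sketch avoids.

```lean
-- Special-purpose accessibility predicate for division by repeated
-- subtraction: `DomDiv m n` characterises the inputs on which the
-- algorithm terminates. Note that `DomDiv m 0` is uninhabited,
-- mirroring the fact that division by zero never terminates.
inductive DomDiv : Nat → Nat → Type where
  | base {m n : Nat} : m < n → DomDiv m n
  | step {m n : Nat} : n ≤ m → DomDiv (m - n) n → DomDiv m n

-- The algorithm, given by its purely functional content and defined by
-- structural recursion on the proof that the inputs satisfy `DomDiv`.
def divBC : (m n : Nat) → DomDiv m n → Nat
  | _, _, .base _   => 0
  | m, n, .step _ h => divBC (m - n) n h + 1
```

A caller must first exhibit a termination proof; for instance, from a proof of `DomDiv 7 3` (two `step`s followed by a `base`), `divBC 7 3` unfolds in two recursive steps to `2`.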


... This situation is similar to the case of partial recursive functions or recursive functions with non-structurally recursive arguments. In order to formalise such functions in constructive type theory, there is a method of adding an inductive domain predicate, introduced in [12] and extensively developed by Bove and Capretta [5]. According to this method, we need to define an inductively defined predicate E_h(µ, α) with the intended meaning that µ and α are in the domain of m_h, which in turn means that the homographic algorithm should emit at least one digit when applied to µ and α. ...
... It is important to bear in mind that we have separated the issues of productivity and correctness. This is in accordance with the separation of termination and correctness in the Bove–Capretta method for general recursion [5]. In order to prove correctness we need to define a suitable semantics (for example, use another model of real numbers) and prove that the effect of the above algorithm is equivalent to the effect of Möbius maps in the field of real numbers. ...
... This suggests that one can generalise this method to obtain a scheme in the style of [6] for formalising specifications of partial functions on final coalgebras. Such a method would be the dual of the Bove–Capretta method [5] for general recursion, and the term general corecursion seems suitable. In this article we have not developed such a scheme, as our focus lies on the special case of exact arithmetic algorithms for the coalgebra of reals. ...
Article
Full-text available
In this article we present a method to define algebraic structure (field operations) on a representation of real numbers by coinductive streams. The field operations will be given by two algorithms (the homographic and the quadratic algorithm) that operate on streams of Möbius maps. The algorithms can be seen as coalgebra maps on the coalgebra of streams and hence will be formalised as general corecursive functions. We use the machinery of the Coq proof assistant for coinductive types to present the formalisation.
... In a series of previous articles, we expound two different ways of achieving that goal. In our first approach [3, 4, 2, 5], we define an inductive (domain) predicate that characterises the inputs on which a function terminates. The constructors of this predicate can be easily and automatically determined from the recursive equations defining the function. ...
... Based on our work in [3], Setzer [21, 22] defines a type (of codes) of partial functions. From the code of a partial function, one can extract the domain of the function and the function itself, and one can evaluate the function on a certain argument. ...
... Inspired by the previous analysis, we introduce a new type constructor for partial recursive functions in which the coalgebra-algebra pair is used in the introduction rule. For the elimination rule, we define a domain predicate similar to the one in the Bove/Capretta method [3]. For full generality, the method we used in the previous section must be adapted by defining a lifted universal quantifier: for every predicate P : X → Prop and functor F, we define a lifted predicate on F X as the conjunction of the statement of P on every element of type X occurring in a member of F X; its formal definition will be given in the next section. ...
Conference Paper
Full-text available
Our goal is to define a type of partial recursive functions in constructive type theory. In a series of previous articles, we studied two different formulations of partial functions and general recursion. We could obtain a type only by extending the theory with either an impredicative universe or with coinductive definitions. Here we present a new type constructor that eludes such entities of dubious constructive credentials. We start by showing how to break down a recursive function definition into three components: the first component generates the arguments of the recursive calls, the second evaluates them, and the last computes the output from the results of the recursive calls. We use this dissection as the basis for the introduction rule of the new type constructor. Every partial recursive function is associated with an inductive domain predicate; evaluation of the function requires a proof that the input values satisfy the predicate. We give a constructive justification for the new construct by interpreting it into the base type theory. This shows that the extended theory is consistent and constructive.
... This induction principle is provable using Definition 6.7 and the definition of accessibility on dcpo's. Therefore we accept this induction principle. Using this induction principle and by pattern matching on the input, we can prove a statement about the productivity of NAlg on the family of accessible open expression trees. ...
... We restrict ourselves to the homographic algorithm, as the general case of accessible open expressions is essentially similar but considerably more verbose. That is, instead of the general NAlg we consider the restricted case below. (In fact, what we use is the elimination principle of the inductive set which is the initial algebra of the functor F(X) = 1 + (1 + M) × X + (1 + T) × X², when considered as an inductive type. Intuitively, this works because an open expression tree is a finite composition of Möbius maps and tensors applied to a real number x: composing n Möbius maps and m tensors (in one variable) results in a rational function of degree 2n + m. This is due to the fact that such expression trees are obtained from the continued fraction expansions of elementary functions [26, Section 10.2].) ...
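The functor in this excerpt can be written out as a datatype. Below is a hedged Lean 4 sketch (our names `ExprTree`, `unary`, `binary`; `M` and `T` stand in for the types of Möbius maps and tensors) of the inductive set that is the initial algebra of F(X) = 1 + (1 + M) × X + (1 + T) × X²:

```lean
-- Initial algebra of F(X) = 1 + (1 + M) × X + (1 + T) × X², read as open
-- expression trees over Möbius maps (M) and tensors (T):
--   leaf   : the `1` summand
--   unary  : the `(1 + M) × X` summand, with `1 + M` rendered as `Option M`
--   binary : the `(1 + T) × X²` summand, with two subtrees
inductive ExprTree (M T : Type) : Type where
  | leaf   : ExprTree M T
  | unary  : Option M → ExprTree M T → ExprTree M T
  | binary : Option T → ExprTree M T → ExprTree M T → ExprTree M T
```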
... The only way to formalise a partial function is to change either the domain or the codomain of the function and formalise an extensionally equal function. One approach to formalising partial functions is the inductive domain predicate which is developed for formalising partial or nonstructurally recursive functions while still keeping the general shape (and hence complexity) of the function intact [9], [4]. According to this method one can define the proof obligation to be an inductively defined predicate. ...
Article
Full-text available
In this work we focus on a formalisation of the algorithms of lazy exact arithmetic à la Edalat–Potts in type theory. We choose constructive type theory extended with coinductive types as our formal verification tool. We show examples of how infinite objects such as streams and expression trees can be formalised as coinductive types. We study the type-theoretic notion of productivity, which ensures the infiniteness of the outcome of the algorithms on infinite objects. Syntactical methods are not always strong enough to ensure productivity. However, if some information about the complexity of a function is provided, one may be able to show the productivity of that function. In the case of the normalisation algorithm we show that such information can be obtained from the choice of real number representation that is used to represent the input and the output. Furthermore, we show how to employ this semantic information to formalise a restricted case of the normalisation algorithm in our type theory.
... In chapter 3 we prove normalisation for a system based on combinators. I introduce the method for proving normalisation results (Big-Step Normalisation) and the Bove-Capretta technique [20] for dealing with nested recursive functions in type theory. I choose the simple system of combinators for pedagogical reasons. ...
... 5. Use the Bove-Capretta technique [20] to yield a structurally recursive normaliser using our original definition and the termination proof. ...
... We then show termination in terms of the big-step semantics. Having done this, we can use our termination proof to define a structurally recursive version of our normaliser using the Bove-Capretta technique [20]. Before proceeding we carry out the whole process for a simpler example. ...
Article
This thesis is about Martin-Löf's intuitionistic theory of types (type theory). Type theory is at the same time a formal system for mathematical proof and a dependently typed programming language. Dependent types are types which depend on data, and therefore to type check dependently typed programs we need to perform computation (normalisation) in types. Implementations of type theory (usually some kind of automatic theorem prover or interpreter) have at their heart a type checker. Implementations of type checkers for type theory have at their heart a normaliser. In this thesis I consider type checking as it might form the basis of an implementation of type theory in the functional language Haskell, and then normalisation in the more rigorous setting of the dependently typed languages Epigram and Agda. I investigate a method of proving normalisation called Big-Step Normalisation (BSN). I apply BSN to a number of calculi of increasing sophistication and provide machine checked proofs of meta theoretic properties.
... The usual extensions of such methods require the induction-recursion scheme [Dybjer 2000], where a function can be defined simultaneously with an inductive predicate, a technique not available in Coq. To overcome this difficulty, we use a variant of the Bove-Capretta method [Bove 2009; Bove and Capretta 2001] of encoding partial functions in constructive type theory, where an auxiliary graph-based definition of the encoded function is used. ...
... Additionally, we need to encode recursion properly so that Coq can verify that it is terminating. To this end, we use the Bove-Capretta method [Bove and Capretta 2001], where each function has a dedicated inductive accessibility predicate whose definition reflects the structure of the recursive calls of the function. Computations (the Set universe) and logic (the Prop universe) are strictly separated in the Coq system, and it is impossible to perform such recursion over a proof term directly. ...
Conference Paper
We present a Coq formalization of the normalization-by-evaluation algorithm for Martin-Löf dependent type theory with one universe and judgmental equality. The end results of the formalization are certified implementations of a reduction-free normalizer and of a decision procedure for term equality. The formalization takes advantage of a graph-based variant of the Bove-Capretta method to encode mutually recursive evaluation functions with nested recursive calls. The proof of completeness, which uses the PER-model of dependent types, is formalized by relying on impredicativity of the Coq system rather than on the commonly used induction-recursion scheme which is not available in Coq. The proof of soundness is formalized by encoding logical relations as partial functions.
... This topic is also treated in [214]. An in-depth discussion of the various ways to treat computations in theorem provers is given in [32] and further related work is presented in [54]. ...
... Several proposals have been made to alleviate the restrictions posed by structural recursion (without giving up the decidability of type checking). The 'Bove-Capretta' approach of [54], jointly developed in Nijmegen (Capretta) and Gothenburg (Bove), has been very successful, as it succeeds in separating the definition of a function, which is done very much in a functional programming style, from the proof that it terminates, which can be postponed until later. It also provides a way of dealing with partial functions. ...
... We introduce a simply-typed λ-calculus with explicit substitution and βη-equality in section 2. We then implement a recursive normalisation function in partial Type Theory in section 3. Using a technique introduced in (Bove & Capretta, 2001), we use a relational presentation, i.e. a big-step reduction relation of the partial functions in total Type Theory, to be able to characterize the graph of our normalisation function in section 4. Using a variant of strong computability (Tait, 1967), incorporating Kripke logical predicates, we then show that our partial normalisation function terminates and returns a result convertible to the input in section 5. It remains to show soundness; we do this using Kripke logical relations in section 6. ...
... The functions defined in the previous section are not structurally recursive and hence it is not obvious how to implement them in total Type Theory. To bridge this gap we exploit a technique pioneered in (Bove & Capretta, 2001): we inductively define the graph of our function and then show that the graph is total, i.e. for every input there exists an output. We can use this proof to actually run our function without having to employ a choice principle, i.e. ...
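The graph technique can be shown on a deliberately tiny example of our own choosing (not the cited normaliser): the nested function given by nest 0 = 0 and nest (n+1) = nest (nest n). In Lean 4 notation, the graph is an inductive relation and totality is a separate theorem, which can then be used to run the function:

```lean
-- Inductive graph of nest: `NestGraph n r` means "nest n evaluates to r".
-- The nested recursive call appears as two premisses chained through the
-- intermediate result r.
inductive NestGraph : Nat → Nat → Prop where
  | zero : NestGraph 0 0
  | succ {n r r' : Nat} : NestGraph n r → NestGraph r r' → NestGraph (n + 1) r'

-- Totality of the graph: every input is related to an output (always 0 here).
theorem nest_total : ∀ n, NestGraph n 0
  | 0     => .zero
  | n + 1 => .succ (nest_total n) .zero
```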
Article
Full-text available
Traditionally, decidability of conversion for typed λ-calculi is established by showing that small-step reduction is confluent and strongly normalising. Here we investigate an alternative approach employing a recursively defined normalisation function which we show to be terminating and which reflects and preserves conversion. We apply our approach to the simply typed λ-calculus with explicit substitutions and βη-equality, a system which is not strongly normalising. We also show how the construction can be extended to system T with the usual β-rules for the recursion combinator. Our approach is practical, since it does verify an actual implementation of normalisation which, unlike normalisation by evaluation, is first order. An important feature of our approach is that we are using logical relations to establish equational soundness (identity of normal forms reflects the equational theory), instead of the usual syntactic reasoning using the Church–Rosser property of a term rewriting system.
... Bove and Capretta [6] use indexed inductive-recursive definitions in their analysis of the termination of functions defined by nested general recursion. Given such a function f, the idea is to simultaneously define a predicate D(x), expressing that f(x) terminates, and a function f(x, p), which returns the same value as f(x) but takes a proof p : D(x) that f(x) terminates as a second argument. ...
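For the standard nest example (nest 0 = 0, nest (S n) = nest (nest n)), the simultaneous definition of D and f described in this excerpt takes roughly the following shape; this transcription is ours, not a quotation:

```latex
\[
\begin{array}{ll}
  \mathsf{d_0} : D(0) \qquad &
  \mathsf{d_S} : (p : D(n)) \to D(f(n,p)) \to D(S\,n) \\[4pt]
  f(0,\ \mathsf{d_0}) = 0 &
  f(S\,n,\ \mathsf{d_S}\ n\ p\ q) = f(f(n,p),\ q)
\end{array}
\]
```

The second constructor mentions f itself, which is why the predicate and the function must be defined simultaneously.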
... Note that the type of h(i, a, v) is ... Again, we are using uncurried versions of intro_r. ...
Article
We give two finite axiomatizations of indexed inductive-recursive definitions in intuitionistic type theory. They extend our previous finite axiomatizations of inductive-recursive definitions of sets to indexed families of sets and encompass virtually all definitions of sets which have been used in intuitionistic type theory. The more restricted of the two axiomatizations arises naturally by considering indexed inductive-recursive definitions as initial algebras in slice categories, whereas the other admits a more general and convenient form of an introduction rule. The class of indexed inductive-recursive definitions properly contains the class of indexed inductive definitions (so called "inductive families"). Such definitions are ubiquitous when using intuitionistic type theory for formalizing mathematics and program correctness. A side effect of the present paper is to obtain compact finite axiomatizations of indexed inductive definitions in intuitionistic type theory as special cases. Proper indexed inductive-recursive definitions (those which do not correspond to indexed inductive definitions) are useful in intuitionistic metamathematics, and as an example we formalize Tait-style computability predicates for dependent types. We also show that Palmgren's proof-theoretically strong construction of higher-order universes is an example of a proper indexed inductive-recursive definition. A third interesting example is provided by Bove and Capretta's definition of termination predicates for functions defined by nested recursion. Our axiomatizations form a powerful foundation for generic programming with dependent types by introducing a type of codes for indexed inductive-recursive definitions and making it possible to define generic functions by recursion on this type.
... In Coq, the method was implemented in [33,4]; see also [12]. The method of using accessibility predicates was improved by Bove in her thesis [15] and in a series of papers [14,17,16,18]. The core of the improvement proposed by Bove was to separate the computational and logical parts of the definitions of general recursive algorithms. ...
Preprint
In Constructive Type Theory, recursive and corecursive definitions are subject to syntactic restrictions which guarantee termination for recursive functions and productivity for corecursive functions. However, many terminating and productive functions do not pass the syntactic tests. Bove proposed in her thesis an elegant reformulation of the method of accessibility predicates that widens the range of terminating recursive functions formalisable in Constructive Type Theory. In this paper, we pursue the same goal for productive corecursive functions. Notably, our method of formalisation of coinductive definitions of productive functions in Coq requires not only the use of ad-hoc predicates, but also a systematic algorithm that separates the inductive and coinductive parts of functions.
... Progress determines whether we are done or should take another step; preservation provides evidence that the new term is well-typed, so we may iterate. In a language with guaranteed termination such as Agda, we cannot iterate forever, but there are a number of well-known techniques to address that issue; see, e.g., Bove and Capretta (2001), Capretta (2005), or McBride (2015). We use the simplest, similar to McBride's petrol-driven (or step-indexed) semantics: provide a maximum number of steps to execute; if that number proves insufficient, the evaluator returns the term it reached, and one can resume execution by providing a new number. ...
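The petrol-driven idea mentioned here can be sketched generically; the following Lean 4 fragment is our illustration (the names `run`, `step`, and `fuel` are ours), with `step` an assumed one-step reducer that returns `none` at a normal form:

```lean
-- Step-indexed ("petrol-driven") iteration: take at most `fuel` steps and
-- return the term reached, so evaluation is total even when reduction is not.
def run {α : Type} (step : α → Option α) : Nat → α → α
  | 0,        t => t
  | fuel + 1, t =>
    match step t with
    | none    => t               -- normal form reached: stop early
    | some t' => run step fuel t'
```

If the fuel runs out, the caller can resume by invoking `run` again with fresh fuel on the returned term, exactly as the excerpt describes.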
Article
One of the leading textbooks for formal methods is Software Foundations (SF), written by Benjamin Pierce in collaboration with others, and based on Coq. After five years using SF in the classroom, we came to the conclusion that Coq is not the best vehicle for this purpose, as too much of the course needs to focus on learning tactics for proof derivation, to the cost of learning programming language theory. Accordingly, we have written a new textbook, Programming Language Foundations in Agda (PLFA). PLFA covers much of the same ground as SF, although it is not a slavish imitation. What did we learn from writing PLFA? First, that it is possible. One might expect that without proof tactics the proofs would become too long, but in fact proofs in PLFA are about the same length as those in SF. Proofs in Coq require an interactive environment to be understood, while proofs in Agda can be read on the page. Second, that constructive proofs of preservation and progress give immediate rise to a prototype evaluator. This fact is obvious in retrospect but it is not exploited in SF (which instead provides a separate normalise tactic), nor can we find it in the literature. Third, that using extrinsically-typed terms is far less perspicuous than using intrinsically-typed terms. SF uses the former presentation, while PLFA presents both; the former uses about 1.6 times as many lines of Agda code as the latter, roughly the golden ratio. The textbook is written as a literate Agda script, and can be found here: http://plfa.inf.ed.ac.uk
... We note that, computationally, the type-theoretic implementation of nf behaves the same as the recursive implementation since we have only added propositional arguments which cannot affect the computation. This could be made precise by using a compiler based on [9], which eliminates arguments not relevant at run-time. ...
Conference Paper
Full-text available
We present a Tait-style proof to show that a simple functional normaliser for a combinatory version of System T terminates. Using a technique pioneered by Bove and Capretta, we can implement the normaliser in total Type Theory. The main interest in our construction is methodological, it is an alternative to the usual small-step operational semantics on the one side and normalisation by evaluation on the other. The present work is motivated by our longer term goal to verify implementations of Type Theory such as Epigram.
... Some standard data structures, for instance leftist heaps and red-black trees [17 for example], have a natural inductive-recursive structure; this will be the topic of a forthcoming article. Ana Bove and I [2] used it to implement nested recursive functions simultaneously with their domain predicate. ...
Article
Wander types are a coinductive version of inductive-recursive definitions. They are defined by simultaneously specifying the constructors of the type and a function on the type itself. The types of the constructors can refer to the function component and the function itself is given by pattern matching on the constructors. Wander types are different from inductive-recursive types in two ways: the structure of the elements is not required to be well-founded, so infinite applications of the constructors are allowed; and the recursive calls in the definition of the function are not required to be on structurally smaller arguments. Wander types generalize several known type formers. We can use the functional component to control the way the data branch. This allows not only the implementation of coinduction, but also of induction, by imposing well-foundedness through an appropriate function definition. Special instances of wander types are: plain inductive and coinductive types, inductive-recursive types, mixed inductive-coinductive types, continuous stream processors.
... The same translation can be applied to the Bove-Capretta method to untangle the definition of the function from the accessibility predicate. We illustrate it on the most simple example, the nest function [2], which is defined using induction-recursion as nest 0 = 0, nest (S n) = nest (nest n). The original domain predicate, function, and constructors for the predicate can be defined as:

D_nest n = Σ z : (D*_nest n) (Nest_check n z)
nest n h = nest* n (π₁ h)
nest_0 = ⟨nest*_0, 0_check⟩
nest_S n h₁ h₂ = ⟨nest*_S D*_nest (λx.λq.q) (nest* n (π₁ h₁)) (π₁ h₂), S_check n (π₁ h₁) (π₂ h₁) (π₁ h₂) (π₂ h₂)⟩

and the validity of the recursion equations for nest can be verified. ...
Article
Full-text available
The family T does not depend on the function g, because all calls to it have been replaced by f, and is not inductive, because all recursive occurrences of T in the constructors have been replaced by Y. Therefore, the definition is a standard polymorphic definition in the Calculus of Inductive Constructions, and g is a standard function on it defined by case analysis. However, we have enlarged the set of elements of T by polymorphically generalizing T to a variable Y. We can restrict it to the original desired elements by an inductive predicate on T. We illustrate the method with a simple example. More examples follow the exposition of the general method.
... The only elements that come into play in the representation of beliefs are, on the one hand, knowledge of the structure of the system (an ontology of the possible descriptions) and, on the other hand, a notion of partial order on these descriptions that allows them to be compared according to the amount of information they provide. Viewing physical objects as essential elements of physical processes, and building on the results of recent work in natural language processing (Boldini, 2000; Ginzburg, 2005; Cooper, 2005; Ranta, 2004), in program certification (Bove & Capretta, 2001; Coquand & Coquand, 1999; Paulson, 1989) and, more recently, in the Semantic Web (Halpin & Thompson, 2006), we have proposed a model using intuitionistic type theory (Barlatier & Dapoigny, 2007; Dapoigny & Barlatier, 2006, 2007b) to model contexts, actions and effects. The representation of process contexts is described by dependent record types. ...
Article
Full-text available
Abstract: This paper proposes a new semantics for planning logic, in which a goal is considered as a function of its context (the context being a knowledge structure over the domain). Given a global goal and an initial situation, the planning model is generated directly from a structured set of primitive goals by reasoning on goal types and domain knowledge. The latter is extracted from a local domain ontology by a precise and ordered selection of the available entities and their relations. Formally, this selection is modelled by the dependent record types of intuitionistic type theory (ITT). Thus, using a theorem prover based on this theory, it is possible to check the validity (more precisely, the type) of a set of instantiated objects representing the knowledge and the current state of the system. This identification by a type corresponds to partial domain knowledge associated with a goal in order to produce an action.
... Contrary to our work, this approach does not support code generation for partial functions (except tail-recursive ones) and does not support corecursion. The technique of recursion on an ad-hoc predicate, which consists in defining a function by structural induction on an inductive predicate that describes its domain, was suggested by Dubois and Donzeau-Gouge [9] and developed by Bove and Capretta [7]. Later, Barthe et al. [2] used it in the implementation of a tool for Coq. ...
Article
In this paper, we develop a general theory of fixed point combinators, in higher-order logic equipped with Hilbert's epsilon operator. This combinator allows for a direct and effective formalization of corecursive values, recursive and corecursive functions, as well as functions mixing recursion and corecursion. It supports higher-order recursion, nested recursion, and offers a proper treatment of partial functions in the sense that domains need not be hardwired in the definition of functionals. Our work, which has been entirely implemented in Coq, unifies and generalizes existing results on contraction conditions and complete ordered families of equivalences, and relies on the theory of optimal fixed points for the treatment of partial functions. It provides a practical way to formalize circular definitions in higher-order logic.
... Doornbos and Backhouse [8] have asked the question under what conditions the hylo diagram has a unique solution. In type theory, structured (co)recursion schemes for initial algebras (final coalgebras) have been studied by, e.g., Giménez [15,16], and (co)recursion more generally by, e.g., Bove and Capretta [5,6] and McBride and McKinna [17]. Organization of the paper: In Section 2, we explain our motivation for studying recursive coalgebras and give the definition. ...
Article
The concept of recursive coalgebra of a functor was introduced in the 1970s by Osius in his work on categorical set theory to discuss the relationship between wellfounded induction and recursively specified functions. In this paper, we motivate the use of recursive coalgebras as a paradigm of structured recursion in programming semantics, list some basic facts about recursive coalgebras and, centrally, give new conditions for the recursiveness of a coalgebra based on comonads, comonad-coalgebras and distributive laws of functors over comonads. We also present an alternative construction using countable products instead of cofree comonads.
... In Coq, the method was implemented in [33,4]; see also [12]. The method of using accessibility predicates was improved by Bove in her thesis [15] and in a series of papers [14,17,16,18]. The core of the improvement proposed by Bove was to separate the computational and logical parts of the definitions of general recursive algorithms. ...
Article
Full-text available
In Constructive Type Theory, recursive and corecursive definitions are subject to syntactic restrictions which guarantee termination for recursive functions and productivity for corecursive functions. However, many terminating and productive functions do not pass the syntactic tests. Bove proposed in her thesis an elegant reformulation of the method of accessibility predicates that widens the range of terminating recursive functions formalisable in Constructive Type Theory. In this paper, we pursue the same goal for productive corecursive functions. Notably, our method of formalisation of coinductive definitions of productive functions in Coq requires not only the use of ad-hoc predicates, but also a systematic algorithm that separates the inductive and coinductive parts of functions.
... Doornbos and Backhouse [8] have asked the question under what conditions the hylo diagram has a unique solution. In type theory, structured (co)recursion schemes for initial algebras (final coalgebras) have been studied by, e.g., Giménez [15,16], and (co)recursion more generally by, e.g., Bove and Capretta [5,6] and McBride and McKinna [17]. Organization of the paper: In Section 2, we explain our motivation for studying recursive coalgebras and give the definition. ...
Article
Full-text available
The concept of recursive coalgebra of a functor was introduced in the 1970s by Osius in his work on categorical set theory to discuss the relationship between wellfounded induction and recursively specified functions. In this paper, we motivate the use of recursive coalgebras as a paradigm of structured recursion in programming semantics, list some basic facts about recursive coalgebras and, centrally, give new conditions for the recursiveness of a coalgebra based on comonads, comonad-coalgebras and distributive laws of functors over comonads. We also present an alternative construction using countable products instead of cofree comonads.
... This topic is also treated in [99]. An in-depth discussion of the various ways to treat computations in theorem provers is given in [19] and further related work is presented in [36]. The Calculemus RTN has also studied other approaches to theorem proving and their capacities to integrate computations (see also [123]). ...
... The analysis of contextual reasoning requires a sound and expressive formalism. Widely used in Natural Language Processing (Boldini 2000; Ranta 2004) and in programming languages (Bove & Capretta 2001; Paulson 1989; Coquand & Coquand 1999), ITT (Martin-Löf 1982) has been proven appropriate to support the linguistic nature of physical situations. We extend these works with the description of contexts. ...
Conference Paper
Full-text available
The context paradigm emerges from different areas of Artificial Intelligence. However, while significant formalizations have been proposed, contexts are either mapped on independent micro-theories or considered as different concurrent viewpoints with mappings between contexts to export/import knowledge. These logical formalisms focus on the semantic level and do not take into account dynamic low-level information such as that available from sensors. This information is a key element of contexts in pervasive computing environments. In this paper, we introduce a formal framework where the knowledge representation of context bridges the gap between semantic high-level and low-level knowledge. The logical reasoning, based on intuitionistic type theory and the Curry-Howard isomorphism, is able to incorporate expert knowledge as well as technical resources such as task properties. Based on our context model, we also present the foundations of a Context-Aware architecture (Softweaver) for building context-aware services.
... So we are led to look for a different implementation of polynomials whose structure reflects the steps of the algorithm. A general method to obtain a data type whose structure is derived from the recursive definition of a function is presented in [7]. In our case we obtain the result by representing polynomials as tree structures in which the left subtree contains the even coefficients and the right subtree contains the odd coefficients. ...
Conference Paper
Full-text available
We program the Fast Fourier Transform in type theory, using the tool Coq. We prove its correctness and the correctness of the Inverse Fourier Transform. A type of trees representing vectors with interleaved elements is defined to facilitate the definition of the transform by structural recursion. We define several operations and proof tools for this data structure, leading to a simple proof of correctness of the algorithm. The inverse transform, on the other hand, is implemented on a different representation of the data, that makes reasoning about summations easier. The link between the two data types is given by an isomorphism. This work is an illustration of the two-level approach to proof development and of the principle of adapting the data representation to the specific algorithm under study. CtCoq, a graphical user interface of Coq, helped in the development. We discuss the characteristics and usefulness of this tool.
... The formalisation of mutually recursive functions usually presents no major problems in systems like Coq, but on the other hand, nested functions are not trivial to formalise. Already our previous work (using the domain predicates) on nested recursive functions [5] could not directly be translated into Coq because the proof assistant lacks support for inductive-recursive definitions, as described by Dybjer in [10]. As can be seen from the proofs, the method is general and can be adapted to every simple recursive program. ...
Conference Paper
Full-text available
We describe a new method to represent (partial) recursive functions in type theory. For every recursive definition, we define a co-inductive type of prophecies that characterises the traces of the computation of the function. The structure of a prophecy is a possibly infinite tree, which is coerced by linearisation to a type of partial results defined by applying the delay monad to the co-domain of the function. Using induction on a weight relation defined on the prophecies, we can reason about them and prove that the formal type-theoretic version of the recursive function, resulting from the present method, satisfies the recursive equations of the original function. The advantages of this technique over the method previously developed by the authors via a special-purpose accessibility (domain) predicate are: there is no need of extra logical arguments in the definition of the recursive function; the function can be applied to any element in its domain, regardless of termination properties; we obtain a type of partial recursive functions between any two given types; and composition of recursive functions can be easily defined.
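The type of partial results mentioned in this abstract, obtained "by applying the delay monad to the co-domain of the function", is in essence the standard coinductive type of possibly non-terminating computations. A minimal Coq sketch of such a type (the names `Delay`, `Now`, `Later` and `never` are our own illustration, not the paper's code):

```coq
(* Minimal sketch of a delay monad: a result now, or one step later. *)
CoInductive Delay (A : Type) : Type :=
  | Now   : A -> Delay A
  | Later : Delay A -> Delay A.

Arguments Now {A} _.
Arguments Later {A} _.

(* The everywhere-diverging computation is a legal (guarded)
   corecursive value; it represents non-termination. *)
CoFixpoint never {A : Type} : Delay A := Later never.
```

A terminating computation is a value wrapped in finitely many `Later` steps; divergence is an infinite tower of `Later`s, so partial functions into `A` can be modelled as total functions into `Delay A`.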
... This follows Giesl's approach [22,23], but it does not require a new notion of truth, since all constructions happen in standard HOL. Bove and Capretta [11] can only support nested recursion by defining the domain and the function simultaneously, which requires extending the underlying theory to support simultaneous inductive-recursive definitions as described by Dybjer [20]. In contrast, our classical setting avoids this issue and works in standard higher-order logic, since the domain is not required for the function definition but only introduced for the purpose of convenient reasoning. ...
Article
Based on inductive definitions, we develop a tool that automates the definition of partial recursive functions in higher-order logic (HOL) and provides appropriate proof rules for reasoning about them. Termination is modeled by an inductive domain predicate which follows the structure of the recursion. Since a partial induction rule is available immediately, partial correctness properties can be proved before termination is established. It turns out that this modularity also facilitates termination arguments for total functions, in particular for nested recursions. Our tool is implemented as a definitional package extending Isabelle/HOL. Various extensions provide convenience to the user: pattern matching, default values, tail recursion, mutual recursion and currying.
... There, besides presenting the formal definition of the function, we also show how to prove some of its properties. For further reading on the method, including its limitations, the reader is referred to [5,6,4], where the method is presented and exemplified for a constructive type theory in the spirit of Martin-Löf's type theory. For the applicability of the method to the Calculus of Inductive Constructions (CIC), see Chapter 15 of Bertot and Castéran's book [3]. ...
Article
Full-text available
Bove and Capretta have presented a method to deal with partial and general recursive functions in constructive type theory which relies on an inductive characterisation of the domains of the functions. The method separates the logical and the computational aspects of an algorithm, and facilitates the formal verification of the functions being defined. For nested recursive functions, the method uses Dybjer's schema for simultaneous inductive-recursive definitions. However, not all constructive type theories support this kind of definition. Here we present a new approach for dealing with partial and general recursive functions that preserves the advantages of the method by Bove and Capretta, but which does not rely on inductive-recursive definitions. In this approach, we start by inductively defining the graph of the function, from which we first define the domain and afterwards the type-theoretic version of the function. We show two ways of proving the formal specification of the functions defined with this new approach: by induction on the graph, or by using an induction principle in the style of the induction principle associated with the domain predicates of the Bove-Capretta method.
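The graph-first variant described in this abstract can be sketched in Coq for McCarthy's 91 function, a standard example of nested recursion. This is our own illustration, not the paper's code; all names are hypothetical.

```coq
(* Sketch: the graph of McCarthy's 91 function
     f n = n - 10            if n > 100
     f n = f (f (n + 11))    otherwise
   as an inductive relation; the domain is then its projection. *)
Inductive f91_graph : nat -> nat -> Prop :=
  | f91_gt : forall n,
      100 < n -> f91_graph n (n - 10)
  | f91_le : forall n r1 r2,
      n <= 100 ->
      f91_graph (n + 11) r1 ->   (* inner call *)
      f91_graph r1 r2 ->         (* outer call, fed the inner result *)
      f91_graph n r2.

Definition f91_dom (n : nat) : Prop := exists r, f91_graph n r.
```

The nested (outer) call is phrased in terms of the result `r1` recorded in the graph, which is why the graph can be defined by a plain inductive definition, whereas a direct domain predicate for a nested function would have to mention the function itself and thus require induction-recursion.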
... One way to represent partial functions, pursued by Ana Bove and myself in a series of papers [14,15], is by a pair of an inductive domain predicate and a (total) function on the elements that satisfy it. First of all, characterise those streams on which min terminates, by an inductive predicate: The domain is an inductive predicate with two constructors. ...
Article
Full-text available
This is a survey article on the use of coalgebras in functional programming and type theory. It presents the basic theory underlying the implementation of coinductive types, families and predicates. It gives an overview of the application of corecursive methods to the study of general recursion, formal power series, tabulations of functions on inductive data. It also sketches some advanced topics in the study of the solutions to non-guarded corecursive equations and the design of non-standard type theory.
... This is one of the main difficulties of the verification process; in Coq, formalising non-structural yet terminating recursion is possible in various ways. But all of the methods either require a priori knowledge of the algorithm's complexity (for example Balaa and Bertot's method [1]) or lead to very large proof terms by changing the representation of the function's domain (for example Bove and Capretta's method [5]). In Sect. ...
Conference Paper
In this paper we present the formalisation of the library which is an implementation of rational numbers as binary sequences for both lazy and strict computation. We use the representation also known as the Stern-Brocot representation for rational numbers. This formalisation uses advanced machinery of the theorem prover and applies recent developments in formalising general recursive functions. This formalisation highlights the rôle of type theory both as a tool to verify hand-written programs and as a tool to generate verified programs.
Chapter
One of the leading textbooks for formal methods is Software Foundations (SF), written by Benjamin Pierce in collaboration with others, and based on Coq. After five years using SF in the classroom, I have come to the conclusion that Coq is not the best vehicle for this purpose, as too much of the course needs to focus on learning tactics for proof derivation, to the cost of learning programming language theory. Accordingly, I have written a new textbook, Programming Language Foundations in Agda (PLFA). PLFA covers much of the same ground as SF, although it is not a slavish imitation. What did I learn from writing PLFA? First, that it is possible. One might expect that without proof tactics the proofs would become too long, but in fact proofs in PLFA are about the same length as those in SF. Proofs in Coq require an interactive environment to be understood, while proofs in Agda can be read on the page. Second, that constructive proofs of preservation and progress give immediate rise to a prototype evaluator. This fact is obvious in retrospect but it is not exploited in SF (which instead provides a separate normalise tactic) nor can I find it in the literature. Third, that using raw terms with a separate typing relation is far less perspicuous than using inherently-typed terms. SF uses the former presentation, while PLFA presents both; the former uses about 1.6 times as many lines of Agda code as the latter, roughly the golden ratio. The textbook is written as a literate Agda script, and can be found here: http://plfa.inf.ed.ac.uk.
Conference Paper
We present a Coq formalization of the normalization-by-evaluation algorithm for Martin-Löf dependent type theory with one universe and judgmental equality. The end results of the formalization are certified implementations of a reduction-free normalizer and of a decision procedure for term equality. The formalization takes advantage of a graph-based variant of the Bove-Capretta method to encode mutually recursive evaluation functions with nested recursive calls. The proof of completeness, which uses the PER-model of dependent types, is formalized by relying on impredicativity of the Coq system rather than on the commonly used induction-recursion scheme which is not available in Coq. The proof of soundness is formalized by encoding logical relations as partial functions.
Conference Paper
There are several different approaches to the theory of data types. At the simplest level, polynomials and containers give a theory of data types as free standing entities. At a second level of complexity, dependent polynomials and indexed containers handle more sophisticated data types in which the data have associated indices that can be used to store important computational information. The crucial and salient feature of dependent polynomials and indexed containers is that the index types are defined in advance of the data. At the most sophisticated level, induction-recursion allows us to define data and indices simultaneously. This work investigates the relationship between the theory of small inductive recursive definitions and the theory of dependent polynomials and indexed containers. Our central result is that the expressiveness of small inductive recursive definitions is exactly the same as that of dependent polynomials and indexed containers. A second contribution of this paper is the definition of morphisms of small inductive recursive definitions. This allows us to extend our main result to an equivalence between the category of small inductive recursive definitions and the category of dependent polynomials/indexed containers. We comment on both the theoretical and practical ramifications of this result.
Article
The basic goal of context-aware systems is to make software aware of the environment and to adapt to their changing context. For that purpose, the core problem is to have a powerful context model. While significant formalizations have been proposed, context models are either expressed through logical formalisms or with ontology-based approaches. The major problem with all these approaches is that they suffer from the chronic insufficiency of first-order logic to cope with dynamic change and, especially, to solve the frame problem. Therefore, building context-aware software is a complex task due to a lack of appropriate formal models in dynamic environments. In this paper, we propose a model which combines the strengths of both approaches while trying not to carry their specific weaknesses into the resulting formal framework. For this purpose, the formal model relies both on a knowledge representation with ontologies and on a logical reasoning with Dependent Record Types (DRT) based on Intuitionistic Type Theory and the Curry-Howard isomorphism. This logic modelling aims to be applied to any kind of process-based application.
Chapter
The context paradigm emerges from different areas of Artificial Intelligence (AI). However, while significant formalizations have been proposed, contexts are either mapped on independent micro-theories or considered as different concurrent viewpoints with mappings between contexts to export/import knowledge. These logical formalisms focus on the semantic level and do not take into account dynamic low-level information such as that available from sensors via physical variables. This information is a key element of contexts in pervasive computing environments. In this paper, we introduce a formal framework where the knowledge representation of context bridges the gap between semantic high-level and low-level knowledge. The logical reasoning based on intuitionistic type theory and the Curry-Howard isomorphism is able to incorporate expert knowledge as well as technical resources such as computing variable properties.
Conference Paper
In this work, a method to formalise general recursive algorithms in constructive type theory is presented through examples. The method separates the computational and logical parts of the definitions. As a consequence, the resulting type-theoretic algorithms are clear, compact and easy to understand. They are as simple as their equivalents in a functional programming language, where there is no restriction on recursive calls. Given a general recursive algorithm, the method consists in defining an inductive special-purpose accessibility predicate that characterises the inputs on which the algorithm terminates. The type-theoretic version of the algorithm can then be defined by structural recursion on the proof that the input values satisfy this predicate. When formalising nested algorithms, the special-purpose accessibility predicate and the type-theoretic version of the algorithm must be defined simultaneously because they depend on each other. Since the method separates the computational part from the logical part of a definition, formalising partial functions also becomes possible.
Conference Paper
Full-text available
We show that certain input-output relations, termed inductive invariants, are of central importance for termination proofs of algorithms defined by nested recursion. Inductive invariants can be used to enhance recursive function definition packages in higher-order logic mechanizations. We demonstrate the usefulness of inductive invariants on a large example of the BDD algorithm Apply. Finally, we introduce a related concept of inductive fixpoints with the property that for every functional in higher-order logic there exists a largest partial function that is such a fixpoint.
Conference Paper
Nuprl supports program synthesis by extracting programs from proofs. In this paper we describe the extraction of “efficient” recursion schemes from proofs of well-founded induction principles. This is part of a larger methodology; when these well-founded induction principles are used in proofs, the structure of the program extracted from the proof is determined by the recursion scheme inhabiting the induction principle. Our development is based on Paulson’s paper Constructing recursion operators in intuitionistic type theory, but we specifically address two possibilities raised in the conclusion of his paper: the elimination of non-computational content from the recursion schemes themselves and, the use of the Y combinator to allow the recursion schemes to be extracted directly from the proofs of well-founded relations.
Conference Paper
Full-text available
We describe a package to reason efficiently about executable specifications in Coq. The package provides a command for synthesizing a customized induction principle for a recursively defined function, and a tactic that combines the application of the customized induction principle with automatic rewriting. We further illustrate how the package leads to a drastic reduction (by a factor of approximately 10) of the size of the proofs in a large-scale case study on reasoning about JavaCard.
Conference Paper
In this paper, we develop a general theory of fixed point combinators, in higher-order logic equipped with Hilbert’s epsilon operator. This combinator allows for a direct and effective formalization of corecursive values, recursive and corecursive functions, as well as functions mixing recursion and corecursion. It supports higher-order recursion, nested recursion, and offers a proper treatment of partial functions in the sense that domains need not be hardwired in the definition of functionals. Our work, which has been entirely implemented in Coq, unifies and generalizes existing results on contraction conditions and complete ordered families of equivalences, and relies on the theory of optimal fixed points for the treatment of partial functions. It provides a practical way to formalize circular definitions in higher-order logic.
Conference Paper
In this article we present a method for formally proving the correctness of the lazy algorithms for computing homographic and quadratic transformations, of which field operations are special cases, on a representation of real numbers by coinductive streams. The algorithms work on coinductive streams of Möbius maps and form the basis of Edalat-Potts exact real arithmetic. We build upon our earlier work of formalising the homographic and quadratic algorithms in constructive type theory via general corecursion. Based on the notion of cofixed point equations for general corecursive definitions we prove by coinduction the correctness of the algorithms. We use the machinery of the Coq proof assistant for coinductive types to present the formalisation. The material in this article is fully formalised in the Coq proof assistant.
Conference Paper
Surreal Numbers form a totally ordered (commutative) Field, containing copies of the reals and (all) the ordinals. I have encoded most of the Ring structure of surreal numbers in Coq. This encoding relies on Aczel’s encoding of set theory in type theory. This paper discusses in particular the definitional or proving points where I had to diverge from Conway’s or the most natural way, like separation of simultaneous induction-recursion into two inductions, transforming the definition of the order into a mutually inductive definition of “at most” and “at least” and fitting the rather complicated induction/recursion schemes into the type theory of Coq.
Conference Paper
The concept of goal is central in Artificial Intelligence and its modelling is a challenging issue. It has been given much attention in areas such as Requirement Engineering (RE) and Planning and Scheduling, where its modelling can support formal reasoning through goal types, goal attributes and relations to other components. However there is a lack of formalisms able to reason with goal structures in dynamic environments. We claim that a logical framework based on Intuitionistic Type Theory and more precisely, on Dependent Record Types is able to address this problem. The formal foundations rely on context modelling through dependent record types allowing partial knowledge and dynamic reasoning. For the purpose of goal modelling, we introduce a family of functions which map Context Record Types to Intentional Record Types expressing their related actions and goals. A case study in planning illustrates this approach.
Conference Paper
This paper develops machinery necessary to mechanically import arbitrary functional programs into Coq's type theory, manually strengthen their specifications with additional proofs, and then mechanically re-extract the newly-certified program in a form which is as efficient as the original program. In order to facilitate this goal, the coinductive technique of (Cap05) is modified to form a monad whose operators are the constructors of a coinductive type rather than functions defined over the type. The inductive invariant technique of (KM03) is extended to allow optional "after the fact" termination proofs. These proofs inhabit members of Prop, and therefore do not affect extracted code. Compared to (Cap05), the new monad makes it possible to directly represent unrestricted recursion without violating productivity requirements (Gim95), and it produces efficient code via Coq's extraction mechanism. The disadvantages of this technique include reliance on the JMeq axiom (McB00) and a significantly more complex notion of equality. The resulting technique is packaged as a Coq library, and is suitable for formalizing programs written in any side-effect-free functional language with call-by-value semantics.
Conference Paper
Proof animation is a way of executing proofs to find errors in the formalization of proofs. It is intended to be "testing in proof engineering". Although the realizability interpretation as well as the functional interpretation based on limit-computations were introduced as means for proof animation, they were unrealistic as an architectural basis for actual proof animation tools. We have found game theoretical semantics corresponding to these interpretations, which is likely to be the right architectural basis for proof animation.
Conference Paper
Full-text available
There is growing public concern about personal data collected by both private and public sectors. People have very little control over what kinds of data are stored and how such data is used. Moreover, the ability to infer new knowledge from existing data is increasing rapidly with advances in database and data mining technologies. We describe a solution which allows people to take control by specifying constraints on the ways in which their data can be used. User constraints are represented in formal logic, and organizations that want to use this data provide formal proofs that the software they use to process data meets these constraints. Checking the proof by an independent verifier demonstrates that user constraints are (or are not) respected by this software. Our notion of "privacy correctness" differs from general software correctness in two ways. First, properties of interest are simpler and thus their proofs should be easier to automate. Second, this kind of correctness is stricter; in addition to showing a certain relation between input and output is realized, we must also show that only operations that respect privacy constraints are applied during execution. We have therefore an intensional notion of correctness, rather than the usual extensional one. We discuss how our mechanism can be put into practice, and we present the technical aspects via an example. Our example shows how users can exercise control when their data is to be used as input to a decision tree learning algorithm. We have formalized the example and the proof of preservation of privacy constraints in Coq.
Conference Paper
Full-text available
In a series of articles, we developed a method to translate general recursive functions written in a functional programming style into constructive type theory. Three problems remained: the method could not properly deal with functions taking functional arguments, the translation of terms containing λ-abstractions was too strict, and partial application of general recursive functions was not allowed. Here, we show how the three problems can be solved by defining a type of partial functions between given types. Every function, including arguments to higher order functions, λ-abstractions and partially applied functions, is then translated as a pair consisting of a domain predicate and a function dependent on the predicate. Higher order functions are assigned domain predicates that inherit termination conditions from their functional arguments. The translation of a λ-abstraction does not need to be total anymore, but generates a local termination condition. The domain predicate of a partially applied function is defined by fixing the given arguments in the domain of the original function. As in our previous articles, simultaneous induction-recursion is required to deal with nested recursive functions. Since by using our method the inductive definition of the domain predicate can refer globally to the domain predicate itself, here we need to work in an impredicative type theory for the method to apply to all functions. However, in most practical cases the method can be adapted to work in a predicative type theory with type universes.
Article
The notion of pattern matching and its role in functional programming are discussed. The key feature of pattern matching in simply-typed languages is that it explains the structure of an arbitrary value in a datatype. Pattern matching analyzes constructor patterns on the left-hand sides of functional equations, and is defined by a subsystem of the operational semantics with hard-wired rules for computing substitutions from the pattern variables to values. Elementary pattern matching may be recast in abstract form, with a semantics given by translation.
Article
Full-text available
In this paper I present a partial formalisation of a normaliser for type theory in Agda [Ulf Norell. Agda 2, 2007. http://www.cs.chalmers.se/~ulfn/]; extending previous work on big-step normalisation [Thorsten Altenkirch and James Chapman. Big-Step Normalisation. Journal of Functional Programming, 2008. Special Issue on Mathematically Structured Functional Programming. To appear, Thorsten Altenkirch and James Chapman. Tait in one big step. In Workshop on Mathematically Structured Functional Programming, MSFP 2006, Kuressaare, Estonia, July 2, 2006, electronic Workshop in Computing (eWiC), Kuressaare, Estonia, 2006. The British Computer Society (BCS)]. The normaliser is written as an environment machine. Only the computational behaviour of the normaliser is presented, omitting details of termination.
Article
Full-text available
We propose to use a simple inductive type as a basis to represent the field of rational numbers. We describe the relation between this representation of numbers and the representation as fractions of non-zero natural numbers. The usual operations of comparison, multiplication, and addition are then defined in a naive way. The whole construction is used to build a model of the set of rational numbers as an ordered archimedian field. All constructions have been modeled and verified in the Coq proof assistant.
Article
The goal of this thesis is the development of an infrastructure for recursive function definitions in an interactive theorem prover in which recursion and pattern matching are not supported natively. We work in the context of higher-order logic (HOL). In the first part we develop a tool that automates function definitions and provides suitable proof rules for them. In contrast to existing approaches, our method also supports partial functions, which are modelled by means of an inductive domain predicate. An automatically generated induction rule allows partial correctness proofs independently of the termination of the function. This modular structure in particular eases the treatment of nested recursion. The second part deals with automatic termination proofs, in order to automatically discharge the proof obligations arising from function definitions. We use methods from the literature, which must however be adapted to the specific requirements of our setting. Our approach comprises a rule-based selection of measure functions, a simple control-flow analysis similar to the dependency-pairs method, and a variant of the size-change criterion with certificates. A formalisation of the full size-change criterion is also developed. In the third part we investigate how the pattern matching familiar from functional programming can be supported in HOL. We develop a very general form of pattern matching in the logic, which allows arbitrary terms as patterns and can be expressed in the logic by means of a special combinator. The consistency of the specification is ensured by dedicated proof obligations. We also study the problem of converting specifications with sequential pattern matching into pure equational specifications with as few equations as possible.
We draw a parallel to the problem of minimizing Boolean expressions in DNF and show that the minimization problem for patterns is S_2^P-complete. We also give a concrete algorithm for pattern minimization. As a further application of the new tools, we show how user-supplied induction schemes can be automatically reduced to simpler proof goals, which often makes the proof of such schemes largely automatic.
Conference Paper
Full-text available
Alf is an interactive proof editor. It is based on the idea that to prove a mathematical theorem is to build a proof object for the theorem. The proof object is directly manipulated on the screen, different manipulations correspond to different steps in the proof. The language we use is Martin-Löf's monomorphic type theory. This is a small functional programming language with dependent types. The language is open in the sense that it is easy to introduce new inductively defined sets. A proof is represented as a mathematical object and a proposition is identified with the set of its proof objects. The basic part of the proof editor can be seen as a proof engine with two basic commands, one which builds an object by replacing a placeholder in an object by a new object, and another one which deletes a part of an object by replacing a sub-object by a placeholder. We show that the validity of the incomplete object is preserved by admissible insertions and deletions.
Article
Full-text available
The first example of a simultaneous inductive-recursive definition in intuitionistic type theory is Martin-Löf's universe à la Tarski. A set U₀ of codes for small sets is generated inductively at the same time as a function T₀, which maps a code to the corresponding small set, is defined by recursion on the way the elements of U₀ are generated. In this paper we argue that there is an underlying general notion of simultaneous inductive-recursive definition which is implicit in Martin-Löf's intuitionistic type theory. We extend previously given schematic formulations of inductive definitions in type theory to encompass a general notion of simultaneous induction-recursion. This enables us to give a unified treatment of several interesting constructions including various universe constructions by Palmgren, Griffor, Rathjen, and Setzer and a constructive version of Aczel's Frege structures. Consistency of a restricted version of the extension is shown by constructing a realisability model in the style of Allen.
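The universe construction mentioned above can be sketched with its two standard formation rules (notation is ours): U₀ is generated inductively while T₀ is defined by recursion on U₀.

```latex
\begin{align*}
  &\widehat{\mathbb{N}} : U_0,
    && T_0(\widehat{\mathbb{N}}) = \mathbb{N},\\
  &\widehat{\Pi}(a,b) : U_0 \ \text{ for } a : U_0,\ b : T_0(a) \to U_0,
    && T_0(\widehat{\Pi}(a,b)) = \Pi\,x{:}T_0(a).\;T_0(b(x)).
\end{align*}
```

The second rule shows the simultaneity: the type of the premise `b` already mentions the function T₀ being defined.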
Article
Full-text available
Boyer and Moore have discussed a function that puts conditional expressions into normal form [1]. It is difficult to prove that this function terminates on all inputs. Three termination proofs are compared: (1) using a measure function, (2) in domain theory using LCF, (3) showing that its recursion relation, defined by the pattern of recursive calls, is well-founded. The last two proofs are essentially the same though conducted in markedly different logical frameworks. An obviously total variant of the normalize function is presented as the ‘computational meaning’ of those two proofs. A related function makes nested recursive calls. The three termination proofs become more complex: termination and correctness must be proved simultaneously. The recursion relation approach seems flexible enough to handle subtle termination proofs where previously domain theory seemed essential.
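The normalization function discussed above can be sketched in its simple, non-nested variant (the term representation and function names are ours, not taken from the paper):

```python
from dataclasses import dataclass

# Conditional expressions: atoms, and If(test, then, other).
@dataclass(frozen=True)
class At:
    name: str

@dataclass(frozen=True)
class If:
    test: object
    then: object
    other: object

def norm(e):
    """Rotate nested tests outward until every test position is atomic."""
    if isinstance(e, At):
        return e
    if isinstance(e.test, At):
        return If(e.test, norm(e.then), norm(e.other))
    # (If(a,b,c) ? y : z)  ==>  (a ? (b ? y : z) : (c ? y : z));
    # a suitable measure on the expression decreases at this call,
    # which is what makes the termination proof non-obvious.
    a, b, c = e.test.test, e.test.then, e.test.other
    return norm(If(a, If(b, e.then, e.other), If(c, e.then, e.other)))
```

For example, `norm(If(If(At('a'), At('b'), At('c')), At('y'), At('z')))` yields `If(At('a'), If(At('b'), At('y'), At('z')), If(At('c'), At('y'), At('z')))`, in which every test is atomic.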
Conference Paper
We offer a new account of recursive definitions for both types and partial functions. The computational requirements of the theory restrict recursive type definitions involving the total function-space constructor (→) to those with only positive occurrences of the defined type. But we show that arbitrary recursive definitions with respect to the partial function-space constructor are sensible. The partial function-space constructor allows us to express reflexive types of Scott's domain theory (as needed to model the lambda calculus) and thereby reconcile parts of domain theory with constructive type theory.
Conference Paper
Functions specified by nested recursions are difficult to define and reason about. We present several ameliorative techniques that use deduction in a classical higher-order logic. First, we discuss how an apparent circular dependency between the proof of nested termination conditions and the definition of the specified function can be avoided. Second, we propose a method that allows the specified function to be defined in the absence of a termination relation. Finally, we show how our techniques extend to nested program schemes, where a termination relation cannot be found until schematic parameters have been filled in. In each of these techniques, suitable induction theorems are automatically derived.
Article
In Martin-Löf's type theory, general recursion is not available. The only iterating constructs are primitive recursion over natural numbers and other inductive sets. The paper describes a way to allow a general recursion operator in type theory (extended with propositions). A proof rule for the new operator is presented. The addition of the new operator will not destroy the property that all well-typed programs terminate. An advantage of the new program construct is that it is possible to separate the termination proof of the program from the proof of other properties.
Article
This paper deals with automated termination analysis for functional programs. Previously developed methods for automated termination proofs of functional programs often fail for algorithms with nested recursion, and they cannot handle algorithms with mutual recursion. We show that termination proofs for nested and mutually recursive algorithms can be performed without having to prove the correctness of the algorithms simultaneously. Using this result, nested and mutually recursive algorithms no longer constitute a special problem, and the existing methods for automated termination analysis can be extended to nested and mutual recursion in a straightforward way. We give some examples of algorithms whose termination can now be proved automatically (including well-known challenge problems such as McCarthy's f_91 function).
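McCarthy's f_91 function, the challenge problem named above, is a one-line nested recursion; the inner call `f91(n + 11)` appears as the argument of the outer call, which is exactly what makes termination and correctness intertwined:

```python
def f91(n: int) -> int:
    """McCarthy's 91 function: a nested recursive call f91(f91(n + 11))."""
    if n > 100:
        return n - 10
    return f91(f91(n + 11))

# Closed form: f91(n) = 91 for all n <= 100, and n - 10 otherwise.
assert all(f91(n) == 91 for n in range(101))
assert f91(150) == 140
```

Proving termination requires knowing the value of the inner call, which is why nested recursion forces termination and partial correctness to be treated together (or, as in the paper, carefully decoupled).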
Conference Paper
A typed framework for working with nonterminating computations is provided. The basic system is the calculus of constructions. It is extended using an original idea proposed by R. Constable and S.F. Smith (2nd Ann. IEEE Conf. on Logic in Comput. Sci., 1987) and implemented in Nuprl. From the computational point of view, an equivalent of the Kleene theorem for partial recursive functions over the integers within an index-free setting is obtained. A larger class of algebraic types is defined. Logical aspects need more examination, but a syntactic method for dealing with partial and total objects, leading to the notion of generic proof, is provided
Article
General recursive algorithms are such that the recursive calls are performed on arguments satisfying no condition that guarantees termination. Hence, there is no direct way of formalising them in type theory. The standard way of handling general recursion in type theory uses a well-founded recursion principle. Unfortunately, this way of formalising general recursive algorithms often produces unnecessarily long and complicated codes. On the other hand, functional programming languages like Haskell impose no restrictions on recursive programs, and then writing general recursive algorithms is straightforward. In addition, functional programs are usually short and self-explanatory. However, the existing frameworks for reasoning about the correctness of Haskell-like programs are weaker than the framework provided by type theory. The goal of this work is to present a method that combines the advantages of both programming styles when writing simple general recursive algorithms....
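As an illustration of the method (our own sketch in Lean-like syntax, not code from the paper), a special-purpose domain predicate for Euclid's gcd characterises exactly the inputs on which the algorithm terminates:

```lean
-- Inductive domain predicate: the inputs on which gcd terminates.
inductive gcdDom : Nat → Nat → Prop
  | base : ∀ m, gcdDom m 0
  | step : ∀ m n, n ≠ 0 → gcdDom n (m % n) → gcdDom m n

-- gcd m n is then defined by structural recursion on a proof of
-- gcdDom m n: the 'base' case returns m, the 'step' case recurses on
-- the sub-proof for (n, m % n). The functional content (the gcd
-- equations) stays cleanly separated from the termination argument.
```

In a full formalisation the predicate must support elimination into data (or be replaced by an accessibility-style definition), a detail this sketch elides.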
Conference Paper
this paper is mostly described by Coquand, Pfenning, and Paulin-Mohring in [14, 4, 11]. General use of well-founded recursion in Martin-Löf's intuitionistic type theory was studied by Paulson in [12], who shows that reduction rules can be obtained for each of several means to construct well-founded relations from previously known well-founded relations. By comparison with Paulson's work, our technique is to obtain reduction rules that are specific to each recursive function. The introduction of well-founded recursion using an accessibility principle as used in this paper was described by Nordström in [9].
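The accessibility principle referred to above can be stated as a single introduction rule: an element is accessible for a relation R precisely when all of its R-predecessors are.

```latex
\frac{\forall y.\; R(y, x) \rightarrow \mathrm{Acc}(R, y)}
     {\mathrm{Acc}(R, x)}
\qquad\qquad
R \text{ is well-founded iff } \forall x.\; \mathrm{Acc}(R, x).
```

Well-founded recursion is then structural recursion on the proof of accessibility, which is the form of the reduction rules discussed in the surrounding text.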
T. Altenkirch, V. Gaspes, B. Nordström, and B. von Sydow. A User's Guide to ALF. Chalmers University of Technology, Sweden, May 1994. Available on the WWW ftp://ftp.cs.chalmers.se/pub/users/alti/alf.ps.Z.
J. Harrison and M. Aagaard, editors. Theorem Proving in Higher Order Logics: 13th International Conference, TPHOLs 2000, volume 1869 of Lecture Notes in Computer Science. Springer-Verlag, 2000.