Rod M. Burstall’s research while affiliated with University of Edinburgh and other places

Publications (41)


Some Fundamental Algebraic Tools for the Semantics of Computation: Part 3: Indexed Categories
  • Article
December 1991 · 16 Reads · 138 Citations
Theoretical Computer Science
Rod M. Burstall · Joseph A. Goguen

This paper presents indexed categories, which model uniformly defined families of categories, and suggests that they are a useful tool for the working computer scientist. An indexed category gives rise to a single flattened category as a disjoint union of its component categories plus some additional morphisms. Similarly, an indexed functor (which is a uniform family of functors between the component categories) induces a flattened functor between the corresponding flattened categories. Under certain assumptions, flattened categories are (co)complete if all their components are, and flattened functors have left adjoints if all their components do. Several examples are given. Although this paper is Part 3 of the series “Some fundamental algebraic tools for the semantics of computation”, it is entirely independent of Parts 1 and 2.
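
To give a concrete feel for the flattening construction, here is a minimal sketch in OCaml (all names are mine, not the paper's): a category is presented by finite lists of objects and morphisms with source and target maps, an indexed category is a function from indices to categories, and flattening tags everything with its index. The additional morphisms between components that the full construction provides are omitted.

    (* a finitely presented category: objects, morphisms, source/target *)
    type ('o, 'm) cat = {
      objects   : 'o list;
      morphisms : 'm list;
      src       : 'm -> 'o;
      tgt       : 'm -> 'o;
    }

    (* an indexed category: a uniformly defined family of categories *)
    type ('i, 'o, 'm) indexed = 'i -> ('o, 'm) cat

    (* flattening: the disjoint union of the component categories, with
       objects and morphisms tagged by their index (the extra morphisms
       between components are omitted in this sketch) *)
    let flatten (ixs : 'i list) (f : ('i, 'o, 'm) indexed)
        : ('i * 'o, 'i * 'm) cat =
      { objects   = List.concat_map
                      (fun i -> List.map (fun o -> (i, o)) (f i).objects) ixs;
        morphisms = List.concat_map
                      (fun i -> List.map (fun m -> (i, m)) (f i).morphisms) ixs;
        src       = (fun (i, m) -> (i, (f i).src m));
        tgt       = (fun (i, m) -> (i, (f i).tgt m)); }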


A Natural Deduction treatment of Operational Semantics.

December 1988 · 10 Reads · 15 Citations
Lecture Notes in Computer Science

We show how Natural Deduction extended with a replacement operator can provide a framework for defining programming languages, a framework which is more expressive than the usual Operational Semantics presentation in that it allows hypothetical premises. This allows us to do without an explicit environment and store. Instead we use the hypothetical premises to make assumptions about the values of variables. We define the extended Natural Deduction logic using the Edinburgh Logical Framework.
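
The flavour of evaluating under assumptions rather than an explicit environment can be sketched in OCaml (my own encoding, not the paper's Logical Framework presentation): the hypothetical premises are value assumptions for variables, and the rule for let introduces and then discharges one.

    (* expressions of a toy language *)
    type exp =
      | Var of string
      | Num of int
      | Add of exp * exp
      | Let of string * exp * exp

    (* a hypothetical premise: "this variable evaluates to this value" *)
    type hyp = string * int

    (* derive a value for e under the hypotheses hyps; there is no
       separate environment or store, only assumptions about variables *)
    let rec eval (hyps : hyp list) : exp -> int = function
      | Var x           -> List.assoc x hyps   (* appeal to a hypothesis *)
      | Num n           -> n
      | Add (a, b)      -> eval hyps a + eval hyps b
      | Let (x, e1, e2) ->
          let v1 = eval hyps e1 in
          (* evaluate the body under the extra hypothesis x = v1,
             discharged once the let has been evaluated *)
          eval ((x, v1) :: hyps) e2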


Pebble, a Kernel Language for Modules and Abstract Data Types

February 1988 · 54 Reads · 97 Citations
Information and Computation

[An earlier version of this paper appeared in Lect. Notes Comput. Sci. 173, 1-50 (1984; Zbl 0552.68009).] A small set of constructs can simulate a wide variety of apparently distinct features in modern programming languages. Using a kernel language called Pebble based on the typed lambda calculus with bindings, declarations, dependent types, and types as compile time values, we show how to build modules, interfaces and implementations, abstract data types, generic types, recursive types, and unions. Pebble has a concise operational semantics given by inference rules.
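
As a rough sketch of the kind of kernel described here (my own AST, not Pebble's actual syntax), one can make types, bindings and declarations ordinary values of a typed lambda calculus:

    (* a Pebble-like kernel AST: types are compile-time values, and
       bindings/declarations are first-class, so modules, interfaces and
       abstract data types can all be built from the same few constructs *)
    type exp =
      | Var  of string
      | Type                          (* the type of types *)
      | Pi   of string * exp * exp    (* dependent function type (x:A) -> B *)
      | Lam  of string * exp * exp    (* typed abstraction fun (x:A) -> e *)
      | App  of exp * exp
      | Bind of string * exp          (* a binding x ~ e, itself a value *)
      | Decl of (string * exp) list   (* a declaration list: a module body *)
      | Dot  of exp * string          (* select a component of a module *)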



Inductively Defined Functions in Functional Programming Languages.

April 1987 · 21 Reads · 20 Citations
Journal of Computer and System Sciences

This paper proposes a notation for defining functions or procedures in such a way that their termination is guaranteed by the scope rules. It uses an extension of case expressions. Suggested uses include programming languages and logical languages; an application is also given to the problem of proving inequations from initial algebra specifications.
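
The idea can be imitated in OCaml (names mine): if the only recursion permitted is the iteration scheme of the datatype itself, every definable function terminates by construction.

    type nat = Zero | Succ of nat

    (* the iteration scheme for nat: the recursive call occurs exactly on
       the immediate subterm, so fold_nat always terminates *)
    let rec fold_nat (z : 'a) (s : 'a -> 'a) : nat -> 'a = function
      | Zero   -> z
      | Succ n -> s (fold_nat z s n)

    (* functions written through fold_nat inherit that guarantee *)
    let add m n  = fold_nat n (fun k -> Succ k) m
    let double n = fold_nat Zero (fun k -> Succ (Succ k)) n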



Computing with Categories.

January 1985 · 13 Reads · 1 Citation
Lecture Notes in Computer Science

This paper shows how the constructions involved in category theory may be turned into computer programs. Key issues are the computational representation of categories and of universal properties. The approach is illustrated with a program for computing finite limits of an arbitrary category; this is written in the functional programming language ML. We have developed such programs for a number of categorical constructions.
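
The paper's programs are in ML; the representation question it raises can be sketched along similar lines in OCaml (field names mine): a category becomes a record of its identity, composition, source and target operations, which programs can then manipulate directly.

    (* a category as data: identities and composition are functions *)
    type ('o, 'm) category = {
      ident   : 'o -> 'm;           (* identity morphism on an object *)
      compose : 'm -> 'm -> 'm;     (* compose g f = "g after f" *)
      src     : 'm -> 'o;
      tgt     : 'm -> 'o;
    }

    (* example: finite sets as int lists, morphisms given by their graph *)
    type set_mor = { dom : int list; map : int -> int; cod : int list }

    let fin_set : (int list, set_mor) category = {
      ident   = (fun s -> { dom = s; map = (fun x -> x); cod = s });
      compose = (fun g f ->
                   { dom = f.dom; map = (fun x -> g.map (f.map x)); cod = g.cod });
      src     = (fun f -> f.dom);
      tgt     = (fun f -> f.cod);
    }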


A Study in the Foundations of Programming Methodology: Specifications, Institutions, Charters and Parchments.

January 1985 · 15 Reads · 99 Citations
Lecture Notes in Computer Science

The theory of institutions formalizes the intuitive notion of a "logical system." Institutions were introduced (1) to support as much computer science as possible independently of the underlying logical system, (2) to facilitate the transfer of results (and artifacts such as theorem provers) from one logical system to another, and (3) to permit combining a number of different logical systems. In particular, programming-in-the-large (in the style of the Clear specification language) is available for any specification or "logical" programming language based upon a suitable institution. Available features include generic modules, module hierarchies, "data constraints" (for data abstraction), and "multiplex" institutions (for combining multiple logical systems). The basic components of an institution are: a category of signatures (which generally provide symbols for constructing sentences); a set (or category) of Σ-sentences for each signature Σ; a category (or set) of Σ-models for each Σ; and a Σ-satisfaction relation, between Σ-sentences and Σ-models, for each Σ. The intuition of the basic axiom for institutions is that truth (i.e., satisfaction) is invariant under change of notation. This paper enriches institutions with sentence morphisms to model proofs, and uses this to explicate the notion of a logical programming language. To ease constructing institutions, and to clarify various notions, this paper introduces two further concepts. A charter consists of an adjunction, a "base" functor, and a "ground" object; we show that "chartering" is a convenient way to "found" institutions. Parchments provide a notion of sentential syntax, and a simple way to "write" charters and thus get institutions. Parchments capture the insight that the syntax of logic is an initial algebra. Everything is illustrated with the many-sorted equational institution. Parchments also explicate the sense of finitude that is appropriate for specifications. Finally, we introduce generalized institutions, which generalize both institutions and Mayoh's "galleries", and we introduce corresponding generalized charters and parchments.
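
A minimal sketch in OCaml of those four components, written as a module signature (this fixes one type per component and elides all the functoriality conditions; the names are mine):

    module type INSTITUTION = sig
      type signature                  (* objects of the category of signatures *)
      type sig_mor                    (* signature morphisms *)
      type sentence                   (* Sigma-sentences *)
      type model                      (* Sigma-models *)

      val translate : sig_mor -> sentence -> sentence  (* sentence translation *)
      val reduct    : sig_mor -> model -> model        (* model reduction *)
      val satisfies : model -> sentence -> bool        (* M |= phi *)
    end

    (* the basic axiom, "truth is invariant under change of notation":
       for every signature morphism f, model m and sentence phi,
         satisfies (reduct f m) phi = satisfies m (translate f phi) *)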



Some Fundamental Algebraic Tools for the Semantics of Computation. Part 1: Comma Categories, Colimits, Signatures and Theories.

December 1984 · 17 Reads · 95 Citations
Theoretical Computer Science

This paper develops a number of fundamental tools from category theory and applies them to problems in computation. The tools include algebraic theories, colimits, comma categories and two-dimensional categories. The applications concern making program specifications understandable by expressing them as interconnections of smaller ‘mind-sized’ specifications in a variety of ways, as in our language Clear. The link between these tools and applications is the idea of using algebraic theories as program specifications. To carry out this programme requires developing a formal calculus of operations for constructing, modifying and interconnecting theories. These operations include: constructing free theories, combining given theories, deriving new computational functions from old, abstracting (or ‘hiding’) parts of theories, inductively completing a theory, and applying a theory-valued procedure to an actual theory. Because a number of different notions of theory are relevant to computation, this paper also sketches an axiomatically based calculus of operations applicable to any notion of theory which satisfies certain axioms. The paper also presents a number of sample calculations illustrating how the various tools can be used, and proves some general theorems which justify such calculations. Part 1 develops comma categories, colimits, signatures and sorted theories, while Part 2 (to appear in Theoretical Computer Science 31 (3)) introduces signed theories, abstract theories, and further applications to specification.
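
One of those operations, combining given theories over a shared part, can be sketched in OCaml under a drastic simplification (entirely mine): take signatures to be mere sets of operation names, so the pushout of two inclusions is a union that identifies the shared names.

    module S = Set.Make (String)

    (* combine th1 and th2 along their shared subtheory *)
    let combine ~(shared : S.t) (th1 : S.t) (th2 : S.t) : S.t =
      assert (S.subset shared th1 && S.subset shared th2);
      (* for sets and inclusions the pushout is just the union; 'shared'
         records the interface along which the two theories are glued *)
      S.union th1 th2

    let () =
      let shared = S.of_list ["elem"] in
      let nat    = S.of_list ["elem"; "zero"; "succ"] in
      let lists  = S.of_list ["elem"; "nil"; "cons"] in
      assert (S.cardinal (combine ~shared nat lists) = 5)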


Citations (36)


... We should mention as the sources of the descriptive view on image analysis and consequently of DA the works of 1970–1990 by F. Ambler, G. Barrow, R. Burstall [1], T. ...

Reference:

ICPRAI 2018 SI: Descriptive Image Analysis. Foundations and Descriptive Image Algebras
Some Techniques for Recognising Structures in Pictures
  • Citing Chapter
  • December 1972

... The mechanism for referring to non-local variables which is built into the BCPL language is inadequate to deal naturally with this situation. (So, for that matter, are those in ALGOL 60, ALGOL 68 and PL/1; two languages which are sufficiently powerful are PAL (Evans, 1968) and POP-2 (Burstall, Collins, and Popplestone, 1968).) This means that we must make special provisions to preserve the information ourselves, which we do by keeping it all in the vector S. The length of S may therefore vary from stream to stream, but the first few elements are always reserved for the basic functions and routines. ...

POP-2 Papers
  • Citing Article
  • February 1970

The Mathematical Gazette

... The preceding section described an example of arity-generic programming. We consider now a type-generic task: proving the no-confusion property [4] of datatype constructors (that is, they are injective and disjoint) for a closed universe of strictly positive types. For defining a universe of datatypes, the idea (described in more detail by Dagand and McBride [9]) is to define a type whose elements are interpreted as codes for datatype signatures and combine this with a type-level least fixed-point operator. ...

Algebras, Theories and Freeness: An Introduction for Computer Scientists
  • Citing Article
  • January 1982
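
The excerpt's recipe of codes plus a least fixed point can be imitated, untyped, in OCaml (names mine; the development it refers to is dependently typed):

    (* codes for strictly positive signatures; Rec marks the recursive slot *)
    type code = U | Sum of code * code | Prod of code * code | Rec

    (* raw values; VIn wraps an element of the least fixed point *)
    type value =
      | VU
      | VL of value | VR of value
      | VP of value * value
      | VIn of value

    (* does v conform to code c, where Rec refers back to the root code? *)
    let rec conforms root c v = match c, v with
      | U,           VU        -> true
      | Sum (a, _),  VL v      -> conforms root a v
      | Sum (_, b),  VR v      -> conforms root b v
      | Prod (a, b), VP (x, y) -> conforms root a x && conforms root b y
      | Rec,         VIn v     -> conforms root root v
      | _                      -> false

    (* example: nat = mu X. 1 + X, with zero and succ as left/right *)
    let nat  = Sum (U, Rec)
    let zero = VIn (VL VU)
    let one  = VIn (VR zero)
    let ()   = assert (conforms nat Rec zero && conforms nat Rec one)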

... The Poplog system developed at Sussex University in the early 1980s was an implementation of the Pop-11 language (a descendant of Pop-2 [Burstall and Popplestone 1968]) that included support for Prolog (hence the name). Subsequently, support for Common Lisp and later Standard ML was added to the implementation; the SML implementation was owed to Robert Duncan and Simon Nichols and corresponded roughly to the 1990 version of the language. ...

POP-2: Reference manual
  • Citing Article
  • January 1968

... Following this general approach, the first part of the paper develops the notion of derived structures and models. Burstall and Goguen [20,48] introduced the notion of derivor, implemented as part of the mechanisms available in the programming language Clear [20] to synthesize operations out from a set of basic (and available) ones [47,48]. ...

Some fundamental algebraic tools for the semantics of computation : Part 2: Signed and abstract theories
  • Citing Article
  • December 1984

Theoretical Computer Science

... The graph correspondence problem between G_a and G_t then reduces to finding the maximum clique of the correspondence graph C [2]. This allows us to obtain matches between the 2D tree positions of the aerial and terrestrial clouds, used to solve for the relative planar pose using Umeyama's method [27]. ...

Subgraph isomorphism, matching relational structures and maximal cliques
  • Citing Article
  • January 1976

Information Processing Letters
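
For flavour, the clique step mentioned in the excerpt above can be sketched with a plain Bron-Kerbosch enumeration in OCaml (no pivoting, adjacency given as a function; a generic illustration, not the cited paper's algorithm):

    module IS = Set.Make (Int)

    (* enumerate maximal cliques, keeping the largest one found in best *)
    let rec bron_kerbosch adj r p x best =
      if IS.is_empty p && IS.is_empty x then begin
        if IS.cardinal r > IS.cardinal !best then best := r
      end else
        ignore (IS.fold (fun v (p, x) ->
          let nv = adj v in
          bron_kerbosch adj (IS.add v r) (IS.inter p nv) (IS.inter x nv) best;
          (IS.remove v p, IS.add v x)) p (p, x))

    (* maximum clique of a graph on vertices 0..n-1 with adjacency adj *)
    let max_clique n adj =
      let best = ref IS.empty in
      bron_kerbosch adj IS.empty (IS.of_list (List.init n (fun i -> i)))
        IS.empty best;
      !best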

... Freddy II was a "hand-eye" robot that could assemble toy wooden models from a heap of pieces. It used vision to identify and locate the parts, and was able to rearrange them to enable identification when they were obscured by other parts [1]. Given the state-of-the-art at the time, this required not only building the robot, but also designing and building the vision system, and a programming environment for controlling the various subsystems. ...

A versatile computer-controlled assembly system
  • Citing Article
  • June 1975

Artificial Intelligence

A.P. Ambler · C.M. Brown · [...] · R.J. Popplestone

... • Mathematical induction • Equational logic • CASL [326] • OBJ [327] • Clear [328] • Larch [329] • ACT-ONE [330] • LOTOS [331]. Declarative modeling: logic-based languages, functional languages, rewriting languages, and languages for defining formal semantics. ...

An informal introduction to specifications using clear
  • Citing Article
  • January 1980