Article

Programming in POP-2

... p.add( "public class " + joinClassName ); p.indent(); // Generate the 'join' method. p.add( "public static Object join( Object arg1, Object arg2 ) " ); ...
... Linguistic reflective systems are categorised as to whether the reflection and program generation occurs during compilation or at run-time. Compile-time reflection in an untyped language essentially gives a macro system, such as in Scheme [21] or POP-2 [22]. Strongly typed examples include ADABTPL [23], TRPL [1] and CRML [24]; these allow source code generating functions to be executed at compile-time. ...
... Finally, run-time linguistic reflection also appears in both untyped and typed systems. Untyped examples include the eval functions of Lisp [16] and the popval function of POP-2 [22]. Strongly typed run-time linguistic reflection first appeared in PS-algol [7], and has also been used in Napier88 [8] and Java, the subject of this paper. ...
Preprint
Reflective systems allow their own structures to be altered from within. Here we are concerned with a style of reflection, called linguistic reflection, which is the ability of a running program to generate new program fragments and to integrate these into its own execution. In particular we describe how this kind of reflection may be provided in the compiler-based, strongly typed object-oriented programming language Java. The advantages of the programming technique include attaining high levels of genericity and accommodating system evolution. These advantages are illustrated by an example taken from persistent programming which shows how linguistic reflection allows functionality (program code) to be generated on demand (Just-In-Time) from a generic specification and integrated into the evolving running program. The technique is evaluated against alternative implementation approaches with respect to efficiency, safety and ease of use.
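The on-demand code generation this abstract describes can be illustrated outside Java. The following is a minimal sketch in Python of run-time linguistic reflection, assuming a hypothetical record-joining specification; the names generate_join and join are ours, not the paper's mechanism:

```python
# Hypothetical sketch (not the paper's Java technique): generate source text
# from a generic specification at run-time, compile it, and bind the result
# into the running program.

def generate_join(field_names):
    # Build source for a function that keeps the listed fields of the left
    # record and merges in the right record. Field list is the "specification".
    body = ", ".join("'%s': left['%s']" % (f, f) for f in field_names)
    src = (
        "def join(left, right):\n"
        "    out = {%s}\n"
        "    out.update(right)\n"
        "    return out\n" % body
    )
    namespace = {}
    exec(compile(src, "<generated>", "exec"), namespace)  # integrate new code
    return namespace["join"]

# The generated function behaves like hand-written code from here on.
join = generate_join(["id", "name"])
```

The generated fragment is ordinary code once bound, which is the point the abstract makes about integrating new fragments into the program's own execution.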
... The interpretation is that the string on the left hand side gives rise to the one on the right. [Burstall et al, 1971]. ...
... Burstall and R. J. Popplestone [Burstall et al, 1971]. ...
Thesis
The program described learns from experience to improve its performance in playing a game. The main objectives of the project are that the system should observe the following principles: 1) The program should not rely on any special evaluation functions, which would embody domain-specific information. 2) Initial knowledge of the domain should be minimal, and further knowledge gained should be assimilated in terms of prior knowledge. 3) The system of representation employed should as far as possible be independent of the domain, again avoiding the incorporation of domain-specific information. In customary Artificial Intelligence terms, the program is referred to as existing in a domain or environment. The model has a goal within this domain and has available certain actions which it may take in order to achieve its goal. The goal is represented as a Structure. This term will be used throughout to denote a set of objects from the domain, constrained by various domain-pertinent relationships. The actions, goals and objects are the initial known facts of the environment. The program has an innate ability to plan simple sequences of actions to achieve its goals. Inevitably, these plans do not take into account enough of the nature of the domain and prove inadequate. In such events the descriptive abilities of the program are invoked to correct the deficiency, and the program's model of its environment is enriched.
... More recently, Lopez [131] has done the same for a variety of statistical machine translation systems, and the GENPAR C++ class library [32] aids implementation of some such systems in a generic way. For example, one can transform an arbitrary weighted context-free grammar into Chomsky Normal Form for use with Figure 11a, or coarsen a grammar for use as an A* heuristic [115]. For example, the non-range-restricted rule rewrite(X/Z,X/Y,Y/Z). encodes the infinitely many "composition" rules of combinatory categorial grammar [178], in which a complex nonterminal such as s/(pp/np) denotes an incomplete sentence (s) that is missing an incomplete prepositional phrase (pp) that is in turn missing a noun phrase (np). ...
... Applications include business intelligence (e.g., LogicBlox [127]); stream processing for algorithmic equities trading (e.g., DBToaster [5]); user interfaces (e.g., Dynasty [72] and Fruit [53]); declarative animation (e.g., Fran [76]); query planners and optimizers (see §5.3); and even (incremental) compilers [33]. ...
Article
Modern statistical AI systems are quite large and complex; this interferes with research, development, and education. We point out that most of the computation involves database-like queries and updates on complex views of the data. Specifically, recursive queries look up and aggregate relevant or potentially relevant values. If the results of these queries are memoized for reuse, the memos may need to be updated through change propagation. We propose a declarative language, which generalizes Datalog, to support this work in a generic way. Through examples, we show that a broad spectrum of AI algorithms can be concisely captured by writing down systems of equations in our notation. Many strategies could be used to actually solve those systems. Our examples motivate certain extensions to Datalog, which are connected to functional and object-oriented programming paradigms.
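The recursive queries and change propagation this abstract describes can be sketched concretely. The toy Python solver below is our own illustration, not the authors' Dyna notation: it treats single-source shortest path as a system of min-plus equations and propagates updates to the memo table until a fixpoint is reached:

```python
# Toy fixpoint solver (our illustration): path(v) = min over edges (u, v, w)
# of path(u) + w, with path(start) = 0. Memoized values are updated by
# change propagation until no equation's value improves.

import math

def shortest_paths(edges, start):
    nodes = {start} | {u for u, v, w in edges} | {v for u, v, w in edges}
    memo = {v: math.inf for v in nodes}   # memoized query results
    memo[start] = 0.0
    changed = True
    while changed:                        # iterate to a fixpoint
        changed = False
        for u, v, w in edges:
            if memo[u] + w < memo[v]:
                memo[v] = memo[u] + w     # propagate the improved value
                changed = True
    return memo

dist = shortest_paths([("a", "b", 1.0), ("b", "c", 2.0), ("a", "c", 5.0)], "a")
```

Real Datalog engines solve such systems far more cleverly (semi-naive evaluation, prioritized agendas), but the equations-plus-fixpoint shape is the same.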
... In the early 1970s, Landin's streams inspired the feature called dynamic lists in POP-2 [19]. A dynamic list is created by applying a primitive function fntolist to a generator function, which produces a new value each time it is called (normally depending on modified state). ...
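The dynamic-list mechanism just described can be sketched in modern terms. Assuming only what the text states (a primitive that wraps a generator function producing a new value per call), this Python analogue is our own illustration, not POP-2 code:

```python
# Sketch of POP-2 dynamic lists (our analogue of the fntolist primitive named
# in the text): elements are produced on demand by calling the generator
# function, and cached so the list can be examined repeatedly.

def fntolist(gen):
    cache = []
    def item(i):
        while len(cache) <= i:
            cache.append(gen())   # call the generator only when needed
        return cache[i]
    return item

def make_counter():
    state = {"n": 0}
    def gen():
        state["n"] += 1           # "modified state", as in the text
        return state["n"]
    return gen

nums = fntolist(make_counter())   # a lazy, cached "list" 1, 2, 3, ...
```

Indexing past the frontier forces just enough calls to the generator; earlier elements are served from the cache, so the stateful generator is never re-run.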
... I had become interested in coroutines and related "nonstandard" control structures, and by the winter of 1976 I was experimenting with implementing coroutines in the POP-10 language [19], which was the main language used in the School of AI at Edinburgh at that time. POP-10 had provided a kind of delimited first-class continuation, with operations barrierapply (a control delimiter), appstate (call/cc), and reinstate (throw). I told Gilles about my experiments and he suggested that we try implementing the language from his 1974 paper. ...
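The POP-10 operations named above (barrierapply, appstate, reinstate) predate today's vocabulary; purely as a loose modern analogy, our own and not a rendering of POP-10, Python generators give the same coroutine-style back-and-forth transfer of control:

```python
# Loose analogy only (ours): a generator suspends itself at yield and is
# reinstated by its consumer, giving the producer/consumer coroutine pattern
# the text discusses.

def producer():
    for x in [1, 2, 3]:
        yield x          # suspend here; control returns to the consumer

def consume(gen):
    total = 0
    for x in gen:        # each iteration resumes the suspended producer
        total += x
    return total
```

Generators are weaker than first-class continuations (they can only resume where they suspended), so this illustrates the control transfer, not the full POP-10 mechanism.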
Article
Full-text available
We start by providing some historical context, discussing earlier data flow models, coroutines, and Landin’s streams. We then summarize the development of Kahn networks through the series of papers Gilles published in the early to mid 1970s. We briefly review the related Kahn-Plotkin development of concrete data types. Next we consider the roughly contemporary development of lazy functional languages, and finally we discuss a few samples of the vast quantity of later work that built on or exploited these ideas. Our goal is to illuminate this particular aspect of Gilles Kahn’s scientific contributions by exploring its development in some detail and relating it to both earlier and contemporary developments.
... [Moore, 1973, p. 68]. For logic programming and Prolog see [Kowalski, 1974; Clocksin & Mellish, 2003]. Cf. [Burstall et al., 1971]. ...
... [Bledsoe et al., 1972]. Cf. [Burstall et al., 1971]. Here is the actual wording of the timing result found on page 171f. of: ...
Article
Full-text available
We review the history of the automation of mathematical induction
... Both compile-time and run-time reflection have been provided in previous languages. Compile-time reflection appears in the macro facilities of Scheme [Rees & Clinger, 1986] and POP-2 [Burstall et al., 1971]. Run-time reflection appears in the eval functions of Lisp [McCarthy et al., 1962] and SNOBOL4 [Griswold et al., 1971] and the popval function of POP-2 [Burstall et al., 1971]. ...
Article
Full-text available
Persistent Application Systems (PASs) are of increasing social and economic importance. They have the potential to be long-lived, concurrently accessed and consist of large bodies of data and programs. Typical examples of PASs are CAD/CAM systems, office automation, CASE tools, software engineering environments and patient-care support systems in hospitals. Orthogonally persistent object systems are intended to provide improved support for the design, construction, maintenance and operation of PASs. The persistence abstraction allows the creation and manipulation of data in a manner that is independent of its lifetime thereby integrating the database view of information with the programming language view. This yields a number of advantages in terms of orthogonal design and programmer productivity which are beneficial for PASs. Design principles have been proposed for persistent systems. By following these principles, languages that provide persistence as a basic abstract...
... Primary reinforcement corresponds to learning with problem solution; secondary reinforcement corresponds to learning on the basis of correlates of success (Doran and Michie, 1966). In this chapter we give a formal description of GT4, the particular Graph Traverser algorithm which has formed the basis of our POP-2 (Burstall, Collins and Popplestone 1971) implementations. ...
Thesis
In this thesis we investigate methods by which GT4, a revised and extended version of the Doran-Michie Graph Traverser, might in the course of its problem-solving activity learn about the domain in which it is searching and thereby improve its performance. We study first the problem of automatically optimizing the parameters of GT4's evaluation function. In particular, we investigate the distance estimator method proposed by Doran and Michie. Using two sliding-block puzzles and the algebraic manipulation problems of Quinlan and Hunt we demonstrate experimentally the feasibility of this method of parameter optimization. An interesting feature of the work is that optimization is implemented by recursive call of GT4, the algorithm acting for this purpose as a pattern search numerical optimizer. A theoretical analysis of several factors affecting the success of the distance estimator method is then carried out and alternative approaches to parameter optimization are proposed and discussed. In Chapter 8 we describe the results of our experiments in automatic operator selection. We investigate, in particular, a promotional scheme for re-ordering Γ, the global operator list used by GT4, so that operators of proven utility are given preference in the order of application. Our results demonstrate that the scheme successfully improves the ordering of a list of 48 Eight-puzzle macro-operators.
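The pattern-search role the abstract assigns to GT4 can be sketched generically. This probe-and-halve search in Python is our own illustration of the optimizer family, not the thesis's algorithm or evaluation function:

```python
# Generic pattern search (our sketch): probe each parameter in both
# directions, keep any improving step, and halve the step size whenever no
# probe improves, until the step falls below a tolerance.

def pattern_search(f, params, step=1.0, tol=1e-3):
    params = list(params)
    best = f(params)
    while step > tol:
        improved = False
        for i in range(len(params)):
            for delta in (step, -step):
                trial = list(params)
                trial[i] += delta
                val = f(trial)
                if val < best:          # accept any improving probe
                    params, best, improved = trial, val, True
        if not improved:
            step /= 2.0                 # refine the pattern
    return params, best

# Hypothetical objective standing in for an evaluation-function error measure.
quad = lambda p: (p[0] - 3.0) ** 2 + (p[1] + 1.0) ** 2
sol, val = pattern_search(quad, [0.0, 0.0])
```

No derivatives are needed, which is why such searches suit tuning parameters whose quality is only measurable by running the search algorithm itself.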
... This is to save time. The program is written in the POP-2 programming language (Burstall, Collins and Popplestone, 1971). ... This reflects our belief that the better-structured the problem the less likely it is that any clause in H0 is generalised by more than one clause in a good explanation. Consequently we believe that the chance of failing, and so wasting a lot of computational effort, grows with the number of previous successes in explaining fi given ei. ...
Thesis
This thesis is concerned with algorithms for generating generalisations from experience. These algorithms are viewed as examples of the general concept of a hypothesis discovery system which, in its turn, is placed in a framework in which it is seen as one component in a multi-stage process which includes stages of hypothesis criticism or justification, data gathering and analysis and prediction. Formal and informal criteria, which should be satisfied by the discovered hypotheses are given. In particular, they should explain experience and be simple. The formal work uses the first-order predicate calculus. These criteria are applied to the case of hypotheses which are generalisations from experience. A formal definition of generalisation from experience, relative to a body of knowledge is developed and several syntactical simplicity measures are defined. This work uses many concepts taken from resolution theory (Robinson, 1965). We develop a set of formal criteria that must be satisfied by any hypothesis generated by an algorithm for producing generalisation from experience. The mathematics of generalisation is developed. In particular, in the case when there is no body of knowledge, it is shown that there is always a least general generalisation of any two clauses, in the generalisation ordering. (In resolution theory, a clause is an abbreviation for a disjunction of literals.) This least general generalisation is effectively obtainable. Some lattices induced by the generalisation ordering, in the case where there is no body of knowledge, are investigated. The formal set of criteria is investigated. It is shown that for a certain simplicity measure, and under the assumption that there is no body of knowledge, there always exist hypotheses which satisfy them. Generally, however, there is no algorithm which, given the sentences describing experience, will produce as output a hypothesis satisfying the formal criteria.
These results persist for a wide range of other simplicity measures. However several useful cases for which algorithms are available are described, as are some general properties of the set of hypotheses which satisfy the criteria. Some connections with philosophy are discussed. It is shown that, with sufficiently large experience, in some cases, any hypothesis which satisfies the formal criteria is acceptable in the sense of Hintikka and Hilpinen (1966). The role of simplicity is further discussed. Some practical difficulties which arise because of Goodman's (1965) "grue" paradox of confirmation theory are presented. A variant of the formal criteria suggested by the work of Meltzer (1970) is discussed. This allows an effective method to be developed when this was not possible before. However, the possibility is countenanced that inconsistent hypotheses might be proposed by the discovery algorithm. The positive results on the existence of hypotheses satisfying the formal criteria are extended to include some simple types of knowledge. It is shown that they cannot be extended much further without changing the underlying simplicity ordering. A program which implements one of the decidable cases is described. It is used to find definitions in the game of noughts and crosses and in family relationships. An abstract study is made of the progression of hypothesis discovery methods through time. Some possible and some impossible behaviours of such methods are demonstrated. This work is an extension of that of Gold (1967) and Feldman (1970). The results are applied to the case of machines that discover generalisations. They are found to be markedly sensitive to the underlying simplicity ordering employed.
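The "effectively obtainable" least general generalisation mentioned above has a compact algorithmic core, Plotkin-style anti-unification. The Python sketch over tuple-encoded terms below is our own illustration of that idea, not the thesis's program:

```python
# Anti-unification sketch (ours): terms are nested tuples ('f', arg1, ...)
# with strings as constants. Differing subterm pairs are replaced by
# variables, the same pair always getting the same variable.

def lgg(t1, t2, table=None):
    if table is None:
        table = {}
    if t1 == t2:
        return t1
    if (isinstance(t1, tuple) and isinstance(t2, tuple)
            and t1[0] == t2[0] and len(t1) == len(t2)):
        # Same function symbol and arity: generalise argument-wise.
        return (t1[0],) + tuple(lgg(a, b, table) for a, b in zip(t1[1:], t2[1:]))
    key = (t1, t2)
    if key not in table:
        table[key] = "X%d" % len(table)   # fresh variable per differing pair
    return table[key]

g = lgg(("f", "a", ("g", "a")), ("f", "b", ("g", "b")))
```

Note how the repeated a/b mismatch yields the same variable twice, so the result is the least general term subsuming both inputs, not merely some common generalisation.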
... The program is written in POP-2 (Burstall, Popplestone and Collins, 1971) as implemented at the Department of Artificial ...
Thesis
Recent work in artificial intelligence has developed a number of techniques which are particularly appropriate for constructing a model of the process of understanding English sentences. These methods are used here in the definition of a framework for linguistic description, called "computational grammar". This framework is employed to explore the details of the operations involved in transforming an English sentence into a general semantic representation. Computational grammar includes both "syntactic" and "semantic" constructs, in order to clarify the interactions between all the various kinds of information, and treats the sentence-analysis process as having a semantic goal which may require syntactic means to achieve it. The sentence-analyser is based on the concept of an "augmented transition network grammar", modified to minimise unwanted top-down processing and unnecessary embedding. The analyser does not build a purely syntactic structure for a sentence, but the semantic rules operate hierarchically in a way which reflects the traditional tree structure. The processing operations are simplified by using temporary storage to postpone premature decisions or to conflate different options. The computational grammar framework has been applied to a few areas of English, including relative clauses, referring expressions, verb phrases and tense. A computer program ("MCHINE") has been written which implements the constructs of computational grammar and some of the linguistic descriptions of English. A number of sentences have been successfully processed by the program, which can carry on a simple dialogue as well as building semantic representations for isolated sentences.
... All of the files in the program have been included; thus all the functions mentioned in the files will be defined, except for those belonging to the POP-2 system. Descriptions of these system functions are given in "Programming in POP-2" (Burstall et al., 1972). ...
Thesis
This paper is addressed to the problem of how it is possible to conduct coherent, purposeful conversations. It describes a computer model of a conversation between two robots, each robot being represented by a section of program. The conversation is conducted in a small subset of English, and is a mixed-initiative dialogue which can involve interruptions and the nesting of one segment of dialogue in another. The conversation is meant to arise naturally from a well defined setting, so that it is clear whether or not the robots are saying appropriate things. They are placed in a simple world of a few objects, and co-operate in order to achieve a practical goal in this world. Their conversation arises out of this common aim; they have to agree on a plan, exchange information, discuss the consequences of their actions, and so on. In previous language-using programs, the conversation has been conducted by a robot and a human operator, rather than by two robots. In these systems, it is almost always the human operator who takes the initiative and determines the overall structure of the dialogue, and the processes by which he does so are hidden away in his mind. The aim of our program is to make these processes totally explicit, and it is for this reason that we have used two robots and avoided human participation. Thus the main focus of interest is not the structuring of individual utterances, but the higher-level organisation of the dialogue, and how the dialogue is related to the private thoughts which underlie it. The program has two kinds of procedure, which we call ROUTINES and GAMES, the Games being used to conduct sections of conversation and the Routines to conduct the underlying thoughts. These procedures can call each other in the normal way. Thus the occurrence of a section of dialogue will be caused by the call of a Game by a Routine; and when the section of dialogue ends, the Game will exit, returning control to the Routine which called it. 
There are several Games, each corresponding to a common conversational pattern, such as a question and its answer, or a plan suggestion and the response to it. The Games determine what can be said, who will say it, how each remark will be analysed, and how it will be responded to. They are thus joint procedures, in which the instructions are divided up between the robots. When a section of dialogue occurs, the relevant Game will be loaded in the minds of both robots, but they will have adopted different roles in the Game, and will consequently perform different instructions and make different utterances.
... The expeditious implementation on a computer of the symbolic manipulation described in this paper obviously requires a language in which it is easy to implement a range of data-types and with "heap" rather than stack storage control. In fact we use POP-2 (see [2]). Much of the algebraic manipulation is not specified in POP-2, however, but in terms of productions which are input to an Algebra System written in POP-2. ...
Article
A program has been developed which takes a specification of a set of bodies and of spatial relations that are to hold between them in some goal state, and produces expressions denoting the positions of the bodies in the goal state together with residual equations linking the variables in these expressions.
... The system separates the search across the space of world situations (regarded as a graph whose nodes are situations and whose arcs are operator applications) from the question answering about a particular situation. INTERPLAN is an operational program written in POP-2 (Burstall, Collins and Popplestone, 1971). The HBASE (Barrow, 1975) data base system is used to store situations (as CONTEXTS) and the facts known about each particular situation (as assertions). ...
Article
Full-text available
This thesis describes a class of problems in which interactions occur when plans to achieve members of a set of simultaneous goals are concatenated in the hope of achieving the whole goal. They will be termed "interaction problems". Several well known problems fall into this class. Swapping the values of two computer registers is a typical example. A very simple 3 block problem is used to illustrate the interaction difficulty. It is used to describe how a simple method can be employed to derive enough information from an interaction which has occurred to allow problem solving to proceed effectively. The method used to detect interactions and derive information from them, allowing problem solving to be re-directed, relies on an analysis of the goal and subgoal structure being considered by the problem solver. This goal structure will be called the "approach" taken by the system. It specifies the order in which individual goals are being attempted and any precedence relationships between them (say because one goal is a precondition of an action to achieve another). We argue that the goal structure of a problem contains information which is simpler and more meaningful than the actual plan (sequence of actions) being considered. We then show how an analysis of the goal structure of a problem, and the correction of such a structure in the light of any interaction, can direct the search towards a successful solution. Interaction problems pose particular difficulties for most current problem solvers because they achieve each part of a composite goal independently and assume that the resulting plans can be concatenated to achieve the overall goal. This assumption is beneficial in that it can drastically reduce the search necessary in many problems. However, it does restrict the range of problems which can be tackled. 
The problem solver, INTERPLAN, to be described as a result of this investigation, also assumes that subgoals can be solved independently, but when an interaction is detected it performs an analysis of the goal structure of the problem to re-direct the search. INTERPLAN is an efficient system which allows the class of interaction problems to be coped with. INTERPLAN uses a data structure called a "ticklist" as the basis of its mechanism for keeping track of the search it performs. The ticklist allows a very simple method to be employed for detecting and correcting for interactions by providing a summary of the goal structure of the problem being tried.
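The register-swap example the abstract opens with makes the interaction problem easy to demonstrate. The mini-language below (dst := src actions over a state dictionary) is our own toy, not INTERPLAN's representation:

```python
# Toy illustration (ours) of an interaction problem: plans that achieve each
# goal independently break when concatenated, because the first assignment
# clobbers a value the second still needs.

def run(plan, state):
    for dst, src in plan:            # each action: dst := src
        state[dst] = state[src]
    return state

goal = {"A": 2, "B": 1}              # swap the initial contents of A and B

naive = [("A", "B"), ("B", "A")]     # concatenation of two one-goal plans
fixed = [("T", "A"), ("A", "B"), ("B", "T")]  # re-ordered, via a temporary

after_naive = run(naive, {"A": 1, "B": 2})
after_fixed = run(fixed, {"A": 1, "B": 2})
```

The naive concatenation achieves the second goal while destroying the first, exactly the kind of interaction whose detection lets a planner re-direct its search over goal orderings rather than over raw action sequences.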
... Untyped linguistic reflection is present in Lisp [MAE+62] and POP-2 [BCP71]. Strongly typed linguistic reflection is supported in the persistent languages PS-algol [PS88] and Napier88 [MBC+94], while TRPL [She90] and CRML [HS94] are languages in which the reflection takes place during program compilation. ...
Article
Reflection has been used to address many different problem areas, and the term reflection has itself been used to describe several distinct processes. This paper identifies three simple operations, generation, raising and dynamic rebinding, which may be composed to yield several varieties of reflection. These can be used to allow a self-contained programming system to evolve, through the incorporation of new behaviour into either the application programs or the interpreter which controls their execution. Reflection is a powerful mechanism and potentially dangerous. Used in the context of persistent programming systems, safety is an important consideration: the integrity of large amounts of data may be at stake. This has led to the use of type checking in conjunction with reflection in such systems to provide some guarantees of safety. The paper describes the nature of reflection in persistent systems and identifies some example applications.
... The Dyna version of continuous queries is discussed in §4.6 below. Example applications include business intelligence (e.g., LogicBlox [127]); stream processing for algorithmic equities trading (e.g., DBToaster [5]); user interfaces (e.g., Dynasty [70] and Fruit [53]); declarative animation (e.g., Fran [74]); query planners and optimizers (see §5.3); and even (incremental) compilers [33]. In an AI system—for example, medical decision support—sensors may continuously gather information from the world, users may state new facts or needs, and information integration may keep track of many large, dynamic datasets at other locations. ...
Conference Paper
Modern statistical AI systems are quite large and complex; this interferes with research, development, and education. We point out that most of the computation involves database-like queries and updates on complex views of the data. Specifically, recursive queries look up and aggregate relevant or potentially relevant values. If the results of these queries are memoized for reuse, the memos may need to be updated through change propagation. We propose a declarative language, which generalizes Datalog, to support this work in a generic way. Through examples, we show that a broad spectrum of AI algorithms can be concisely captured by writing down systems of equations in our notation. Many strategies could be used to actually solve those systems. Our examples motivate certain extensions to Datalog, which are connected to functional and object-oriented programming paradigms.
... Thus CPL was a legitimate ancestor of the world's most influential programming language, used in implementing many other languages and operating systems. CPL had another less eminent descendant, also influenced by LISP and ISWIM, namely the POP-2 language developed at Edinburgh University by Robin Popplestone and myself [5, 6]. Robin came to Edinburgh in 1965 with a LISP influenced language which we renamed POP-1 (he called it COWSEL which sounded to us too rural for a programming language). ...
Article
This paper forms the substance of a course of lectures given at the International Summer School in Computer Programming at Copenhagen in August, 1967. The lectures were originally given from notes and the paper was written after the course was finished. ...
Conference Paper
Full-text available
The ML family of strict functional languages, which includes F#, OCaml, and Standard ML, evolved from the Meta Language of the LCF theorem proving system developed by Robin Milner and his research group at the University of Edinburgh in the 1970s. This paper focuses on the history of Standard ML, which plays a central rôle in this family of languages, as it was the first to include the complete set of features that we now associate with the name “ML” (i.e., polymorphic type inference, datatypes with pattern matching, modules, exceptions, and mutable state). Standard ML, and the ML family of languages, have had enormous influence on the world of programming language design and theory. ML is the foremost exemplar of a functional programming language with strict evaluation (call-by-value) and static typing. The use of parametric polymorphism in its type system, together with the automatic inference of such types, has influenced a wide variety of modern languages (where polymorphism is often referred to as generics). It has popularized the idea of datatypes with associated case analysis by pattern matching. The module system of Standard ML extends the notion of type-level parameterization to large-scale programming with the notion of parametric modules, or functors. Standard ML also set a precedent by being a language whose design included a formal definition with an associated metatheory of mathematical proofs (such as soundness of the type system). A formal definition was one of the explicit goals from the beginning of the project. While some previous languages had rigorous definitions, these definitions were not integral to the design process, and the formal part was limited to the language syntax and possibly dynamic semantics or static semantics, but not both. The paper covers the early history of ML, the subsequent efforts to define a standard ML language, and the development of its major features and its formal definition. 
We also review the impact that the language had on programming-language research.
Chapter
Linguistic reflection is defined as the ability of a running program to generate new program fragments and to integrate these into its own execution. This is the basis for system evolution which itself is necessary to achieve adequate persistent application system (PAS) longevity. For safety reasons only strongly typed reflection has been investigated in the FIDE project.
Chapter
The aim of this paper is to present a unitary model of some of the mental processes involved in speaking and listening. The model is cast in the form of a computer program, but the program is not intended as a contribution to either Artificial Intelligence or Computer Simulation. We regard it as an exercise in programmatic psychology, and perhaps a few words are necessary in order to explain the nature of this claim. On the one hand, proponents of Artificial Intelligence (AI) aim to develop machines capable of intelligent behaviour and, in particular, to devise computer programs capable of such tasks as interpreting visual scenes, proving theorems, and understanding natural language. Although the methods implemented in these programs are likely to interest a psychologist, any resemblance to human performance may be entirely coincidental. AI is concerned with intelligence in general, not merely its embodiment in living organisms. On the other hand, proponents of Computer Simulation aim to understand human behaviour by simulating it.
Chapter
POP-2 is a programming language designed by R. M. Burstall and R. J. Popplestone at Edinburgh University during the late 1960s. As a language intended for use in artificial intelligence research, POP-2 is conversational and provides extensive facilities for non-numerical as well as numerical applications.
Chapter
The aim of this talk is to make precise certain parallels between two types of automata: robots and computers.
Thesis
This thesis describes the results of two studies in computational logic. The first concerns a very efficient method of implementing resolution theorem provers. The second concerns a non-resolution program which automatically proves many theorems about LISP functions, using structural induction. In Part 1, a method of representing clauses, called 'structure sharing' is presented. In this representation, terms are instantiated by binding their variables on a stack, or in a dictionary, and derived clauses are represented in terms of their parents. This allows the structure representing a clause to be used in different contexts without renaming its variables or copying it in any way. The amount of space required for a clause is (2 + n) 36-bit words, where n is the number of components in the unifying substitution made for the resolution or factor. This is independent of the number of literals in the clause and the depth of function nesting. Several ways of making the unification algorithm more efficient are presented. These include a method of preprocessing the input terms so that the unifying substitution for derived terms can be discovered by a recursive look-up procedure. Techniques for naturally mixing computation and deduction are presented. The structure sharing implementation of SL-resolution is described in detail. The relationship between structure sharing and programming language implementations is discussed. Part 1 concludes with the presentation of a programming language, based on predicate calculus, with structure sharing as the natural implementation. Part 2 of this thesis describes a program which automatically proves a wide variety of theorems about functions written in a subset of pure LISP. Features of this program include: The program is fully automatic, requiring no information from the user except the LISP definitions of the functions involved and the statement of the theorem to be proved. No inductive assertions are required from the user.
The program uses structural induction when required, automatically generating its own induction formulas. All relationships in the theorem are expressed in terms of user-defined LISP functions, rather than a second logical language. The system employs no built-in information about any non-primitive function. All properties required of any function involved in a proof are derived and established automatically. The program is capable of generalizing some theorems in order to prove them; in doing so, it often generates interesting lemmas. The program can write new, recursive LISP functions automatically in attempting to generalize a theorem. Finally, the program is very fast by theorem-proving standards, requiring around 10 seconds per proof.
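The central idea of structure sharing in Part 1 — terms are never copied or renamed; variables are bound in a shared dictionary and instantiation happens by dereferencing through those bindings — can be illustrated with a minimal sketch. This is not the thesis's implementation (which uses a stack and binds on 36-bit words); it is a hypothetical Python rendering of the idea, with variables as strings starting with `?` and compound terms as tuples:

```python
# Terms: variables are strings starting with '?'; compound terms are tuples
# (functor, arg1, ...). All bindings live in one shared dict, so terms are
# never copied or renamed -- the essence of structure sharing.

def walk(term, bindings):
    """Dereference a variable through the binding dictionary."""
    while isinstance(term, str) and term.startswith('?') and term in bindings:
        term = bindings[term]
    return term

def unify(a, b, bindings):
    """Unify a and b, extending the shared bindings; return None on failure."""
    a, b = walk(a, bindings), walk(b, bindings)
    if a == b:
        return bindings
    if isinstance(a, str) and a.startswith('?'):
        bindings[a] = b
        return bindings
    if isinstance(b, str) and b.startswith('?'):
        bindings[b] = a
        return bindings
    if isinstance(a, tuple) and isinstance(b, tuple) and len(a) == len(b):
        for x, y in zip(a, b):
            if unify(x, y, bindings) is None:
                return None
        return bindings
    return None

# Unifying f(?x, g(?x)) with f(a, ?y) binds ?x -> 'a' and ?y -> g(?x);
# the derived clause is represented entirely by the parents plus this dict.
env = unify(('f', '?x', ('g', '?x')), ('f', 'a', '?y'), {})
```

Note that the space cost of the derived result here is exactly the substitution dictionary, mirroring the "(2 + n) words" claim: nothing else is materialised.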
Thesis
This thesis is concerned with problems of using computers to interpret scenes from television camera pictures. In particular, it tackles the problem of interpreting the picture in terms of lines and curves, rather like an artist's line drawing. This is very time consuming if done by a single, serial processor. However, if many processors were used simultaneously it could be done much more rapidly. In this thesis the task of line and curve extraction is expressed in terms of constraints, in a form that is susceptible to parallel computation. Iterative algorithms to perform this task have been designed and tested. They are proved to be convergent and to achieve the computation specified. Some previous work on the design of properly convergent, parallel algorithms has drawn on the mathematics of optimisation by relaxation. This thesis develops the use of these techniques for applying "continuity constraints" in line and curve description. First, the constraints are imposed "almost everywhere" on the grey-tone picture data, in two dimensions. Some "discontinuities" - places where the constraints are not satisfied - remain, and they form the lines and curves required for picture interpretation. Secondly, a similar process is applied along each line or curve to segment it. Discontinuities in the angle of the tangent along the line or curve mark the positions of vertices. In each case the process is executed in parallel throughout the picture. It is shown that the specification of such a process as an optimisation problem is non-convex, and this means that an optimal solution cannot necessarily be found in a reasonable time. A method is developed for efficiently achieving a good sub-optimal solution. A parallel array processor is a large array of processor cells which can act simultaneously, throughout a picture.
A software emulator of such a processor array was coded in C, and a POP-2 based high-level language, PARAPIC, was written to drive it and used to validate the parallel algorithms developed in the thesis. It is argued that the scope, in a vision system, of parallel methods such as those exploited in this work is extensive. The implications for the design of hardware to perform low-level vision are discussed, and it is suggested that a machine consisting of fewer, more powerful cells than in a parallel array processor would execute the parallel algorithms more efficiently.
Chapter
Reflective systems allow their own structures to be altered from within. In a programming system reflection can occur in two ways: by a program altering its own interpretation or by it changing itself. Reflection has been used to facilitate the production and evolution of data and programs in database and programming language systems. This paper is concerned with a particular style of reflection, called linguistic reflection, used in compiled, strongly typed languages. Two major techniques for this have evolved: compile-time reflection and run-time reflection. These techniques are described together with a definition and anatomy of reflective systems using them. Two illustrative examples are given and the uses of type-safe reflective techniques in a database programming language context are surveyed. These include attaining high levels of genericity, accommodating changes in systems, implementing data models, optimising implementations and validating specifications.
Article
We describe an interpreter for pLucid, a member of the Lucid family of functional dataflow languages. In appearance, pLucid is similar to Landin's Iswim, except that individual variables and expressions denote streams (infinite sequences of data items), and function variables denote filters (stream-to-stream transformations). The actual data objects in pLucid (the components of streams) are those of POP2: numbers, strings, words, and lists. The 'inner syntax' (infix operations, conventions for denoting constants) is that of POP2 as well. The interpreter (which was written in C) is eductive: it uses a tagged demand-driven scheme. Demands for values in the output stream generate demands for values of other variables internal to the program. These demands, and the values returned in response, are tagged according to "time" (sequence index) and place (node in the tree of function calls). Once computed, values are stored in an associative memory (the "warehouse") in case they are demanded again later in the computation. The warehouse is periodically cleaned out using a heuristic called the "retirement plan". The heuristic is not perfect, but does not have to be: in an eductive computation, the program is not altered as in reduction. If discarded values are needed again, they can be recomputed. The pLucid interpreter performs extensive runtime checks, and error messages quote the source line containing the offending operator. A special end-of-data object permits a very simple treatment of finite (terminating) input and output. Of special interest is its interface to UNIX, which allows any system command to be used as a filter inside a pLucid program. The interpreter performs well enough for nontrivial programs to be developed and tested. These include (simple versions of) a text formatter, a distributed airline reservation system, and a full screen editor.
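The eductive scheme described here — values demanded by time index and cached in a "warehouse" that may safely be cleaned out, since anything discarded can be recomputed — can be sketched in miniature. The following is a hypothetical Python toy, not the C interpreter: the `rules` table, `demand` function, and the `fib` definition (a classic Lucid-style stream) are all illustrative assumptions:

```python
# A toy eductive evaluator: each stream variable maps a time index to a
# value; demanded values are cached in the "warehouse" keyed by
# (variable, time), so each is computed at most once -- and could be
# discarded and recomputed later without altering the program.

warehouse = {}

def demand(var, t, rules):
    """Demand the value of stream `var` at time index `t`."""
    key = (var, t)
    if key not in warehouse:
        warehouse[key] = rules[var](t, lambda v, s: demand(v, s, rules))
    return warehouse[key]

# Roughly: fib = 0 fby (1 fby (fib + next fib)), written pointwise.
rules = {
    'fib': lambda t, d: 0 if t == 0 else 1 if t == 1 else
           d('fib', t - 1) + d('fib', t - 2),
}
first_eight = [demand('fib', t, rules) for t in range(8)]
```

A demand for `fib` at time 7 fans out into demands at earlier times, each answered from the warehouse after its first computation — the tagged demand-driven pattern the abstract describes.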
Article
Elementary course work assessments at Brunel University consist of programming problems with guidelines as to the method of solution. With such an environment and a given language (Fortran) this paper provides a method whereby the set of submitted solutions ...
Article
It is argued that the problem of protection, of controlling mutual access rights to shared resources, is a topic appropriately treated as a major component of general systems theory. Although most widely studied and developed in the context of computer systems, protection models are equally applicable to biological systems, such as those involved in movement control. The paper first establishes the nature of the problem of protection in computer systems, noting that it only reaches its full potential complexity in large data-base systems with processes automatically invoked by “data interrupts”. The Graham and Denning model of protection and the concept of a “capability” are then described, and the appropriate mathematical tools for the analysis of such models discussed. A detailed model of protection is then developed, with examples of the role of algebraic, automata-theoretic, topological and modal/multi-valued logical techniques in its analysis. Finally, biological applications, general systems consequences and automatic design techniques for protection structures are discussed.
Article
This article is based on a series of special lectures delivered at University College, London, in November 1972.
Article
Describes the development of a computer program which will transcribe a live performance of a classical melody into the equivalent of standard musical notation. It is intended to embody, in computational form, a psychological theory of how Western musicians perceive the rhythmic and tonal relationships between the notes of such melodies.
Article
Here we give methods of mechanically converting programs that are easy to understand into more efficient ones, converting recursion equations using high-level operations into lower-level flowchart programs. The main transformations involved are (i) recursion removal, (ii) eliminating common subexpressions and combining loops, (iii) replacing procedure calls by their bodies, and (iv) introducing assignments which overwrite list cells no longer in use (compile-time garbage collection).
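Transformation (i), recursion removal, is the most familiar of the four. A hand-worked Python illustration of the before/after pair (the paper's procedure is mechanical; this example and the function names are only illustrative):

```python
# Recursion equation: rev([]) = [], rev(x:xs) = rev(xs) ++ [x]
def rev_rec(xs):
    if not xs:
        return []
    return rev_rec(xs[1:]) + [xs[0]]

# After recursion removal: an accumulating loop that repeatedly
# overwrites one variable instead of stacking pending calls.
def rev_loop(xs):
    acc = []
    for x in xs:
        acc = [x] + acc
    return acc

assert rev_rec([1, 2, 3]) == rev_loop([1, 2, 3]) == [3, 2, 1]
```

The loop form is the "flowchart program" of the abstract: same equation, but expressed with assignment and iteration so that no call stack is needed.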
Chapter
This paper describes some experiments in using algebraic and categorical ideas to write programs. In particular a program to compute colimits in a category given coproducts and coequalisers has been written, also one to lift such colimits to comma categories. The discussion is informal and aims to show how categorical concepts can be painlessly realised in computational practice.
Chapter
This paper presents a brief exposition of the role of various mathematical techniques in the development and utilization of resource protection structures for computers. The first section is concerned with the semantics of the problem — the distinction between protection problems in general and those whose complexity necessitates deeper theoretical treatment. The second section considers the roles of algebraic, topological, and modal/multi-valued logic techniques in the analysis of protection. Finally we give an analysis of a current protection model to illustrate the problems and techniques.
Conference Paper
The mechanism of linguistic reflection allows a programming system to generate new program fragments and incorporate them into the system. Although this ability has important applications in persistent systems, its use has been limited by the difficulty of writing reflective programs. This paper analyses the reasons for this difficulty and describes START, a hyper-text based tool for reflection which has been implemented in the Napier88 hyper-programming environment. START supports the definition of structured program generators which may contain embedded direct links to other generators and to values, locations and types extant in the persistent environment. The benefits are greater ease of understanding through a clean separation of generator components, and a safer and more efficient mechanism for communication between generator and generated code.
Conference Paper
We use the concept of evaluation up to a given threshold of information to generalize the semantics of call by value and assignment to non-discrete domains, and to define a formal semantics for generic procedures. We then prove the correctness of McCarthy's transformation of iterative programs into recursive ones provided the same threshold is used for assignment and parameter passing.
Conference Paper
A system, called GAP, which automatically produces LISP functions from example computations is described. GAP uses a knowledge of LISP programming to inductively infer the LISP function 'obviously' intended by a given 'iopair' (i.e. a single input to be presented to the function and the output which must result). The system is written in POPCORN (a CONNIVER-like extension of POP2) and represents its knowledge of LISP procedurally.
Conference Paper
Although Prolog undoubtedly has its good points, there are some tasks (such as writing a screen editor or network interface controller) for which it is not the language of choice. The most natural computational concepts [2] for these tasks are hard to reconcile with Prolog's declarative nature. Just as there is a need for even the most committed Prolog programmer to use "conventional" languages for some tasks, so too is there a need for "logic" oriented components in conventional applications programs, such as CAD systems [7] and relational databases [5]. At Sussex, the problems of integrating logic with procedural programming are being addressed by two projects. One of these [4] involves a distributed ring of processors communicating by message passing. The other project is the POPLOG system, a mixed language AI programming environment which runs on conventional hardware. This paper describes the way in which we have integrated Prolog into POPLOG.
Article
A comprehensive functional approach to interactive information systems is suggested. The pervasive use of functions in a variety of subfields ranging from data modeling and retrieval to software development and evaluation is summarized as a justification for further investigation into a comprehensive functional approach to interactive information systems. A purely functional architecture is described.
Article
This brief historical note describes research done in the period 1970-1973, and where continuations were introduced in a fairly pragmatic way together with partial evaluation in order to compile “rules” expressed as statements in first-order predicate calculus. Although the methods used at that time were quite straightforward, this work may shed some light on the early history of the concept of continuations. In particular, unlike other early contributions that addressed issues in mainstream programming languages, the present approach initially addressed implementation techniques for special-purpose languages.
Article
New directions in Artificial Intelligence research have led to the need for certain novel features to be embedded in programming languages. This paper gives an overview of the nature of these features, and their implementation in four principal families of AI languages: SAIL; PLANNER/CONNIVER; QLISP/INTERLISP; and POPLER/POP-2. The programming features described include: new data types and accessing mechanisms for stored expressions; more flexible control structures, including multiple processes and backtracking; pattern matching to allow comparison of a data item with a template, and extraction of labeled subexpressions; and deductive mechanisms which allow the programming system to carry out certain activities including modifying the data base and deciding which subroutines to run next using only constraints and guidelines set up by the programmer.
Article
Reflective systems allow their own structures to be altered from within. Here we are concerned with a style of reflection, called linguistic reflection, which is the ability of a running program to generate new program fragments and to integrate these into its own execution. In particular, we describe how this kind of reflection may be provided in the compiler-based, strongly typed object-oriented programming language Java. The advantages of the programming technique include attaining high levels of genericity and accommodating system evolution. These advantages are illustrated by an example taken from persistent programming, which shows how linguistic reflection allows functionality (program code) to be generated on demand (Just-In-Time) from a generic specification and integrated into the evolving running program. The technique is evaluated against alternative implementation approaches with respect to efficiency, safety and ease of use.
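The paper's central technique — a running program generates source for new functionality from a generic specification, compiles it, and integrates it into its own execution — is done there in Java. In the spirit of Lisp's `eval` and POP-2's `popval`, the same idea can be sketched in Python; the `join` example below loosely echoes the paper's persistent-programming example, but the field names and merge rule are illustrative assumptions, not the paper's code:

```python
# Run-time linguistic reflection in miniature: generate source code for a
# record-joining function from a field list, compile it, and call it --
# all while the program is running.

def generate_join(fields):
    """Generate and compile a join function specialised to `fields`."""
    body = ", ".join(
        f"'{f}': a['{f}'] if '{f}' in a else b['{f}']" for f in fields
    )
    src = f"def join(a, b):\n    return {{{body}}}\n"
    namespace = {}
    # compile + exec integrate the generated fragment into the running program
    exec(compile(src, '<generated>', 'exec'), namespace)
    return namespace['join']

join = generate_join(['id', 'name'])
result = join({'id': 1}, {'name': 'ada'})
```

Because `join` is generated on demand from the field list, one generic generator serves every record shape — the genericity argument the abstract makes; in a strongly typed setting such as Java the generated fragment must additionally pass the compiler's type checks before integration.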
Article
There have been several attempts to combine the programming language Prolog with procedural programming languages. One practical and useful solution is to combine the programming languages Prolog and POP-11 into one environment called POPLOG. The author of this paper has implemented a version of the AI multilanguage environment POPLOG, called McPOPLOG (McMaster version of POPLOG). This paper first describes the chief attributes of the implemented system McPOPLOG, and then focuses on the means of communication between POP-11 and Prolog in McPOPLOG.
Article
The use of several levels of abstraction has proved to be very helpful in constructing and maintaining programs. When programs are designed with abstract data types such as sets and lists, programmer time can be saved by automating the process of filling in low-level implementation details. In the past, programming systems have provided only a single general purpose implementation for an abstract type. Thus the programs produced using abstract types were often inefficient in space or time. In this paper a system for automatically choosing efficient implementations for abstract types from a library of implementations is discussed. This process is discussed in detail for an example program. General issues in data structure selection are also reviewed.
Article
A scheme is described that allows languages supporting higher order functions to be efficiently implemented using a standard run-time stack. A machine for evaluating typed lambda expressions is first constructed. In essence, the machine simply extends the standard SECD machine to allow partial application of abstractions representing multiadic functions. Some transformations of these typed lambda expressions into a so-called simple form are then described. Evaluation of simple-form expressions on the extended SECD machine produces stacklike environment structures rather than the treelike structures produced by expressions of arbitrary form. This allows implementation of the machine using a standard runtime stack. The SECD machine is then further modified so that closures are applied "in situ" rather than returned as values. The order of reduction is also changed so that the evaluation of function-valued expressions is deferred until they can be applied to sufficient arguments to allow reduction to nonfunctional values. It is shown that this function-deferring machine can be implemented using a standard run-time stack and thus can evaluate arbitrary lambda expressions without prior transformation to simple form. Finally, application of the above schemes to standard programming languages, such as ALGOL, Pascal, Ada, and LISP, is considered.
Article
Proposals have been made for fast string search algorithms that can search a string for a given pattern without having to examine every character of the text passed over. These algorithms are effective only if search patterns are in practice long enough, and enough text is traversed in a search to justify the costs of initializing the sophisticated search algorithm. We report observations of routine uses of an editor instrumented to determine the characteristics of use. It appears that many searches are indeed for patterns of 3, 5 or more characters, and that these searches cover substantial amounts of text. However, searches for patterns of a single character are very common, while only moving a short distance on average. The implementation of a sophisticated search algorithm must take care to deal with short single-character searches efficiently.
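The recommendation — use a skip-table search for long patterns, but keep single-character searches cheap — suggests a dispatch like the following. This is a sketch of one such skip-based algorithm (Boyer-Moore-Horspool), not the instrumented editor's code:

```python
def find(text, pattern):
    """Return the index of pattern in text, or -1. Uses a Horspool skip
    search for patterns of length >= 2, and a plain scan for a single
    character, where building a skip table would cost more than it saves."""
    m = len(pattern)
    if m == 0:
        return 0
    if m == 1:
        return text.find(pattern)   # the common, short-distance case
    # Skip table: how far we may shift on a mismatch, keyed by the text
    # character aligned with the pattern's last position.
    shift = {c: m - i - 1 for i, c in enumerate(pattern[:-1])}
    i = 0
    while i + m <= len(text):
        if text[i:i + m] == pattern:
            return i
        i += shift.get(text[i + m - 1], m)
    return -1
```

For long patterns the `shift.get(..., m)` step lets the search leap past characters it never examines — exactly the saving the abstract says only pays off when patterns and traversed text are long enough.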
Article
The concept of suspended evaluation is used as an approach to co-routines. Problems from the literature involving infinite data structures are solved in a LISP-like applicative language to demonstrate that simple new semantics can enrich old and ‘friendly’ control structures. It appears that the very nature of these problems draws control structure and data structure together, so that issues of style may be studied at once for both.
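Suspended evaluation as described here — delaying the rest of a data structure until a value is demanded — maps naturally onto Python generators, which suspend at each `yield`. A sketch of one of the classic infinite-structure problems (the stream of integers and a stream transformer; the names are illustrative):

```python
# Suspended evaluation with generators: the tail of each stream is a
# suspended computation, resumed only when a value is demanded.
from itertools import islice

def integers_from(n):
    """The infinite stream n, n+1, n+2, ..."""
    while True:
        yield n
        n += 1

def stream_map(f, s):
    """Apply f elementwise to a (possibly infinite) stream."""
    for x in s:
        yield f(x)

evens = stream_map(lambda x: 2 * x, integers_from(0))   # infinite, unevaluated
first_five = list(islice(evens, 5))                     # demand five values
```

The coroutine-like flavour is visible in the control flow: each demand for an element transfers control into `stream_map`, which in turn resumes `integers_from`, then both suspend again — data structure and control structure intertwined, as the abstract observes.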
Article
Two primitives for structured programming are introduced. The primitives allow a generalized procedure entry and return similar to the ‘loop’ and ‘break’ statements found in many algorithmic languages for control in repetitive commands. Examples are given and the practicality of the primitives especially for interactive programming is stressed. Finally, the detailed implementation of the primitives is discussed; they may be implemented as procedures within an existing language.
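The generalized procedure entry and return described here can be approximated in languages without such primitives by a labelled non-local exit. A hypothetical Python sketch (the `Exit` class and `search` example are illustrative, not the paper's primitives), using an exception to "break" out of arbitrarily nested calls the way `break` exits a loop:

```python
# A generalized procedure "break": an exception class carries a value out
# of arbitrarily nested calls, mimicking a non-local return.
class Exit(Exception):
    def __init__(self, value):
        self.value = value

def search(tree, target):
    """Return the index path to target in a nested-list tree, or None."""
    def walk(node, path):
        if node == target:
            raise Exit(path)        # generalized return: unwind to search()
        if isinstance(node, list):
            for i, child in enumerate(node):
                walk(child, path + [i])
    try:
        walk(tree, [])
        return None
    except Exit as e:
        return e.value

path = search([1, [2, [3, 4]], 5], 4)
```

As the abstract notes for its primitives, this mechanism can be packaged as ordinary procedures within an existing language rather than requiring new syntax.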
Article
In designing a POP-2 compiler for the CDC 6000 series machines, it became apparent that there was a requirement to link to precompiled FORTRAN subprograms in order to use the advantages of FORTRAN and to make use of the extensive libraries available. This paper describes the design considerations, the principal problems involved and the subsequent implementation of the POP-2/FORTRAN system. It is concluded that the general design aims were successfully achieved, that is: the system is primarily a POP-2 system, calls to FORTRAN subprograms have exactly the same syntax as calls to POP-2 functions, and the restrictions on FORTRAN subprograms are minimal. The space overhead of having the FORTRAN facility unused is about 10 per cent.
Article
Stream processing is a term that is used widely in the literature to describe a variety of systems. We present an overview of the historical development of stream processing and a detailed discussion of the different languages and techniques for programming with streams that can be found in the literature. This includes an analysis of dataflow, specialized functional and logic programming with streams, reactive systems, signal processing systems, and the use of streams in the design and verification of hardware. The aim of this survey is an analysis of the development of each of these specialized topics to determine if a general theory of stream processing has emerged. As such, we discuss and classify the different classes of stream processing systems found in the literature from the perspective of programming primitives, implementation techniques, and computability issues, including a comparison of the semantic models that are used to formalize stream-based computation.