
Algebraic Operations and Generic Effects

Authors: Gordon Plotkin, John Power

Abstract

Given a complete and cocomplete symmetric monoidal closed category V and a symmetric monoidal V-category C with cotensors and a strong V-monad T on C, we investigate axioms under which an Ob C-indexed family of operations of the form α_x : (Tx)^v → (Tx)^w provides semantics for algebraic operations on the computational λ-calculus. We recall a definition for which we have elsewhere given adequacy results, and we show that an enrichment of it is equivalent to a range of other possible natural definitions of algebraic operation. In particular, we define the notion of generic effect and show that to give a generic effect is equivalent to giving an algebraic operation. We further show how the usual monadic semantics of the computational λ-calculus extends uniformly to incorporate generic effects. We outline examples and non-examples and we show that our definition also enriches one for call-by-name languages with effects.
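A minimal Haskell rendering of one of the standard examples may help fix intuitions (our sketch, not taken from the paper): for the finite-nondeterminism monad, modelled here by Haskell lists, binary choice is an algebraic operation of the form or_x : (Tx)^2 → Tx, and the corresponding generic effect is the single computation arb : T(Bool); each determines the other via the monad structure.

-- The algebraic operation: nondeterministic binary choice on the list monad.
orOp :: [a] -> [a] -> [a]
orOp p q = p ++ q

-- The corresponding generic effect: a computation that nondeterministically
-- returns a Boolean.
arb :: [Bool]
arb = orOp (return True) (return False)

-- Recovering the operation from the generic effect: run arb, then branch on
-- its result using the monad's bind.
orFromArb :: [a] -> [a] -> [a]
orFromArb p q = do b <- arb; if b then p else q

-- orFromArb agrees with orOp, e.g. orFromArb [1,2] [3] == [1,2,3].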
... We start from the recent work by Abadi and Plotkin [2], where a monadic [22] approach to choice-based programming is developed. There, the authors show how the so-called selection monad [14,16,15] can be used to model (and to give semantics to) programs written in a choice-based language as effectful programs, and how choices can be managed as algebraic operations [27,26,28], delegating the task of resolving these choices to whoever implements the selection mechanism. In this work, we are concerned with developing this idea further, although in a different direction. ...
... To the best of the authors' knowledge, the first work dealing with semantic foundations for choice-based programming languages is the recent work by Abadi and Plotkin [2]. There, a modular approach to higher-order choice-based programming languages is developed in terms of the selection monad [14,16,15] and its algebraic operations [27,26,28], both at the level of operational and denotational semantics. As such, that work is the closest to ours. ...
... We have explained in Section 3 why, when dealing with reinforcement learning, moving from the selection to the state monad is practically beneficial. From a theoretical point of view, such a shift can be seen as obtained by looking at comodels [30] of choice operations, which directly leads to viewing reinforcement learning algorithms as ultimately defining stateful runners [38,4] for such operations, and thus as generic effects [26] for the state monad. Following this path, we have consequently relied on handlers [29,5,31] to define modular implementations of such generic effects, and thus of reinforcement learning algorithms. ...
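The correspondence mentioned in the snippet above can be made concrete for the state monad; the following Haskell sketch (our own illustration, using the State type from the transformers library) shows get and put as the generic effects corresponding to the algebraic lookup and update operations of global state.

import Control.Monad.Trans.State (State, state, runState)

-- Generic effects for the state monad T a = s -> (a, s).
get' :: State s s
get' = state (\s -> (s, s))

put' :: s -> State s ()
put' s = state (\_ -> ((), s))

-- The same effects in algebraic-operation form: lookup takes an s-indexed
-- family of continuations and uses the current state to select one of them;
-- update takes a new state and a single continuation.
lookupOp :: (s -> State s a) -> State s a
lookupOp k = get' >>= k

updateOp :: s -> State s a -> State s a
updateOp s m = put' s >> m

-- For example, runState (updateOp 1 (lookupOp (\s -> return (s + 1)))) 0
-- evaluates to (2, 1).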
Preprint
Full-text available
We study algebraic effects and handlers as a way to support decision-making abstractions in functional programs, whereby a user can ask a learning algorithm to resolve choices without implementing the underlying selection mechanism, and can give feedback by way of rewards. Differently from a recently proposed approach to the problem based on the selection monad [Abadi and Plotkin, LICS 2021], we express the underlying intelligence as a reinforcement learning algorithm implemented as a set of handlers for some of these algebraic operations, including those for choices and rewards. We show how, in practice, algebraic operations and handlers -- as available in the programming language EFF -- can be used to cleanly separate the learning algorithm from its environment, thus allowing for a good level of modularity. We then show how the host language can be taken to be a lambda-calculus with handlers, thereby identifying the essential linguistic features. We conclude by hinting at how type and effect systems could ensure safety properties, and by pointing out some directions for further work.
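The following Haskell sketch (ours; the paper itself works in Eff, and the names below are hypothetical) illustrates the separation the abstract describes: choices and rewards are operations of a free monad, and the "intelligence" lives entirely in a handler. The handler here is a deliberately naive stand-in for a learning algorithm: it explores both resolutions of every choice and keeps the run with the higher total reward.

{-# LANGUAGE DeriveFunctor #-}

-- Signature of the two operations: a Boolean choice to be resolved by the
-- environment, and a reward reported back to it.
data ChoiceRewardF k
  = Choose (Bool -> k)
  | Reward Double k
  deriving Functor

data Free f a = Pure a | Op (f (Free f a))

instance Functor f => Functor (Free f) where
  fmap g (Pure a) = Pure (g a)
  fmap g (Op fa)  = Op (fmap (fmap g) fa)

instance Functor f => Applicative (Free f) where
  pure = Pure
  Pure g <*> x = fmap g x
  Op fg  <*> x = Op (fmap (<*> x) fg)

instance Functor f => Monad (Free f) where
  Pure a >>= k = k a
  Op fa  >>= k = Op (fmap (>>= k) fa)

choose :: Free ChoiceRewardF Bool
choose = Op (Choose Pure)

reward :: Double -> Free ChoiceRewardF ()
reward r = Op (Reward r (Pure ()))

-- A toy handler standing in for the learning algorithm: try both branches of
-- every choice and return the result of the run with the highest total reward.
handleGreedy :: Free ChoiceRewardF a -> (a, Double)
handleGreedy (Pure a)          = (a, 0)
handleGreedy (Op (Reward r k)) = let (a, r') = handleGreedy k in (a, r + r')
handleGreedy (Op (Choose k))   =
  let left  = handleGreedy (k True)
      right = handleGreedy (k False)
  in if snd left >= snd right then left else right

-- The program only asks for choices and reports rewards; the handler decides.
example :: Free ChoiceRewardF String
example = do
  b <- choose
  if b then do reward 1.0; return "went left"
       else do reward 2.0; return "went right"

-- handleGreedy example == ("went right", 2.0)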
... He develops an axiomatization for quantum computation using this framework, which he shows to be fully complete. Staton then extracts an equational theory for a quantum programming language from his algebraic theory that uses generic effects rather than algebraic operations [42]. Finally, Staton remarks upon a variant of his theory [53, §6.2] that applies to the QRAM model, where instead of working with qubits, we work with references to qubits. ...
... We can easily translate the quantum-specific fragment of λ_Q# to Staton's quantum programming language [53, §5]. Note that the generic effects of his quantum language are equivalent to the algebraic operations (of the algebraic theory) [42]: apply U (e) ≡ gateap_U^{2^n}(e) ...
Preprint
Full-text available
Q# is a standalone domain-specific programming language from Microsoft for writing and running quantum programs. Like most industrial languages, it was designed without a formal specification, which can naturally lead to ambiguity in its interpretation. We aim to provide a formal language definition for Q#, placing the language on a solid mathematical foundation and enabling further evolution of its design and type system. This paper presents $\lambda_{Q\#}$, an idealized version of Q# that illustrates how we may view Q# as a quantum Algol (algorithmic language). We show the safety properties enforced by $\lambda_{Q\#}$'s type system and present its equational semantics based on a fully complete algebraic theory by Staton.
... Algebraic effects and handlers (Plotkin and Power, 2003) have emerged as a convenient, modular abstraction for controlling computational effects. ...
... Algebraic effects and their handlers have emerged as a convenient way to control impure behaviour. They are built upon a strong mathematical foundation (Plotkin and Power, 2003), their semantics can be precisely defined (Kammar et al., 2013), and in some cases they provide an improvement in runtime complexity compared to pure languages (Hillerström et al., 2020). While providing a useful formal framework for reasoning about side effects, algebraic effects and handlers have also proved to be an increasingly convenient modular abstraction that has been adopted across many disciplines. ...
Preprint
Full-text available
Probabilistic programming is a growing area that strives to make statistical analysis more accessible, by separating probabilistic modelling from probabilistic inference. In practice this decoupling is difficult. No single inference algorithm can be used as a probabilistic programming back-end that is simultaneously reliable, efficient, black-box, and general. Probabilistic programming languages often choose a single algorithm to apply to a given problem, thus inheriting its limitations. While substantial work has been done both to formalise probabilistic programming and to improve efficiency of inference, there has been little work that makes use of the available program structure, by formally analysing it, to better utilise the underlying inference algorithm. This dissertation presents three novel techniques (both static and dynamic), which aim to improve probabilistic programming using program analysis. The techniques analyse a probabilistic program and adapt it to make inference more efficient, sometimes in a way that would have been tedious or impossible to do by hand.
Chapter
When modelling side-effects using a monad, we need to equip the monad with effectful operations. This can be done by noting that each algebra of the monad carries interpretations of the desired operations. We consider the analogous situation for graded monads, which are a generalization of monads that enable us to track quantitative information about side-effects. Grading makes a significant difference: while many graded monads of interest can be equipped with similar operations, the algebras often cannot. We explain where these operations come from for graded monads. To do this, we introduce the notion of flexibly graded monad, for which the situation is similar to the situation for ordinary monads. We then show that each flexibly graded monad induces a canonical graded monad in such a way that operations for the flexibly graded monad carry over to the graded monad. In doing this, we reformulate grading in terms of locally graded categories, showing in particular that graded monads are a particular kind of relative monad. We propose that locally graded categories are a useful setting for work on grading in general.
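As a small illustration of grading (ours, not taken from the chapter): a graded monad in Haskell indexed by type-level naturals, where return has grade 0, bind adds grades, and the single operation tick has grade 1, so the grade of a computation records how many ticks it performs.

{-# LANGUAGE DataKinds, KindSignatures, TypeOperators #-}

import Data.Kind (Type)
import GHC.TypeLits (Nat, type (+))

-- A graded monad over the grading monoid (Nat, +, 0): return has the unit
-- grade and bind combines the grades of the two computations.
class GradedMonad (m :: Nat -> Type -> Type) where
  gret  :: a -> m 0 a
  gbind :: m i a -> (a -> m j b) -> m (i + j) b

-- A degenerate carrier whose grade is purely static (a phantom index).
newtype Counted (n :: Nat) a = Counted a

instance GradedMonad Counted where
  gret = Counted
  gbind (Counted a) k = case k a of Counted b -> Counted b

-- The graded operation: a single tick has grade 1.
tick :: Counted 1 ()
tick = Counted ()

-- The grade is tracked by the type system: 1 + (1 + 0) reduces to 2.
twoTicks :: Counted 2 ()
twoTicks = tick `gbind` \_ -> tick `gbind` \_ -> gret ()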
Article
The recently introduced Perceus algorithm can automatically insert reference count instructions such that the resulting (cycle-free) program is garbage free: objects are freed at the very moment they can no longer be referenced. An important extension is reuse analysis. This optimization pairs objects of known size with fresh allocations of the same size and tries to reuse the object in-place at runtime if it happens to be unique. Unfortunately, current implementations of reuse analysis are fragile with respect to small program transformations, or can cause an arbitrary increase in the peak heap usage. We present a novel drop-guided reuse algorithm that is simpler and more robust than previous approaches. Moreover, we generalize the linear resource calculus to precisely characterize garbage-free and frame-limited evaluations. On each function call, a frame-limited evaluation may hold on to memory longer if the size is bounded by a constant factor. Using this framework we show that our drop-guided reuse is frame-limited and find that an implementation of our new reuse approach in Koka can provide significant speedups.
Article
Monadic computations built by interpreting, or handling, operations of a free monad are a compelling formalism for modeling language semantics and defining the behaviors of effectful systems. The resulting layered semantics offer the promise of modular reasoning principles based on the equational theory of the underlying monads. However, there are a number of obstacles to using such layered interpreters in practice. With more layers comes more boilerplate and glue code needed to define the monads and interpreters involved. That overhead is compounded by the need to define and justify the relational reasoning principles that characterize the equivalences at each layer. This paper addresses these problems by significantly extending the capabilities of the Coq interaction trees (ITrees) library, which supports layered monadic interpreters. We characterize a rich class of interpretable monads, obtained by applying monad transformers to ITrees, and show how to generically lift interpreters through them. We also introduce a corresponding framework for relational reasoning about "equivalence of monads up to a relation R". This collection of typeclasses, instances, new reasoning principles, and tactics greatly generalizes the existing theory of the ITree library, eliminating large amounts of unwieldy boilerplate code and dramatically simplifying proofs.
Article
Probabilistic programming languages (PPLs) allow programmers to construct statistical models and then simulate data or perform inference over them. Many PPLs restrict models to a particular instance of simulation or inference, limiting their reusability. In other PPLs, models are not readily composable. Using Haskell as the host language, we present an embedded domain specific language based on algebraic effects, where probabilistic models are modular, first-class, and reusable for both simulation and inference. We also demonstrate how simulation and inference can be expressed naturally as composable program transformations using algebraic effect handlers.
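The following is our own minimal sketch in a similar spirit (hypothetical names, and a tagless-final encoding rather than the paper's algebraic-effects library): a probabilistic model is written once against an abstract interface of operations and is then reused both for exact enumeration and for simulation against a supplied stream of uniform random numbers.

-- The abstract interface: a single probabilistic operation.
class Monad m => ProbOps m where
  bernoulli :: Double -> m Bool

-- One model, written once against the interface.
model :: ProbOps m => m Int
model = do
  x <- bernoulli 0.5
  y <- bernoulli 0.9
  return (fromEnum x + fromEnum y)

-- Interpretation 1: exact "inference" by weighted enumeration.
newtype Enumerate a = Enumerate { runEnumerate :: [(a, Double)] }

instance Functor Enumerate where
  fmap f (Enumerate xs) = Enumerate [ (f a, w) | (a, w) <- xs ]
instance Applicative Enumerate where
  pure a = Enumerate [(a, 1)]
  Enumerate fs <*> Enumerate xs =
    Enumerate [ (f a, v * w) | (f, v) <- fs, (a, w) <- xs ]
instance Monad Enumerate where
  Enumerate xs >>= k =
    Enumerate [ (b, w * v) | (a, w) <- xs, (b, v) <- runEnumerate (k a) ]
instance ProbOps Enumerate where
  bernoulli p = Enumerate [(True, p), (False, 1 - p)]

-- Interpretation 2: simulation, consuming a stream of uniform samples
-- (assumed to be long enough for the model).
newtype Sim a = Sim { runSim :: [Double] -> (a, [Double]) }

instance Functor Sim where
  fmap f (Sim g) = Sim (\rs -> let (a, rs') = g rs in (f a, rs'))
instance Applicative Sim where
  pure a = Sim (\rs -> (a, rs))
  Sim gf <*> Sim gx =
    Sim (\rs -> let (f, rs1) = gf rs; (a, rs2) = gx rs1 in (f a, rs2))
instance Monad Sim where
  Sim g >>= k = Sim (\rs -> let (a, rs') = g rs in runSim (k a) rs')
instance ProbOps Sim where
  bernoulli p = Sim (\(r:rs) -> (r < p, rs))

-- runEnumerate model lists the weighted outcomes of the model, while
-- fst (runSim model [0.3, 0.95]) simulates a single run (here returning 1).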
Article
A large class of monads used to model computational effects have natural presentations by operations and equations, for example, the list monad can be presented by a constant and a binary operation subject to unitality and associativity. Graded monads are a generalization of monads that enable us to track quantitative information about the effects being modelled. Correspondingly, a large class of graded monads can be presented using an existing notion of graded presentation. However, the existing notion has some deficiencies, in particular many effects do not have natural graded presentations. We introduce a notion of flexibly graded presentation that does not suffer from these issues, and develop the associated theory. We show that every flexibly graded presentation induces a graded monad equipped with interpretations of the operations of the presentation, and that all graded monads satisfying a particular condition on colimits have a flexibly graded presentation. As part of this, we show that the usual algebra-preserving correspondence between presentations and a class of monads transfers to an algebra-preserving correspondence between flexibly graded presentations and a class of flexibly graded monads.
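For concreteness (our rendering of the example mentioned above, in the ungraded case), the presentation of the list monad consists of a constant e and a binary operation · subject to unitality and associativity,

\[ e \cdot x = x, \qquad x \cdot e = x, \qquad (x \cdot y) \cdot z = x \cdot (y \cdot z), \]

and the free algebra on a set X for this presentation is the set of finite lists over X, with e interpreted as the empty list and · as concatenation.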
Article
Reasoning about the use of external resources is an important aspect of many practical applications. Effect systems enable tracking such information in types, but at the cost of complicating signatures of common functions. Capabilities coupled with escape analysis offer safety and natural signatures, but are often overly coarse grained and restrictive. We present System C, which builds on and generalizes ideas from type-based escape analysis and demonstrates that capabilities and effects can be reconciled harmoniously. By assuming that all functions are second class, we can admit natural signatures for many common programs. By introducing a notion of boxed values, we can lift the restrictions of second-class values at the cost of needing to track degree-of-impurity information in types. The system we present is expressive enough to support effect handlers in full capacity. We practically evaluate System C in an implementation and prove its soundness.
Article
Effect systems have been a subject of active research for nearly four decades, with the most notable practical example being checked exceptions in programming languages such as Java. While many exception systems support abstraction, aggregation, and hierarchy (e.g., via class declaration and subclassing mechanisms), it is rare to see such expressive power in more generic effect systems. We designed an effect system around the idea of protecting system resources and incorporated our effect system into the Wyvern programming language. Similar to type members, a Wyvern object can have effect members that can abstract lower-level effects, allow for aggregation, and have both lower and upper bounds, providing for a granular effect hierarchy. We argue that Wyvern’s effects capture the right balance of expressiveness and power from the programming language design perspective. We present a full formalization of our effect-system design, showing that it allows reasoning about authority and attenuation. Our approach is evaluated through a security-related case study.
Article
Full-text available
We begin to develop a unified account of modularity for computational effects. We use the notion of enriched Lawvere theory, together with its relationship with strong monads, to reformulate Moggi's paradigm for modelling computational effects; we emphasise the importance here of the operations that induce computational effects. Effects qua theories are then combined by appropriate bifunctors (on the category of theories). We give a theory of the commutative combination of effects, which in particular yields Moggi's side-effects monad transformer (an application is the combination of side-effects with nondeterminism). And we give a theory for the sum of computational effects, which in particular yields Moggi's exceptions monad transformer (an application is the combination of exceptions with other effects).
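The two combinations described above can be illustrated (our sketch, at the level of Haskell monad transformers rather than enriched Lawvere theories) with the transformers library: StateT is Moggi's side-effects monad transformer, and applying it to the list monad gives the commutative combination of side-effects with nondeterminism; ExceptT is Moggi's exceptions monad transformer, and applying it to any monad gives the sum of exceptions with that monad's effects.

import Control.Monad.Trans.State  (StateT, runStateT, get, put)
import Control.Monad.Trans.Except (ExceptT, runExceptT, throwE)
import Control.Monad.Trans.Class  (lift)

-- Side-effects combined with nondeterminism: StateT Int [] a ~ Int -> [(a, Int)].
-- Nondeterministically bump the counter by 1 or by 2.
bump :: StateT Int [] ()
bump = do
  n <- get
  d <- lift [1, 2]
  put (n + d)
-- runStateT bump 0 == [((), 1), ((), 2)]

-- Exceptions summed with the effects beneath them:
-- fail if the counter is negative, otherwise return it.
checked :: ExceptT String (StateT Int []) Int
checked = do
  n <- lift get
  if n < 0 then throwE "negative counter" else return n
-- runStateT (runExceptT checked) 3 == [(Right 3, 3)]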
Book
To construct a compiler for a modern higher-level programming language, one needs to structure the translation to a machine-like intermediate language in a way that reflects the semantics of the language. Little is said about such structuring in compiler texts that are intended to cover a wide variety of programming languages. More is said in the literature on semantics-directed compiler construction [1], but here too the viewpoint is very general (though limited to languages with a finite number of syntactic types). On the other hand, there is a considerable body of work using the continuation-passing transformation to structure compilers for the specific case of call-by-value languages such as SCHEME and ML [2, 3]. In this paper, we will describe a method of structuring the translation of ALGOL-like languages that is based on the functor-category semantics developed by Reynolds [4] and Oles [5, 6]. An alternative approach using category theory to structure compilers is the early work of F. L. Morris [7], which anticipates our treatment of boolean expressions but does not deal with procedures. 2 Types and Syntax. An ALGOL-like language is a typed lambda calculus with an unusual repertoire of primitive types. Throughout most of this paper we assume that the primitive types are comm(and), int(eger)exp(ression), int(eger)acc(eptor), int(eger)var(iable), and that the set θ of types is the least set containing these primitive types and closed under the binary operation →.
Article
Much of theoretical computer science is based on use of inductive complete partially ordered sets (or ipos). The aim of this thesis is to extend this successful theory to make it applicable to probabilistic computations. The method is to construct a "probabilistic powerdomain" on any ipo to represent the outcome of a probabilistic program which has outputs in the original ipo. In this thesis it is shown that evaluations (functions which assign a probability to open sets, subject to various conditions) form such a powerdomain. Further, the powerdomain is a monadic functor on the category Ipo. For restricted classes of ipos a powerdomain of probability distributions, or measures which only take values less than one, has been constructed (by Saheb-Djahromi). In the thesis we show that this powerdomain may be constructed for continuous ipos, where it is isomorphic to that of evaluations. The powerdomain of evaluations is shown to have a simple Stone-type duality between it and sets of upper continuous functions. This is then used to give a Hoare-style logic for an imperative probabilistic language, which is the dual of the probabilistic semantics. Finally the powerdomain is used to give a denotational semantics of a probabilistic metalanguage which is an extension of Moggi's λc-calculus for the powerdomain monad. This semantics is then shown to be equivalent to an operational semantics.
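For reference (our summary of the now-standard formulation, not quoted from the thesis), an evaluation assigns to each Scott-open set U of the ipo a value ν(U) ∈ [0, 1] such that

\[ \nu(\emptyset) = 0, \qquad U \subseteq V \Rightarrow \nu(U) \le \nu(V), \qquad \nu(U) + \nu(V) = \nu(U \cup V) + \nu(U \cap V), \]

together with Scott-continuity, \(\nu(\bigcup_i U_i) = \sup_i \nu(U_i)\), for directed families of open sets.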
Book
I. Categories, Functors and Natural Transformations.- 1. Axioms for Categories.- 2. Categories.- 3. Functors.- 4. Natural Transformations.- 5. Monics, Epis, and Zeros.- 6. Foundations.- 7. Large Categories.- 8. Hom-sets.- II. Constructions on Categories.- 1. Duality.- 2. Contravariance and Opposites.- 3. Products of Categories.- 4. Functor Categories.- 5. The Category of All Categories.- 6. Comma Categories.- 7. Graphs and Free Categories.- 8. Quotient Categories.- III. Universals and Limits.- 1. Universal Arrows.- 2. The Yoneda Lemma.- 3. Coproducts and Colimits.- 4. Products and Limits.- 5. Categories with Finite Products.- 6. Groups in Categories.- IV. Adjoints.- 1. Adjunctions.- 2. Examples of Adjoints.- 3. Reflective Subcategories.- 4. Equivalence of Categories.- 5. Adjoints for Preorders.- 6. Cartesian Closed Categories.- 7. Transformations of Adjoints.- 8. Composition of Adjoints.- V. Limits.- 1. Creation of Limits.- 2. Limits by Products and Equalizers.- 3. Limits with Parameters.- 4. Preservation of Limits.- 5. Adjoints on Limits.- 6. Freyd's Adjoint Functor Theorem.- 7. Subobjects and Generators.- 8. The Special Adjoint Functor Theorem.- 9. Adjoints in Topology.- VI. Monads and Algebras.- 1. Monads in a Category.- 2. Algebras for a Monad.- 3. The Comparison with Algebras.- 4. Words and Free Semigroups.- 5. Free Algebras for a Monad.- 6. Split Coequalizers.- 7. Beck's Theorem.- 8. Algebras are T-algebras.- 9. Compact Hausdorff Spaces.- VII. Monoids.- 1. Monoidal Categories.- 2. Coherence.- 3. Monoids.- 4. Actions.- 5. The Simplicial Category.- 6. Monads and Homology.- 7. Closed Categories.- 8. Compactly Generated Spaces.- 9. Loops and Suspensions.- VIII. Abelian Categories.- 1. Kernels and Cokernels.- 2. Additive Categories.- 3. Abelian Categories.- 4. Diagram Lemmas.- IX. Special Limits.- 1. Filtered Limits.- 2. Interchange of Limits.- 3. Final Functors.- 4. Diagonal Naturality.- 5. Ends.- 6. Coends.- 7. Ends with Parameters.- 8. Iterated Ends and Limits.- X. Kan Extensions.- 1. Adjoints and Limits.- 2. Weak Universality.- 3. The Kan Extension.- 4. Kan Extensions as Coends.- 5. Pointwise Kan Extensions.- 6. Density.- 7. All Concepts are Kan Extensions.- Table of Terminology.
Chapter
Without Abstract