Programme
The schedule consists of regular talks, invited talks (by either CALCO or MFPS), and joint invited talks (by CALCO and MFPS together). For regular papers with multiple authors, the presenting author is indicated by italic text (as far as is known). Click on the respective local time for a conversion into your local timezone.
The pre-proceedings for CALCO are available here.
Days: | Monday (Jun 16) | Tuesday (Jun 17) | Wednesday (Jun 18) | Thursday (Jun 19) | Friday (Jun 20) |
Monday, June 16, 2025
8:30
- | 8:55 |
Registration & Coffee
|
8:55
- | 9:00 |
Opening
|
Chair: Alexander Knapp | |
9:00
- | 10:00 |
Invited Talk:
Logic Enriched Over a Quantale
Abstract Many-valued logics have a long history in mathematical logic as well as
in applications to the semantics of programming languages and to
engineering more generally. Typically these logics are rich with
features motivated by the particular applications they stem from. In his
1973 article “Metric Spaces, Generalized Logic, and Closed Categories”,
Lawvere argued that any quantale Ω gives rise to a
generalized Ω-valued logic that has as its models the categories enriched
over the quantale. This suggests developing a uniform framework for
many-valued logics parameterized in a quantale. In this talk we will
review some previous and ongoing work in that direction, drawing on
previous work co-authored with Octavian Babus, Adriana Balan, Marta
Bilkova, Fredrik Dahlqvist, Daniela Petrisan, Wolfgang Poiger, Bruno
Teheux and Jiri Velebil as well as on ongoing work with Giuseppe Greco
and Apostolos Tzimoulis.
|
10:00
- | 10:30 |
Break and Registration
|
CALCO: Foundations | |
Chair: Clemens Kupke | |
10:30
- | 11:00 |
Terminal Coalgebras for Finitary Functors (Regular Paper)
Abstract We present a collection of results that imply that an endofunctor on a category has a terminal coalgebra obtainable as a countable limit of its terminal-coalgebra sequence. This holds for finitary endofunctors preserving finite nonempty intersections on locally finitely presentable categories, assuming that the posets of strong quotients and subobjects of finitely presentable objects satisfy the descending chain condition. This allows one to adapt finiteness arguments that were originally advanced by Worrell concerning terminal coalgebras for finitary set functors. Examples include the categories of sets, posets, vector spaces, graphs, and nominal sets. A similar argument is presented for the category of metric spaces (although it is not locally finitely presentable).
Best Presentation Award
|
11:00
- | 11:30 |
A Coinductive Representation of Computable Functions (Regular Paper)
Abstract We investigate a representation of computable functions as total
functions over the set \(2^\infty\) of finite and infinite sequences
over \(\{0,1\}\). Infinite sequences are non-terminating
computations, and a finite sequence represents the sum of its
digits. We introduce a new definition principle, `function space
corecursion', that simultaneously generalises minimisation and
primitive recursion. This defines a class of `computable
corecursive functions' that is closed under composition and function
space corecursion. We show that computable corecursive functions
represent all partial recursive functions, and show that all
computable corecursive functions are indeed computable by
translating into the untyped $\lambda$-calculus. |
11:30
- | 12:00 |
Cancellative Convex Semilattices (Regular Paper)
Abstract Convex semilattices (CSL) are algebras carrying both a convex algebra (CA) structure and a semilattice structure, related by a distributivity axiom. These algebras have attracted some attention in recent years as they are suitable for modelling probability and nondeterminism, in particular by being the Eilenberg-Moore algebras of the monad of nonempty finitely-generated convex subsets of distributions.
A CSL is cancellative if the underlying CA is cancellative. Cancellative CAs were characterized by M. H. Stone in
1949, and a few years later independently by H. Kneser: a CA is cancellative if and only if it is isomorphic to a convex subset of a vector space (with canonical CA operations).
In this short paper we prove an analogous theorem for CSL: A CSL is cancellative if and only if it is isomorphic to a convex subset of a Riesz space, i.e., a lattice-ordered vector space (with canonical CSL operations). |
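For orientation: the cancellation property in question is the standard one for convex (barycentric) algebras. Stated generically (a reminder, not the paper's own formulation), a CA is cancellative when, for every $p$ strictly between $0$ and $1$,
\[
  a +_p c \;=\; b +_p c \quad\Longrightarrow\quad a = b .
\]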
12:00
- | 12:30 |
Distributive Laws of Monadic Containers (Regular Paper)
Abstract Containers are used to carve out a class of strictly positive data types in terms of shapes and positions. They can be interpreted via a fully-faithful functor into endofunctors on Set. Monadic containers are those containers whose interpretation as a Set functor carries a monad structure. The category of containers is closed under container composition and is a monoidal category, whereas monadic containers do not in general compose.
In this paper, we develop a characterisation of distributive laws of monadic containers. Distributive laws were introduced as a sufficient condition for the composition of the underlying functors of two monads to also carry a monad structure. Our development parallels Ahman and Uustalu's characterisation of distributive laws of directed containers, i.e. containers whose Set functor interpretation carries a comonad structure. Furthermore, by combining our work with theirs, we construct characterisations of mixed distributive laws (i.e. of directed containers over monadic containers and vice versa), thereby completing the `zoo' of container characterisations of (co)monads and their distributive laws.
We have found these characterisations amenable to development of existence and uniqueness proofs of distributive laws, particularly in the mechanised setting of Cubical Agda, in which most of the theory of this paper has been formalised. |
12:30
- | 14:00 |
Lunch
|
CALCO: Rewriting and Operational Semantics | |
Chair: Filippo Bonchi | |
14:00
- | 14:30 |
EGGs are adhesive! (Regular Paper)
Abstract The use of rewriting-based visual formalisms is on the rise. In the formal methods community, this is due also to the introduction of adhesive categories, where most properties of classical approaches to graph transformation, such as parallelism and confluence, can be rephrased and proved in a general and uniform way. E-graphs (or EGGs) are a formalism for program optimisation via an efficient implementation of equality saturation. In short, EGGs can be defined as (acyclic) term graphs with an additional notion of equivalence on nodes that is closed under the operators of the signature. Instead of replacing the components of a program, the optimisation step is performed by adding new components and linking them to the existing ones via an equivalence relation, until an optimal program is reached. This work describes EGGs by means of adhesive categories. Besides the benefits per se of a formal presentation, making precise the properties of the data structure, describing the addition of equivalent program components via the standard tools of graph transformation offers the advantages given by the adhesive framework in modelling e.g. concurrent updates. |
14:30
- | 14:50 |
Stateful Mathematical Operational Semantics (Early Ideas)
PDF
Abstract Reasoning about program equivalence in imperative languages is notoriously challenging, as the presence of states (in the form of variable stores) fundamentally increases the observational power of program terms. The key desideratum for any notion of equivalence is compositionality, guaranteeing that subprograms can be safely replaced by equivalent subprograms regardless of the context. To facilitate compositionality proofs and avoid boilerplate work, one would hope to employ the abstract bialgebraic methods provided by Turi and Plotkin's powerful theory of mathematical operational semantics (a.k.a. abstract GSOS) or its recent extension by Goncharov et al. to higher-order languages. However, multiple attempts to apply abstract GSOS to stateful languages have thus far failed. We propose a novel approach to the operational semantics of stateful languages based on the formal distinction between readers (terms that expect an initial input store before being executed) and writers (running terms that have already been provided with a store). In contrast to earlier work, this style of semantics is fully compatible with abstract GSOS, and we can thus leverage the existing theory to obtain coinductive reasoning techniques. We demonstrate that our approach generates non-trivial compositionality results for stateful languages with first-order and higher-order store and that it flexibly applies to program equivalences at different levels of granularity, such as trace, cost, and natural equivalence. |
14:50
- | 15:10 |
From Abstract Higher-Order GSOS to Abstract Big-Step Semantics, Abstractly (Early Ideas)
PDF
Abstract Small-step and big-step operational semantics are two fundamental styles of structural operational semantics (SOS). Small-step semantics defines a fine-grained, one-step reduction relation on programs, while big-step semantics directly relates programs to their final evaluation results with respect to the ambient evaluation strategy. Although their agreement is critical for program analysis and reasoning, this is typically proven on a case-by-case basis, lacking a general theoretical foundation.
We address this by leveraging higher-order mathematical operational semantics -- an abstract framework recently introduced as a generalization of its established first-order counterpart due to Turi and Plotkin.
Specifically, we introduce an abstract categorical notion of big-step SOS, complementing the existing (small-step) notion of abstract higher-order GSOS. We provide a general construction for deriving big-step semantics from small-step semantics, and prove an abstract equivalence between the two. |
15:10
- | 15:30 |
(Abstract) GSOS for Trace Equivalence (Early Ideas)
PDF
Abstract Abstract GSOS is a categorical framework for operational semantics, in which rules are modeled as natural transformations of a certain type. Rule systems that fit the constraints imposed by the framework are guaranteed to have desirable properties, mainly that coalgebraic behavioural equivalence, which roughly corresponds to bisimilarity, is a congruence. While bisimilarity is central in process algebra, it is far from the only notion of process equivalence of interest. However, abstract GSOS is not easily applicable to these other equivalences. This work focuses on the other extremum of the linear time-branching time spectrum (bisimilarity being the finest), namely trace equivalence. We wonder under which assumptions on abstract GSOS laws this notion of equivalence is a congruence.
A study of trace equivalence for a concrete instance of abstract GSOS (to labelled transition systems with explicit termination) identifies necessary and sufficient conditions to this end. We then transpose abstract GSOS to Kleisli semantics and investigate how, with conditions on the monad (affineness) and the GSOS law (dubbed linearity and smoothness) inspired by the concrete study, trace equivalence can be shown to be a congruence. |
15:30
- | 16:00 |
Break
|
CALCO: Probability and Logics | |
Chair: Ana Sokolova | |
16:00
- | 16:30 |
Drawing and Recolouring (Regular Paper)
Abstract Drawing a ball from an urn filled with balls of different colours is
one of the basic models in probability theory. The probability of
drawing a ball of a particular colour is determined by the proportion
/ fraction of balls of that colour. This paper introduces a new
stochastic model for such urns: draw a ball, recolour (repaint) it,
and put it back into the urn. One can distinguish four modes of
drawing-and-recolouring, namely whether done proportionally or
uniformly (both for drawing and recolouring). These modes can be
reformulated in financial terms as redistribution of wealth or in
biological terms as evolutionary drift. The resulting four operations
form a coalgebra for the distribution monad, on the set of multisets
of a fixed size. In fact they form a Markov chain and even a hidden
Markov model, in combination with the frequentist learning map as
emission channel. This paper identifies fixed points, capturing
stable situations, for these four modes of drawing-and-recolouring. |
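As a concrete illustration of the four modes, here is a rough Python sketch of a single drawing-and-recolouring step on an urn represented as a multiset of colours. The precise readings of `proportional' and `uniform' below (and the palette) are assumptions made for illustration; the paper's definitions may differ.

```python
import random
from collections import Counter

PALETTE = ["red", "green", "blue"]  # hypothetical fixed set of colours

def step(urn: Counter, draw: str, recolour: str) -> Counter:
    """One drawing-and-recolouring step on an urn of fixed size.

    Assumed readings (for illustration only):
      draw == "proportional":     pick a ball at random, i.e. a colour with
                                  probability proportional to its multiplicity;
      draw == "uniform":          pick a colour uniformly among colours present;
      recolour == "proportional": repaint with a colour drawn proportionally
                                  to the urn's current composition;
      recolour == "uniform":      repaint with a colour chosen uniformly from PALETTE.
    """
    present = [c for c in urn if urn[c] > 0]
    drawn = (random.choice(list(urn.elements())) if draw == "proportional"
             else random.choice(present))
    new_urn = urn.copy()
    new_urn[drawn] -= 1
    repaint = (random.choice(list(urn.elements())) if recolour == "proportional"
               else random.choice(PALETTE))
    new_urn[repaint] += 1
    return new_urn

# Iterating `step` gives a Markov chain on multisets of a fixed size,
# as in the abstract. Example: proportional drawing, uniform recolouring.
urn = Counter({"red": 5, "green": 3, "blue": 2})
for _ in range(100):
    urn = step(urn, draw="proportional", recolour="uniform")
print(urn)  # the urn still contains 10 balls
```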
16:30
- | 17:00 |
Safety and strong completeness via reducibility for many-valued coalgebraic dynamic logics (Regular Paper)
Abstract We present a coalgebraic framework for studying generalisations of dynamic modal logics such as PDL and game logic in which both the propositions and the semantic structures can take values in an $\mathrm{FL}_{ew}$-algebra A. More precisely, we work with coalgebraic modal logic via A-valued predicate liftings, and interpret actions (abstracting programs and games) as F-coalgebras where the functor F represents some type of A-weighted system. We also allow combinations of crisp propositions with A-weighted systems and vice versa.
We introduce coalgebra operations and tests, with a focus on operations which are reducible in the sense that modalities for composed actions can be reduced to compositions of modalities for the constituent actions. We prove that reducible operations are safe for bisimulation and behavioural equivalence, and prove a general strong completeness result, from which we obtain new strong completeness results for 2-valued iteration-free PDL with A-valued accessibility relations when A is a finite chain, and for many-valued iteration-free game logic with many-valued strategies based on finite Łukasiewicz logic. |
17:00
- | 17:20 |
Semantics for the coinductive structures in stochastic processes: Infinite product measures in quasi-Borel spaces (Early Ideas)
PDF
Abstract Stochastic processes include random functions (such as Brownian motion) and random sequences (such as Markov chains). Categorically, functions are modelled using cartesian closed categories, and sequences are modelled by final coalgebras. In this work, we investigate constructions for random sequences inside a model of random functions, quasi-Borel spaces. Although a general coinductive construction remains an open problem, we have resolved the open question of infinite product spaces. |
17:20
- | 17:40 |
Coalgebraic Analysis of Social Systems (Early Ideas)
PDF
Abstract The algebraic analysis of social systems, or algebraic social network analysis, refers to a collection of methods designed to extract information about the structure of social systems modeled as directed graphs. These methods focus on identifying the roles within a system, understood as compound relations, as well as the social positions, understood as groups of similarly connected actors. The primary focus is on pairwise relationships between social actors, modeled through directed graphs.
Higher-order relationships in social systems have received considerable attention in recent years. In this work, we use the theory of universal coalgebra to formalize these notions for graphs and extend them to models of social systems that also account for higher-order interactions between social actors, such as hypergraphs. Finally, we unify the analysis of social roles through a functoriality theorem. |
Tuesday, June 17, 2025
8:30
- | 9:00 |
Coffee
|
Chair: Corina Cîrstea | |
9:00
- | 10:00 |
Invited Talk:
Effectful Mealy Machines
Abstract We introduce effectful Mealy machines – a general notion of Mealy machine
with global effects – and give them semantics in terms of both
bisimilarity and traces. Bisimilarity of effectful Mealy machines is
characterized syntactically, via free uniform feedback. Their traces are
given a novel semantic coinductive universe in terms of effectful
streams. We prove that this framework generalizes standard causal
processes and captures existing flavours of Mealy machine, bisimilarity,
and trace.
|
10:00
- | 10:30 |
Break
|
CALCO: Graphical Models | |
Chair: Fabio Gadducci | |
10:30
- | 11:00 |
Tape Diagrams for Monoidal Monads (Regular paper)
Abstract Tape diagrams provide an intuitive graphical representation for arrows of rig categories, namely categories equipped with two monoidal structures, $\oplus$ and $\otimes$, where $\otimes$ distributes over $\oplus$. However, their applicability is limited to categories where $\oplus$ is a biproduct, i.e., both a categorical product and a coproduct. In this work, we extend tape diagrams to deal with Kleisli categories of symmetric monoidal monads, presented by algebraic theories. |
11:00
- | 11:30 |
String Diagrams for Graded Monoidal Theories, with an Application to Imprecise Probability (Regular paper)
Abstract We introduce string diagrams for graded symmetric monoidal categories. Our approach includes a definition of graded monoidal theory and the corresponding freely generated syntactic category. Also, we show how an axiomatic presentation for the graded theory may be modularly obtained from one for the grading theory and one for the base category. The Para construction on monoidal actegories is a motivating example for our framework. As a case study, we show how to axiomatise a variant of the graded category ImP, recently introduced by Liell-Cock and Staton to model imprecise probability. This culminates in a representation, as string diagrams with grading wires, of programs with primitives for nondeterministic and probabilistic choices and conditioning. |
11:30
- | 12:00 |
An Algebraic Approach to Moralisation and Triangulation of Probabilistic Graphical Models ((Co)algebraic Pearls submission)
Abstract Moralisation and triangulation are transformations that allow one to switch between different ways of factoring a probability distribution into a graphical model. Moralisation allows one to view a Bayesian network (a directed model) as a Markov network (an undirected model), whereas triangulation works in the opposite direction.
We present a categorical framework where these transformations are modelled as functors between a category of Bayesian networks and one of Markov networks. The two kinds of network (the objects of these categories) are themselves represented as functors, from a `syntax' domain to a `semantics' codomain. Notably, moralisation and triangulation are definable inductively on such syntax, and operate as a form of functor pre-composition. This approach introduces a modular, algebraic perspective in the theory of probabilistic graphical models.
|
12:00
- | 12:30 |
Pareto Fronts for Compositionally Solving String Diagrams of Parity Games (Regular paper)
Abstract Open parity games are proposed as a compositional extension of parity games with algebraic operations, forming string diagrams of parity games. A potential application of string diagrams of parity games is to describe a large parity game with a given compositional structure and to solve it efficiently by a divide-and-conquer algorithm that exploits this structure.
Building on our recent progress in open Markov decision processes, we introduce Pareto fronts of open parity games, offering a framework for multi-objective solutions. We establish the positional determinacy of open parity games with respect to their Pareto fronts through a novel translation method. Our translation converts an open parity game into a parity game tailored to a given single-objective. Furthermore, we present a simple algorithm for solving open parity games, derived from this translation that allows the application of existing efficient algorithms for parity games. Expanding on this foundation, we develop a compositional algorithm for string diagrams of parity games. |
12:30
- | 14:00 |
Lunch
|
CALCO: Hybrid and Continuous Systems | |
Chair: Jurriaan Rot | |
14:00
- | 14:30 |
Expressivity of bisimulation pseudometrics over analytic state spaces (Regular paper)
Abstract A Markov decision process is a state-based dynamical system capable of describing both nondeterministic and probabilistic behaviour. In this paper we view Markov decision processes as coalgebras living in the category of analytic spaces. Our results are twofold. First, we define bisimulation pseudometrics over such coalgebras using the framework of fibrations. Second, we develop a quantitative modal logic for such coalgebras and prove a quantitative form of the Hennessy-Milner theorem in this new setting, stating that the bisimulation pseudometric corresponds to the logical distance induced by modal formulae. |
14:30
- | 14:50 |
Graded Coalgebras of Monads for Continuous Dynamics (Early Ideas)
PDF
Abstract We argue that time-graded coalgebras of probabilistic and non-deterministic monads are suitable as coalgebraic continuous-time dynamical systems. |
14:50
- | 15:10 |
Lyapunov theory for coalgebras (Early Ideas)
PDF
Abstract Lyapunov's theorem provides a foundational characterization of the stability of dynamical systems. We present a framework for Lyapunov stability theory for coalgebras. Under certain assumptions, we develop various notions of stability of fixed points for coalgebras, and demonstrate that the existence of Lyapunov morphisms is necessary and sufficient for establishing stability. |
15:10
- | 15:30 |
Verifying Quantum Protocols via Quantum Distributions (Early Ideas)
PDF
Abstract The development of quantum communication protocols sparked interest in quantum extensions of process calculi. The prominent approaches, however, are still based on probabilistic models, which are inadequate for the quantum case, as they do not capture the observational limitations of quantum theory. Indeed, in physics, probabilistic quantum systems are represented not with probabilities but with density operators. By lifting this concept to the Computer Science setting, we introduce Operator Distributions: they employ density operators as weights, just like probability distributions use real numbers. We give semantics to quantum protocols in a natural way, as an LTS of operator distributions, guaranteeing that only physically accessible information about the systems is represented. Moreover, these "quantum" distributions pave the way for symbolic verification of protocols whose behaviour depends on an unknown initial state: instead of considering infinitely many LTSs, one for each possible initialization, we can build a single symbolic LTS and prove properties on it. |
15:30
- | 16:00 |
Break
|
CALCO: Trees and Learning | |
Chair: Harsh Beohar | |
16:00
- | 16:30 |
Active learning of upward-closed sets of words ((Co)Algebraic Pearl)
Abstract We give a new proof of a result from well quasi-order theory on the computability of bases for upwards-closed sets of words. This new proof is based on Angluin's L* algorithm, that learns an automaton from a minimally adequate teacher. This relates in particular two results from the 1980s: Angluin's L* algorithm, and a result from Valk and Jantzen on the computability of bases for upwards-closed sets of tuples of integers.
Along the way, we describe an algorithm for learning quasi-ordered automata from a minimally adequate teacher, and extend a generalization of Valk and Jantzen's result, encompassing both words and integers, to finitely generated monoids. |
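To illustrate the objects being learned, the sketch below checks membership in the upward closure of a finite basis of words, assuming the scattered-subword ordering (the usual well-quasi-order on words over a finite alphabet); the ordering actually used in the paper may be more general.

```python
def is_subword(u: str, w: str) -> bool:
    """True iff u embeds into w as a (scattered) subword, preserving order."""
    it = iter(w)
    return all(ch in it for ch in u)

def in_upward_closure(w: str, basis: list[str]) -> bool:
    """True iff w lies in the upward-closed set generated by the finite basis."""
    return any(is_subword(b, w) for b in basis)

# Example: "cb" embeds into "abcab", so the word is in the closure of {"aa", "cb"}.
print(in_upward_closure("abcab", ["aa", "cb"]))  # True
```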
16:30
- | 17:00 |
Trees in Coalgebra from Generalized Reachability ((Co)algebraic pearl)
Abstract A state-based system is reachable if every state is reachable from the initial state. This notion has been generalized coalgebraically in two ways: first via a universal property on pointed coalgebras (namely, that a reachable coalgebra has no proper subcoalgebra), and second by iteratively computing successor states.
In the current paper, we present corresponding universal properties and iterative constructions for trees. The universal property captures when a coalgebra is a tree, namely when it has no proper tree unravelling. The iterative construction unravels an arbitrary coalgebra to a tree. We show that this yields the expected notion of tree for a variety of standard examples.
Surprisingly, both the universal property and the iterative construction turn out to be instances of a generalization of the previous theory of reachable coalgebras.
Best Paper Award
|
17:00
- | 17:20 |
A Categorical Framework for Testing Generalised Tree Automata (Early Ideas)
PDF
Abstract Conformance testing of automata is about checking the equivalence of a known specification and a black-box implementation. An important notion in conformance testing is that of a complete test suite, which guarantees that if an implementation satisfying certain conditions passes all tests, then it is equivalent to the specification.
In recent work (FoSSaCS 2025), we introduced a framework for proving completeness of test suites at the abstract level of automata in monoidal closed categories, and provided a generalisation of a classical conformance testing technique, the W-method. Our framework considered automata with a particularly simple transition structure analogous to that of DFAs. As a continuation of this line of work, we present a generalisation of this result for proving completeness of test suites for automata whose transition structure is described by an arbitrary endofunctor satisfying certain conditions. As a special case, we obtain an analogue of the W-method for all adjoint automata, as well as tree automata, which were not covered by the previous framework. |
17:20
- | 17:40 |
Kan and Grothendieck Commute Together (Early Ideas)
PDF
Abstract This abstract starts with a motivating question: how can a free category be computed on a decomposition of graphs? As free categories may represent either languages on abstract machines (in the internal case) or a class of computational problems called the algebraic path problem (in the enriched case) the answer to our motivating question promises to give insight into the compositionality of a wide class of problems. We answer our question through generalization: there is a square of functors expressing the commutation of generalized Grothendieck constructions and left Kan extensions which specializes to the case of free categories on decompositions of graphs. |
18:30
- | 20:00 |
Lord Provost and Glasgow City Council’s Civic Reception
|
Wednesday, June 18, 2025
8:30
- | 9:00 |
Registration & Coffee
|
Chair: Stefan Milius | |
9:00
- | 10:00 |
TL325
Joint Invited Talk:
Differentiating Functional Reactive Programming
|
10:00
- | 10:30 |
Break
|
Joint Special Session on Quantitative Semantics TL325 | |
Chair: Ugo Dal Lago | |
10:30
- | 10:35 |
Opening Remarks
|
10:35
- | 11:15 |
Up-to Techniques for Behavioural Metrics
Abstract When analyzing the behavior of state-based systems that incorporate quantitative information, such as probabilities, bisimulations are usually replaced by behavioural metrics. In this talk, I first explain how behavioural metrics between systems modeled as coalgebras can be understood as coinductive predicates in a fibration. I then focus on how to compute such metrics more efficiently, using the so called up-to techniques. These provide a well-known method for enhancing coinductive proofs of behavioural equivalences. I discuss an extension of up-to techniques for behavioural metrics and provide abstract results to prove their soundness in a compositional way.
|
11:15
- | 11:55 |
Overview on Quantitative Algebraic and Logical Reasoning
Abstract This talk is meant to be a tutorial on current research in the field of Quantitative Equational Logic and Quantitative Algebras. The intention is to cover the main concepts and constructions, the relevant examples, and the variety and quasivariety theorems, as well as the extensions of the theory to fixpoint theories. The talk will also describe recent directions and future challenges. |
11:55
- | 12:35 |
On Generalized Metrics for Higher-Order Languages
Abstract The pervasive role of probabilistic and approximate programming paradigms has motivated the search for semantic approaches not focused on the familiar notion of program equivalence (do two programs compute the same function?), but rather on some notion of program similarity (are the values produced by the two programs close enough?). In these approaches programs are typically interpreted in categories of metric spaces and Lipschitz-continuous functions. However, interpreting higher-order languages in this setting may be challenging. In fact, while the category Met provides a natural and elegant model for linear lambda-calculi like FUZZ, it cannot model standard (i.e. non-linear) typed lambda-calculi, since it is not cartesian closed.
In this talk I will illustrate a few strategies to construct metric models of typed and untyped lambda-calculi, either by restricting to well-behaved classes of metric spaces, by relaxing the required Lipschitz conditions, or by considering alternative formulations of the usual metric axioms (as in Matthews' partial metric spaces).
|
12:35
- | 14:00 |
Lunch
|
MFPS TL325 | |
Chair: Alex Simpson | |
14:00
- | 14:25 |
Compact Quantitative Theories of Convex Algebras
PDF
Abstract We introduce the concept of compact quantitative equational theory.
A quantitative equational theory is defined to be compact if all its consequences are derivable by means of finite proofs. We prove that the theory of interpolative barycentric (also known as convex) quantitative algebras of Mardare et al. is compact. This serves as a paradigmatic example, used to derive other compact quantitative equational theories of convex algebras corresponding to distances on finitely supported probability distributions. |
14:25
- | 14:50 |
A fresh look at Bivariate Binomial Distributions
PDF
Abstract Binomial distributions capture the probabilities of `heads' outcomes
when a (biased) coin is tossed multiple times. The coin may be
identified with a distribution on the two-element set $\{0,1\}$, where
the $1$ outcome corresponds to `head'. One can also toss two separate
coins, with different biases, in parallel and record the
outcomes. This paper investigates a slightly different `bivariate'
binomial distribution, where the two coins are dependent (also called:
entangled, or entwined): the two coins together are given by a distribution on the product
$\{0,1\} \times \{0,1\}$. This bivariate binomial exists in the
literature, with complicated formulations. Here we use the language of
category theory to give a new succinct formulation. This paper
investigates, also in categorically inspired form, basic properties of
these bivariate distributions, including their mean, variance and
covariance, and their behaviour under convolution and under updating,
in Laplace's rule of succession. Furthermore, it is shown how
Expectation Maximisation works for these bivariate binomials, so that
mixtures of bivariate binomials can be recognised in data. This paper
concentrates on the bivariate case, but the binomial distributions may
be generalised to the multivariate case, with multiple dimensions, in
a straightforward manner.
|
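For concreteness, here is a small Python computation of a bivariate binomial in the sense sketched above: toss a dependent pair of coins, given by a joint distribution on $\{0,1\} \times \{0,1\}$, n times, and record how many heads each coordinate shows. The function name and encoding are illustrative assumptions, not the paper's notation.

```python
from math import comb
from collections import defaultdict

def bivariate_binomial(joint, n):
    """Distribution of (heads of coin 1, heads of coin 2) after n independent
    tosses of a dependent pair of coins with joint distribution `joint`
    on {0,1} x {0,1} (keys are pairs (a, b), values sum to 1)."""
    out = defaultdict(float)
    # Enumerate how often each joint outcome (a, b) occurs among the n tosses.
    for n00 in range(n + 1):
        for n01 in range(n + 1 - n00):
            for n10 in range(n + 1 - n00 - n01):
                n11 = n - n00 - n01 - n10
                multinom = comb(n, n00) * comb(n - n00, n01) * comb(n - n00 - n01, n10)
                prob = (joint[(0, 0)] ** n00 * joint[(0, 1)] ** n01
                        * joint[(1, 0)] ** n10 * joint[(1, 1)] ** n11)
                # Coin 1 shows heads in the (1,0) and (1,1) tosses; coin 2 in (0,1) and (1,1).
                out[(n10 + n11, n01 + n11)] += multinom * prob
    return dict(out)

# Example with two entwined coins whose heads tend to occur together.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
dist = bivariate_binomial(joint, n=3)
print(sum(dist.values()))  # ~1.0
```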
14:50
- | 15:15 |
Compositional Inference for Bayesian Networks and Causality
PDF
Abstract Inference is a fundamental reasoning technique in probability theory.
When applied to a large joint distribution, it involves updating with
evidence (conditioning) in one or more components (variables) and
computing the outcome in other components. When the joint distribution
is represented by a Bayesian network, the network structure may be
exploited to proceed in a compositional manner --- with great
benefits. However, the main challenge is that updating involves
(re)normalisation, making it an operation that interacts badly with
other operations.
String diagrams are becoming popular as a graphical technique for
probabilistic (and quantum) reasoning. Conditioning has appeared in
string diagrams, in terms of a disintegration, using bent wires and
shaded (or dashed) normalisation boxes. It has become clear that such
normalisation boxes do satisfy certain compositional rules. This paper
takes a decisive step in this development by adding a removal rule to
the formalism, for the deletion of shaded boxes. Via this removal rule
one can get rid of shaded boxes and terminate an inference argument.
This paper demonstrates via many (graphical) examples how the resulting
compositional inference technique can be used for Bayesian networks,
causal reasoning and counterfactuals. |
15:15
- | 15:40 |
Order in Partial Markov Categories
PDF
Abstract Partial Markov categories are a recent framework for categorical probability theory, providing an abstract account of partial probabilistic computation. In this article, we discuss order relations on the morphisms of a partial Markov category. In
particular, we prove that every partial Markov category is canonically enriched over the category of preordered sets and monotone maps. Our construction recovers several
well-known order enrichments. We relate our enriching order to the normalisation order on `self-normalising' maps, and demonstrate how comparators relate to the existence of least conditionals for the normalisation order. Finally, we propose a synthetic version of the Cauchy--Schwarz inequality to facilitate inequational reasoning in partial Markov categories. We apply this new axiom to prove, synthetically, that updating a prior distribution with an evidence predicate increases the likelihood of the evidence in the posterior. |
15:40
- | 16:05 |
Strong Dinatural Transformations and Generalised Codensity Monads
PDF
Abstract We generalise pointwise codensity monads generated by functors to monads generated by mixed-variant bifunctors. Our construction is based on the notion of strong dinaturality (aka Barr dinaturality), and is inspired by denotational models of certain types in polymorphic lambda calculi - in particular, a form of continuation monads with universally quantified variables, such as the Church encoding of the list monad in System F. Extending our previous results on Cayley-style representations, we provide a set of sufficient conditions to establish an isomorphism between a monad and an instance of our construction. Then, we focus on the class of monads obtained by instantiating our construction with hom-functors and, more generally, bifunctors given by objects of homomorphisms, that is, internalised hom-sets between Eilenberg-Moore algebras. This gives us, for example, novel presentations of monads generated by different kinds of (semi)rings and other theories used to model nondeterministic computation. |
16:05
- | 16:30 |
Break
|
16:30
- | 23:59 |
New Lanark
Excursion and Dinner |
Thursday, June 19, 2025
9:00
- | 17:35 |
SW105
On Thursday, we are in the room SW105 in the Stenhouse Wing of the
Strathclyde Business School (see this map for
the route)
|
8:30
- | 9:00 |
Coffee
|
Chair: Bart Jacobs | |
9:00
- | 10:00 |
SW105
Invited Talk:
Free Quantum Computing
Abstract Quantum computing improves substantially on known classical algorithms
for various important problems, but the nature of the relationship
between quantum and classical computing is not yet fully understood. This
relationship can be clarified by free models, that add to classical
computing just enough physical principles to represent quantum computing
and no more. I will discuss an axiomatisation of quantum computing that
replaces the standard continuous postulates with a small number of
discrete equations, as well as a free model that replaces the standard
linear-algebraic model with a category-theoretical one. The axioms and
model are based on reversible classical computing, and isolate quantum
advantage in the ability to take certain well-behaved square roots. The
free model may be interpreted as a programming language for quantum
computers, that has the same expressivity and computational universality
as the standard model, but additionally allows automated verification and
reasoning.
|
10:00
- | 10:30 |
Break
|
Special Session on Mathematics of Natural Language SW105 | |
Chair: Mehrnoosh Sadrzadeh | |
10:30
- | 10:45 |
Opening Remarks
|
10:45
- | 11:20 |
Judicious Incoherence: Probabilistic Modelling Without A Joint Distribution
Abstract There are many processes of language understanding that can be naturally
related to each other in terms of Bayesian inference: for example, judging
whether a description matches a scene, versus imagining a scene based on a
description of it. However, it is known that Bayesian inference is
intractable (in fact, #P-complete), even in seemingly simple settings.
Assuming that a human mind cannot solve intractable problems in a split
second, such cognitive processes cannot be coherent, in a strict Bayesian
sense. I will sketch an approach to modelling cognitive processes as
incoherent conditional distributions, without a joint distribution.
Nonetheless, we can view coherence as an ideal (even if unreachable), and
consider how far we must deviate from it. This allows us to explain some
apparent inconsistencies in human behaviour as judiciously incoherent given
our minds' computational constraints.
|
11:20
- | 11:55 |
DisCoCirc: Theory and Experiments
Abstract Natural language processing (NLP) has seen a huge boom with the success of large language models; however, a common issue raised is their black-box nature: when something goes wrong, we typically don't understand why.
DisCoCirc is a model that aims to provide an easier path towards interpretability by leveraging compositionality. This leads to two key notions: compositional generalisation - the capacity for the model to be trained on small instances of a problem, and then generalise to larger instances at test time - as well as compositional interpretability, by inspecting how the individual components of the model interact with each other in order to gain an understanding of the behaviour of the model as a whole. For the first half of the talk, we will lay out the theoretical groundwork underpinning the DisCoCirc formalism, in relation to both compositionality, and consequently interpretability. The second half will provide a picture of the current state of research relating to DisCoCirc, including recent results and experiments. |
11:55
- | 12:30 |
Extracting structure from an LLM - how to improve on surprisal-based models of Human Language Processing
Abstract Prediction and reanalysis are considered two key processes that underlie humans’ capacity to comprehend language in real time. Computational models capture this capacity using Large Language Models (LLMs) and a statistical measure known as ‘surprisal’. Despite the successes of LLMs, surprisal-based models face challenges when it comes to sentences requiring reanalysis due to pervasive temporary structural ambiguities, such as garden path sentences. We ask whether structural information can be extracted from LLMs and develop a model that integrates it with their learnt statistics. When applied to a dataset of garden path sentences, the model achieved a significantly higher correlation with human reading times than surprisal. It also provided a better prediction of the garden path effect, could distinguish between sentence types with different levels of difficulty, and could produce disambiguation
patterns for unseen phrases.
|
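For readers unfamiliar with the measure mentioned in the abstract, surprisal is standardly defined as the negative log-probability a language model assigns to a word given its preceding context (this is the textbook definition, not a detail specific to this talk):
\[
  s(w_i) \;=\; -\log P\bigl(w_i \mid w_1 \cdots w_{i-1}\bigr).
\]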
12:30
- | 14:00 |
Lunch
|
Chair: Clemens Kupke | |
14:00
- | 15:00 |
SW105
Invited Talk:
Behavioural Apartness of State-Based Systems
Abstract Behavioural equivalences such as bisimilarity have been widely studied,
for a variety of state-based systems such as transition systems and
automata. In this talk I will focus on the formal dual of bisimilarity,
referred to as apartness by Geuvers and Jacobs (2022), inspired by
constructive mathematics. Where bisimilarity can be viewed as a
coinductive characterisation of behavioural equivalence, apartness
instead is an inductive characterisation of differences in behaviour of
systems. Indeed, this form of apartness allows the derivation of explicit
proofs of differences. In the talk, I will describe the basic notions of apartness of state-based systems through several examples, discuss connections to distinguishing formulas in modal logic and to bisimilarity games, and finally discuss ongoing work on quantitative notions of behavioural apartness (the latter is joint work with Ruben Turkenburg, Harsh Beohar, Franck van Breugel and Clemens Kupke). |
15:00
- | 15:30 |
Break
|
MFPS SW105 | |
Chair: Elena Di Lavore | |
15:30
- | 15:55 |
Unambiguous Acceptance of Thin Coalgebras
PDF
Abstract Automata admitting at most one accepting run per structure, known as unambiguous automata, find applications in verification of reactive systems as they extend the class of deterministic automata whilst maintaining some of their desirable properties. In this paper, we generalise a classical construction of unambiguous automata from thin trees to thin coalgebras for analytic functors. This achieves two goals: extending the existing construction to a larger class of structures, and providing conceptual clarity and parametricity to the construction by formalising it in the coalgebraic framework. As part of the construction, we link automaton acceptance of languages of thin coalgebras to language recognition via so-called coherent algebras, which were previously introduced for studying thin coalgebras. This link also allows us to establish an automata-theoretic characterisation of languages recognised by finite coherent algebras. |
15:55
- | 16:20 |
Traces via Strategies in Two-Player Games
PDF
Abstract Traces are a coarse notion of semantic equivalence between states of a process, and have been studied coalgebraically for
various types of system. We instantiate the framework of finitary coalgebraic trace semantics of Hasuo et al. to controller-
versus-environment games, for both nondeterministic and probabilistic environments. While our choice of monads is guided
by the requirements of this abstract framework, the suitable monads allow us to recover familiar game-theoretic concepts.
Concretely, we show that in these games, each element in the trace map corresponds to a collection (a subset or distribution)
of plays the controller can force. Furthermore, each element can be seen as the outcome of following a controller strategy.
Our results are parametrised by a weak distributive law, which computes what the controller can force in one step. |
16:20
- | 16:45 |
Continuation Semantics for Fixpoint Modal Logic and Computation Tree Logics
PDF
Abstract We introduce continuation semantics for both fixpoint modal logic (FML) and Computation Tree Logic* (CTL*), parameterised by a choice of branching type and quantitative predicate lifting.
Our main contribution is proving that they are equivalent to coalgebraic semantics, for all branching types.
Our continuation semantics is defined over coalgebras of the continuation monad whose answer type coincides with the domain of truth values of the formulas.
By identifying predicates and continuations, such a coalgebra has a canonical interpretation of the modality by evaluation of continuations.
We show that this continuation semantics is equivalent to the coalgebraic semantics for fixpoint modal logic.
We then reformulate the current construction for coalgebraic models of CTL*.
These models are usually required to have an infinitary trace/maximal execution map, characterized as the greatest fixpoint of a special operator.
Instead, we allow coalgebraic models of CTL* to employ non-maximal fixpoints, which we call execution maps.
Under this reformulation, we establish a general result on transferring execution maps via monad morphisms.
From this result, we obtain that continuation semantics is equivalent to the coalgebraic semantics for CTL*.
We also identify a sufficient condition under which CTL can be encoded into fixpoint modal logic under continuation semantics. |
16:45
- | 17:10 |
Cyclic Proofs in Hoare Logic and its Reverse
PDF
Abstract We examine the relationships between axiomatic and cyclic proof systems for the partial and total versions of Hoare logic and those of its dual, known as reverse Hoare logic (or sometimes incorrectness logic).
In the axiomatic proof systems for these logics, the proof rules for looping constructs involve an explicit loop invariant, which in the case of the total versions additionally require a well-founded termination measure. In the cyclic systems, these are replaced by rules that simply unroll the loops, together with a principle allowing the formation of cycles in the proof, subject to a global soundness condition that ensures the well-foundedness of the circular reasoning. Interestingly, the cyclic soundness conditions for partial Hoare logic and its reverse are similar and essentially coinductive in character, while those for the total versions are also similar and essentially inductive. We show that these cyclic systems are sound, by direct argument, and relatively complete, by translation from axiomatic to cyclic proofs. |
17:10
- | 17:35 |
Reversible computations are computations
PDF
Abstract Causality serves as an abstract notion of time for concurrent systems. A computation is causal, or simply valid, if each observation of a computation event is preceded by the observation of its causes. The present work establishes that this simple requirement is equally relevant when the occurrence of an event is invertible. We propose a conservative extension of causal models for concurrency that accommodates reversible computations. We first model reversible computations using a symmetric residuation operation in the general model of configuration structures. We show that stable configuration structures, which correspond to prime algebraic domains, remain stable under the action of this residuation. We then derive a semantics of reversible computations for prime event structures, which is shown to coincide with a switch operation that dualizes conflict and causality. |
Friday, June 20, 2025
9:00
- | 17:45 |
The conference returns to room TL325 (the same room as on Wednesday, in the Learning & Teaching
Building)
|
8:30
- | 9:00 |
Coffee
|
Chair: Elaine Pimentel | |
9:00
- | 10:00 |
TL325
Invited Talk:
Observational Equivalence by Means of Intersection Types
Abstract Observational equivalence (also known as contextual equivalence)
identifies two programs as equivalent when no context can distinguish
between them based on their observable behavior. More precisely, two
programs A and B are observationally equivalent if, for every possible
context C, the observable outcomes of C[A] and C[B] are
indistinguishable. This talk revisits classical results on observational equivalence in the call-by-name setting and presents new contributions in the call-by-value paradigm. The approach relies on intersection types, a powerful and flexible typing discipline that allows a single program to be assigned multiple types simultaneously, each one reflecting a different aspect of the term’s behavior along an evaluation path. |
10:00
- | 10:30 |
Break
|
Special Session on Types and the Extraction of Correct Programs TL325 | |
Chair: Ulrich Berger | |
10:30
- | 11:10 |
Verified Exact Real Computation using cAERN
Abstract Exact real computation (ERC) offers a real number data type whose semantics matches real numbers as used in mathematics and physics, hiding from the programmer issues of accuracy loss due to rounding, making it easier to reason about the correctness of such programs. It is still desirable to formally verify ERC programs as they tend to be complex in subtle ways. Firstly, the representations of exact real numbers are either infinite objects or functions, and care is needed to avoid programs using excessive resources. Secondly, some ERC operations are partial (comparison, limit), and tasks often require non-deterministic branching. Our work focuses on the latter area of complexity with the help of computable analysis. cAERN supports writing partial and nondeterministic ERC programs in Rocq, stating and verifying the functional properties, and extracting reasonably efficient Haskell programs that rely on the Haskell ERC library AERN. cAERN does not define or verify a concrete datatype of real numbers. Instead, it axiomatizes such a datatype as a computable version of an Archimedean ordered complete field and trusts the AERN real-number representation and its basic operations. I will present key cAERN concepts, illustrate them using examples, and compare cAERN with other approaches to verified ERC. This is joint work with Holger Thies and Sewon Park. |
11:10
- | 11:50 |
Exponentiation of Ordinals in Homotopy Type Theory
Abstract An important aspect of classical ordinals is their arithmetic theory,
where addition, multiplication and exponentiation are generalised from
natural numbers to the transfinite. Addition and multiplication can be
defined using well-known and constructively non-problematic
constructions, but exponentiation is usually defined using case
distinctions on whether the exponent is zero, a successor or a limit -- a
definitional principle which is not available constructively. In this
talk, I will instead present two seemingly different constructive
definitions of ordinal exponentiation in the setting of homotopy type
theory. The first is abstract, uses suprema of ordinals, and is solely
motivated by the expected equations. The second is more concrete, based
on decreasing lists, and can be seen as a constructive version of a
classical construction by Sierpiński based on functions with finite
support. A key result is that the two approaches are equivalent (whenever
it makes sense to ask the question), and we can use this equivalence to
prove algebraic laws and decidability properties of the exponential. We
have formalised all our results in Agda. This talk is based on material
that will appear at LICS 2025, and is joint work with Tom de Jong,
Nicolai Kraus, and Chuangjie Xu.
|
11:50
- | 12:30 |
Extracting concurrent programs
Abstract We present a logical system CFP (Concurrent Fixed Point Logic) supporting
the extraction of provably correct programs that permit a limited form of
concurrency (two threads attempting concurrently to modify the same memory
cell). We discuss two applications in the area of exact real number
computation: Conversion of infinite Gray code to signed digit
representation, and Gaussian elimination. In both cases the extracted
programs are nondeterministic. A noteworthy feature of CFP is the fact that
its proof rules include a variant of the classical law of excluded middle.
This is joint work with Hideki Tsuiki.
|
12:40
- | 14:00 |
Lunch
|
MFPS TL325 | |
Chair: Fredrik Nordvall Forsberg | |
14:00
- | 14:25 |
Polynomial Universes in Homotopy Type Theory
PDF
Abstract Awodey, later with Newstead, showed how polynomial functors with extra structure (termed "natural models") hold within them the categorical semantics for dependent type theory. Their work presented these ideas clearly but ultimately led them outside of the usual category of polynomial functors to a particular tricategory of polynomials in order to explain all of the structure possessed by such models. This paper builds off that work—explicating the categorical semantics of dependent type theory by axiomatizing them entirely in terms of the usual category of polynomial functors. In order to handle the higher-categorical coherences required for such an explanation, we work with polynomial functors in the language of Homotopy Type Theory (HoTT), which allows for higher-dimensional structures to be expressed purely within this category. The move to HoTT moreover enables us to express a key additional condition on polynomial functors—univalence—which is sufficient to guarantee that models of type theory expressed as univalent polynomials satisfy all higher coherences of their corresponding algebraic structures, purely in virtue of being closed under the usual constructors of dependent type theory. We call polynomial functors satisfying this condition polynomial universes. As an example of the simplification to the theory of natural models this enables, we highlight the fact that a polynomial universe being closed under dependent product types implies the existence of a distributive law of monads, which witnesses the usual distributivity of dependent products over dependent sums. |
14:25
- | 14:50 |
Initial Algebras of Domains via Quotient Inductive-Inductive Types
PDF
Abstract Domain theory has been developed as a mathematical theory of
computation and to give a denotational semantics to programming
languages. It helps us to fix the meaning of language concepts, to
understand how programs behave and to reason about programs. At the
same time it serves as a great theory to model various algebraic
effects such as non-determinism, partial functions, side effects and
numerous other forms of computation.
In the present paper, we present a general framework to construct
algebraic effects in domain theory, where our domains are DCPOs:
directed complete partial orders. We first describe so called "DCPO
algebras" for a signature, where the signature specifies the
operations on the DCPO and the inequational theory they obey. This
provides a method to represent various algebraic effects, like
partiality. We then show that initial DCPO algebras exist by defining
them as so called "Quotient Inductive-Inductive Types" (QIITs), known
from homotopy type theory. A quotient inductive-inductive type allows
one to simultaneously define an inductive type and an inductive
relation on that type, together with equations on the type. We
illustrate our approach by showing that several well-known
constructions of DCPOs fit our framework: coalesced sums, smash
products and free DCPOs (partiality and power domains). Our work
makes use of various features of homotopy type theory and is
formalized in Cubical Agda. |
14:50
- | 15:15 |
Strong normalization through idempotent intersection types: a new syntactical approach
PDF
Abstract It is well-known that intersection type assignment systems can be used to characterize strong normalization (SN). Typical proofs that typable lambda-terms are SN in these systems rely on semantical techniques. In this work, we study $\Lambda_\cap^e$, a variant of Coppo and Dezani's (Curry-style) intersection type system, and we propose a syntactical proof of strong normalization for it. We first design $\Lambda_\cap^i$, a Church-style version, in which terms closely correspond to typing derivations. Then we prove that typability in $\Lambda_\cap^i$ implies SN through a measure that, given a term, produces a natural number that decreases along with reduction. Finally, the result is extended to $\Lambda_\cap^e$, since the two systems simulate each other. |
15:15
- | 15:40 |
A Proof-Theoretic Approach to the Semantics of Classical Linear Logic
PDF
Abstract Linear logic (LL) is a resource-aware, abstract logic programming language that refines both classical and intuitionistic logic.
The semantics of linear logic is typically presented in one of two ways: by associating each formula with the set of all contexts that can be used to prove it (e.g. phase semantics) or by assigning meaning directly to proofs (e.g. coherence spaces).
This work proposes a different approach by adopting a proof-theoretic perspective.
More specifically, we employ base-extension semantics (BeS) to characterise proofs through the notion of base support.
Recent developments have shown that BeS is powerful enough to capture proof-theoretic notions in structurally rich logics such as intuitionistic linear logic. In this paper, we extend this framework to the classical case, presenting a proof-theoretic approach to the semantics of the multiplicative-additive fragment of linear logic (MALL). |
15:40
- | 16:05 |
The Functional Machine Calculus III: Control
PDF
Abstract The Functional Machine Calculus (Heijltjes 2022) extends the lambda-calculus with computational effects while preserving the key features of confluent reduction and typed termination. The first instalment included global higher-order store, input/output, probabilities, and non-determinism, as well as imperative sequencing. The present paper extends the calculus with constructs for control flow: conditionals, exception handling, and iteration, as well as constants and algebraic data types.
The calculus is defined through a simple operational semantics in the form of a stack machine with multiple operand stacks and a continuation stack, extending the (simplified) Krivine machine for the lambda-calculus. It features a confluent reduction relation, extending beta-reduction, and a system of simple types that guarantees termination of the machine and strong normalization (without iteration). |
16:05
- | 16:30 |
Break
|
MFPS TL325 | |
Chair: Henning Urbat | |
16:30
- | 16:55 |
Safety, Relative Tightness and the Probabilistic Frame Rule
PDF
Abstract Probabilistic separation logic offers an approach to reasoning about imperative probabilistic programs in which a separating conjunction is used as a mechanism for expressing independence properties. Crucial to the effectiveness of the formalism is the frame rule, which enables modular reasoning about independent probabilistic state. We explore a semantic formulation of probabilistic separation logic in which the frame rule has the same simple formulation as in separation logic, without further side conditions. This is achieved by building a notion of safety into specifications, using which we establish a crucial property of specifications, called relative tightness, from which the soundness of the frame rule follows. |
16:55
- | 17:20 |
Functoriality of Enriched Data Types
PDF
Abstract In previous work, categories of algebras of endofunctors were shown to be enriched in categories of coalgebras of the same endofunctor, and the extra structure of that enrichment was used to define a generalization of inductive data types. These generalized inductive data types are parametrized by a coalgebra C, so we call them C-inductive data types; we call the morphisms induced by their universal property C-inductive functions.
We extend that work by incorporating natural transformations into the theory: given a suitable natural transformation between endofunctors, we show that this induces enriched functors between their categories of algebras which preserve C-inductive data types and C-inductive functions.
Such C-inductive data types are often finite versions of the corresponding inductive data type, and we show how our framework can extend classical initial algebra semantics to these types. For instance, we show that our theory naturally produces partially inductive functions on lists, changes in list element types, and tree pruning functions. |
17:20
- | 17:45 |
Modular abstract syntax trees (MAST): substitution tensors with second-class sorts
PDF
Abstract We adapt Fiore, Plotkin, and Turi's treatment of abstract syntax with
binding, substitution, and holes to account for languages with
second-class sorts. These situations include programming calculi such
as the Call-by-Value lambda-calculus (CBV) and Levy's
Call-by-Push-Value (CBPV). Prohibiting second-class sorts from
appearing in variable contexts means the presheaf of variables is no
longer a left unit for Fiore et al.'s substitution tensor product. We
generalise their development to associative and right-unital
skew monoidal categories. We reuse much of the development
through skew bicategorical arguments. We apply the resulting theory by
proving substitution lemmata for varieties of CBV modularly. |