
Coinduction for Verification and Certification

Periodic Reporting for period 4 - CoVeCe (Coinduction for Verification and Certification)

Reporting period: 2020-10-01 to 2021-09-30

In critical application domains like aeronautics, remote surgery
systems, or cryptography, one needs to avoid bugs at all costs.

Formal methods like verification provide automatic means of finding
some of these bugs. Certification, using proof assistants like Coq or
Isabelle/HOL, makes it possible to guarantee the absence of bugs (up to
a certain point).

These two kinds of tools are crucial for designing safer programs
and machines. Unfortunately, state-of-the-art tools are not yet
satisfactory. Verification tools often face state-explosion problems
and require more efficient algorithms; certification tools need more
automation: they currently require too much time and expertise, even
for basic tasks that could be handled easily through verification.

In this project, we look for new techniques, new algorithms, or new
tools, that will make it possible to verify and/or certify complex
systems that are currently out of reach.

Our approach is based on mathematics: we study mathematical structures
that make it possible to reason about programs, and we look for
algorithms that make it possible to analyse those structures
efficiently. We use several tools from mathematics and computer
science: abstract coinduction, universal coalgebra, proof theory,
graph theory.
Kleene algebras are at the heart of the project: these algebraic
structures make it possible to represent programs abstractly, and
their equational theory can be decided using coinductive automata
algorithms. This is how we obtained automation tactics for program
verification in the Coq proof assistant. A substantial part of our
research is about understanding extensions of these structures.
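The coinductive style of decision procedure alluded to above can be illustrated concretely. Below is a minimal sketch in Python, with an encoding of our own choosing (the project's actual tactics are Coq implementations): regular-expression equivalence is decided via Brzozowski derivatives, by growing a candidate bisimulation and failing as soon as two reachable derivatives disagree on acceptance of the empty word.

```python
# Regular expressions as nested tuples (our own hypothetical encoding):
#   ('0',) empty, ('1',) epsilon, ('c', a) letter a,
#   ('+', e, f) union, ('.', e, f) concatenation, ('*', e) star.

def plus(e, f):
    # Smart constructor: just enough simplification to keep derivative
    # sets finite on simple examples (full ACI normalisation in general).
    if e == ('0',): return f
    if f == ('0',): return e
    if e == f: return e
    return ('+', e, f)

def seq(e, f):
    if e == ('0',) or f == ('0',): return ('0',)
    if e == ('1',): return f
    if f == ('1',): return e
    return ('.', e, f)

def nullable(e):
    # Does the language of e contain the empty word?
    t = e[0]
    if t in ('1', '*'): return True
    if t == '+': return nullable(e[1]) or nullable(e[2])
    if t == '.': return nullable(e[1]) and nullable(e[2])
    return False  # '0' and letters

def deriv(e, a):
    # Brzozowski derivative: L(deriv(e, a)) = { w | a.w in L(e) }.
    t = e[0]
    if t in ('0', '1'): return ('0',)
    if t == 'c': return ('1',) if e[1] == a else ('0',)
    if t == '+': return plus(deriv(e[1], a), deriv(e[2], a))
    if t == '.':
        d = seq(deriv(e[1], a), e[2])
        return plus(d, deriv(e[2], a)) if nullable(e[1]) else d
    return seq(deriv(e[1], a), e)  # '*'

def equivalent(e, f, alphabet):
    # Coinductive loop: grow a candidate bisimulation until it closes;
    # fail as soon as a reachable pair disagrees on the empty word.
    todo, seen = [(e, f)], set()
    while todo:
        x, y = todo.pop()
        if (x, y) in seen:
            continue
        if nullable(x) != nullable(y):
            return False
        seen.add((x, y))
        for a in alphabet:
            todo.append((deriv(x, a), deriv(y, a)))
    return True
```

For instance, this loop proves the classical identity (a+b)* = (a*b*)* by closing a small bisimulation. The up-to techniques studied in the project prune the `seen` relation much more aggressively, which is what makes the corresponding Coq tactics practical.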

We obtained several new results in that direction.
- First, we obtained decidability of allegories, a finitely based
equational theory accounting for the most common operations on binary
relations: union, intersection, composition, and transpose. We used
graph-theoretic techniques to solve this long-standing open problem.
- Second, we obtained a completeness proof for a form of Kleene algebra
with intersection. There, it is the addition of the intersection
operation that is challenging: from the semantic point of view, it
requires us to move from languages of words, which enjoy a solid
tradition in computer science, to languages of graphs, whose theory is
much less developed.
- Third, we managed to use tools from logic (more precisely, linear
logic and cyclic proof theory) in order to characterise residuated
Kleene algebras. This led us to a new completeness proof for
left-handed Kleene algebra, and we hope to exploit these ideas in the
future to obtain new decidability/completeness results.

Still in the domain of Kleene and relation algebra, we proved some
negative results: some natural fragments of relation algebra cannot be
finitely axiomatised, and some extensions of Kleene algebra are
undecidable. For the latter, we actually defined a framework for
Kleene algebra with hypotheses, which makes it possible to easily study
whether certain extensions are decidable/complete, and to proceed
modularly with positive instances.

Unexpectedly, by studying cyclic linear proofs for Kleene algebra, we
discovered that these objects had a very nice computational content,
and we used them to obtain "implicit complexity results". Following
the Curry-Howard tradition, proofs can be seen as programs, and logics
as programming languages. With a single proof system, we characterised
the classes of regular languages, logspace languages, primitive
recursive functions, and Peano definable functions, just by
restricting (or not), the uses of the cut and contraction rules.

We also worked on techniques for the long-standing problem of
reasoning about systems combining non-determinism and probabilities.
We showed, unexpectedly, that such systems can be studied via the
so-called "generalised powerset construction". Indeed, while the monads
for non-determinism and probability distributions do not compose well,
one can use convex sets of probability distributions. This important
observation led us to study presentations of the corresponding monad,
and to propose quantitative equational theories for reasoning about
such systems.
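For intuition, the classical instance of the generalised powerset construction is the determinisation of nondeterministic automata: transitions into finite sets of states (the powerset monad) are rerouted into deterministic transitions on sets of states. The following is a minimal sketch with an encoding of our own; the construction used in the project works at the level of arbitrary monads, e.g. convex sets of distributions.

```python
def determinise(delta, init, finals, alphabet):
    """Subset construction. delta: dict (state, letter) -> set of states.
    Returns a DFA whose states are frozensets of NFA states."""
    start = frozenset(init)
    trans, accept = {}, set()
    todo, seen = [start], {start}
    while todo:
        S = todo.pop()
        if S & finals:           # a set-state accepts iff it meets the finals
            accept.add(S)
        for a in alphabet:
            T = frozenset(q for s in S for q in delta.get((s, a), ()))
            trans[(S, a)] = T
            if T not in seen:
                seen.add(T)
                todo.append(T)
    return start, trans, accept

def accepts(dfa, word):
    # Run the determinised automaton on a word.
    start, trans, accept = dfa
    S = start
    for a in word:
        S = trans.get((S, a), frozenset())
    return S in accept
```

For example, determinising the two-state NFA for "words over {a,b} ending in a" yields a DFA on which language membership is a straight-line run.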

Another area we explored is that of abstract coinduction. Indeed, this
mathematical tool is hidden behind several verification algorithms, so
that better understanding this concept can help to design new or
faster algorithms. There we made a breakthrough by showing that
coinductive methods can be enhanced in a uniform way, using an object
called the companion. We further developed this theory using the
language of category theory (more specifically, universal coalgebra),
in order to widen its scope of applicability.
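To make the underlying notion concrete: coinductive predicates are greatest fixpoints of monotone maps, and on a finite lattice such a fixpoint can be computed by iterating downward from the top element. A minimal sketch, with an encoding of our own, computing the largest bisimulation on a finite labelled transition system; enhancements such as the companion let one prove membership in this greatest fixpoint using much smaller relations.

```python
def largest_bisimulation(states, trans):
    """trans: dict state -> set of (label, successor) pairs.
    Computes the greatest fixpoint of the usual bisimulation functional
    by downward iteration from the full relation (Knaster-Tarski)."""
    rel = {(p, q) for p in states for q in states}

    def step(r):
        # One application of the monotone functional: keep a pair only if
        # each transition on either side can be matched on the other.
        def ok(p, q):
            fwd = all(any(l2 == l and (s, t) in r for (l2, t) in trans[q])
                      for (l, s) in trans[p])
            bwd = all(any(l2 == l and (s, t) in r for (l2, s) in trans[p])
                      for (l, t) in trans[q])
            return fwd and bwd
        return {(p, q) for (p, q) in r if ok(p, q)}

    while True:
        nxt = step(rel)
        if nxt == rel:
            return rel
        rel = nxt
```

On a system where states 0 and 2 both do a single a-step into a deadlocked state, the iteration relates 0 with 2 (and their successors) while separating states with different behaviours.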

Lastly, an important aspect of the project is the formalisation in the
Coq proof assistant of some of the theories and algorithms mentioned
above. Indeed, we ultimately want to provide certification tools, with
the highest degree of confidence. We have formalised several of our
results, and some of them required us to develop a general purpose
library for graph theory in Coq (coq-graph-theory). Concerning
enhanced coinduction, we released a library (coq-coinduction) which we
use as raw material for an upcoming book chapter on coinduction in
proof assistants.
The aforementioned results all go beyond the state of the art; the
project has now ended.

Our work made it possible to open several research directions, which
we will investigate in the future (e.g. algebraic language theory for
graphs, computational content of cyclic proofs, Kleene algebra with
relation algebras).