
Semantically correct automatic differentiation

Periodic Reporting for period 1 - SemanDiff (Semantically correct automatic differentiation)

Reporting period: 2020-03-15 to 2022-03-14

> What is the problem/issue being addressed?
The problem my project addresses is how to calculate derivatives of functions implemented by a piece of computer code in a way that is simultaneously correct, efficient, and generally applicable.

> Why is it important for society?
Such derivative calculations are used in practically every machine learning/AI/computational statistics application, as they are needed for gradient-based optimization (e.g. gradient descent) and integration techniques (e.g. Hamiltonian Monte Carlo).
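As a minimal illustration of why such derivatives matter, the sketch below minimises a function by gradient descent, given a function `grad` that computes its derivative. All names here are illustrative, not part of the project's implementation:

```python
# Minimal gradient-descent sketch (illustrative only).
def gradient_descent(grad, x0, lr=0.1, steps=100):
    """Minimise a function, given its derivative grad, by repeated steps
    in the direction of steepest descent."""
    x = x0
    for _ in range(steps):
        x = x - lr * grad(x)
    return x

# Example: minimise f(x) = (x - 3)^2, whose derivative is 2*(x - 3);
# the iteration converges to the minimiser x = 3.
minimum = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
```

In realistic applications the derivative is not written by hand as above but produced by automatic differentiation, which is exactly the step this project makes correct.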

> What are the overall objectives?
The overarching objectives of this action are
(I) To develop operational and denotational semantic foundations for differential programming.
(II) To use this semantics both to prove correct existing differential programming techniques and to inform how to extend them soundly to cope with richer language features.
(III) To produce a reference implementation of an AD system to which these correctness proofs apply.

> Conclusion
The project objectives were realised. I developed various semantically grounded AD techniques, together with an implementation and correctness proof. A key insight during this project was that AD code transformations can be uniquely characterised as certain homomorphic functors on the syntax of programming languages. This homomorphism property is the key to their correctness and to their application to expressive programming languages.
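To give a flavour of what "homomorphic on the syntax" means, the sketch below defines a toy expression language and a derivative transformation with exactly one clause per syntactic constructor, so the transform commutes with the structure of the syntax. It illustrates the structural idea only; it is not the project's actual CHAD transformation:

```python
# Toy expression syntax; the derivative transform below is defined
# compositionally, one clause per constructor (illustrative sketch only).
from dataclasses import dataclass

@dataclass
class Const:
    value: float

@dataclass
class Var:          # the single variable x
    pass

@dataclass
class Add:
    left: "object"
    right: "object"

@dataclass
class Mul:
    left: "object"
    right: "object"

def evaluate(e, x):
    """Standard evaluation of an expression at the point x."""
    if isinstance(e, Const): return e.value
    if isinstance(e, Var):   return x
    if isinstance(e, Add):   return evaluate(e.left, x) + evaluate(e.right, x)
    if isinstance(e, Mul):   return evaluate(e.left, x) * evaluate(e.right, x)

def derive(e):
    """Derivative as a syntax-to-syntax transformation, homomorphic in
    each constructor of the expression language."""
    if isinstance(e, Const): return Const(0.0)
    if isinstance(e, Var):   return Const(1.0)
    if isinstance(e, Add):   return Add(derive(e.left), derive(e.right))
    if isinstance(e, Mul):   # product rule
        return Add(Mul(derive(e.left), e.right), Mul(e.left, derive(e.right)))

# d/dx (x*x + 2x) = 2x + 2; at x = 3 this evaluates to 8.
expr = Add(Mul(Var(), Var()), Mul(Const(2.0), Var()))
```

Because `derive` never inspects anything but the outermost constructor, it extends cleanly as the language grows, which is the property exploited in the project's correctness proofs.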

> Summary of results achieved
This project has developed solid mathematical foundations for forward-mode automatic differentiation when applied to expressive functional languages. Further, it has made significant progress towards doing the same for reverse-mode automatic differentiation. It has specified algorithms (source-code transformations) for computing derivatives of functional programs and demonstrated with formal proofs that these algorithms compute precisely the usual mathematical derivatives of the functions that the programs implement.
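Forward-mode AD of this kind is often explained via dual numbers: every value is paired with its tangent, and each primitive operation is extended according to the chain rule. The following sketch shows the runtime idea only; the project's algorithms are compile-time source-code transformations on functional languages, and all names below are mine, not the project's:

```python
# Dual-number forward-mode AD sketch (illustrative only).
class Dual:
    """A primal value paired with its tangent (derivative)."""
    def __init__(self, primal, tangent):
        self.primal, self.tangent = primal, tangent

    def __add__(self, other):
        # (a, a') + (b, b') = (a + b, a' + b')
        return Dual(self.primal + other.primal, self.tangent + other.tangent)

    def __mul__(self, other):
        # Product rule: (a, a') * (b, b') = (a*b, a'*b + a*b')
        return Dual(self.primal * other.primal,
                    self.tangent * other.primal + self.primal * other.tangent)

def diff(f, x):
    """Derivative of f at x: run f on x paired with tangent 1."""
    return f(Dual(x, 1.0)).tangent

# f(x) = x*x + x  =>  f'(x) = 2x + 1, so f'(4) = 9.
```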


> Peer reviewed publications
1. Mathieu Huot*, Sam Staton*, and Matthijs Vákár*. "Correctness of Automatic Differentiation via Diffeologies and Categorical Gluing." FoSSaCS. 2020. (* equal contribution)
[Nominated for the EATCS Award for the best ETAPS paper in theoretical computer science and the EAPLS Award for the best ETAPS paper on programming languages and systems.]
Open access: arXiv preprint arXiv:2101.06757.

2. Ryan Bernstein, Matthijs Vákár, and Jeannette Wing. "Transforming Probabilistic Programs for Model Checking." Proceedings of the 2020 ACM-IMS on Foundations of Data Science Conference. 2020.
Open access: arXiv preprint arXiv:2008.09680

3. Matthijs Vákár. "Reverse AD at Higher Types: Pure, Principled and Denotationally Correct." ESOP. 2021.
[Nominated for the EATCS Award for the best ETAPS paper in theoretical computer science.]
Open access: arXiv preprint arXiv:2007.05283

4. Maria Gorinova, Andy Gordon, Charles Sutton, and Matthijs Vákár. "Conditional Independence by Typing." ACM Trans. Program. Lang. Syst. 44, 1, Article 4 (March 2022). Open access: arXiv preprint arXiv:2010.11887


5. Matthijs Vákár, and Tom Smeding. "CHAD: Combinatory homomorphic automatic differentiation." arXiv preprint arXiv:2103.15776 (2021). Accepted for publication in TOPLAS.


> Papers under review


6. Fernando Lucatelli Nunes, and Matthijs Vákár. "CHAD for Expressive Total Languages." arXiv preprint arXiv:2110.00446 (2021). Under review at MSCS.



> Preprints
7. Matthijs Vákár. "Denotational Correctness of Forward-Mode Automatic Differentiation for Iteration and Recursion." arXiv preprint arXiv:2007.05282 (2020).



> Papers accepted for publication but not yet published

8. Mathieu Huot*, Sam Staton*, and Matthijs Vákár*. "Higher Order Automatic Differentiation of Higher Order Functions." arXiv preprint arXiv:2101.06757 (2021). Accepted for publication in LMCS. (* equal contribution)




> Dissemination
1. Presentation at FoSSaCS 2020
2. Presentation at ESOP 2021
3. Wiki page that I wrote for a general mathematical and computer science audience about the mathematical foundations of automatic differentiation, based on the results of this project:
https://ncatlab.org/nlab/show/automatic+differentiation
4. Lecture for MSc students about results of this project (notably CHAD, papers 3 and 5 above) in MSc course on program analysis at Utrecht University.
5. Presentation at FHPNC 2021


> Exploitation
Publicly available reference implementation of the algorithms developed in this project:
https://github.com/VMatthijs/CHAD
> Progress beyond the state of the art
- presentation of dual-number-style forward-mode AD algorithms for functional programs that make use of any combination of: higher-order functions, variant types, inductive types, iteration, recursion, real conditionals, recursive types;
- formal correctness proofs of these dual-number-style forward-mode AD algorithms, using novel logical relations techniques;
- extension of these dual-number-style forward-mode AD algorithms and their correctness proofs to compute higher-order derivatives;
- presentation of novel source-code-transformation reverse-mode AD algorithms that operate on functional programs making use of any combination of: higher-order functions, variant types, inductive types, coinductive types;
- relating these novel reverse-mode AD algorithms to a variant of forward-mode AD that is precisely their mirror image;
- formal correctness proofs of these source-code-transformation reverse-mode AD algorithms, using novel logical relations techniques;
- reference implementation of the algorithms above in Haskell.
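As a rough runtime analogue of reverse mode, each value can carry a backpropagator: a function mapping a cotangent on the output back to a cotangent on the input. The sketch below is purely illustrative and my own encoding; CHAD itself is a compile-time source-code transformation with a formal correctness proof:

```python
# Reverse-mode AD sketch via backpropagators (illustrative only).
class Rev:
    """A primal value paired with a backpropagator that maps a cotangent
    on this value to a cotangent on the original input."""
    def __init__(self, primal, backprop):
        self.primal = primal
        self.backprop = backprop

    def __add__(self, other):
        # Addition passes the incoming cotangent to both arguments.
        return Rev(self.primal + other.primal,
                   lambda ct: self.backprop(ct) + other.backprop(ct))

    def __mul__(self, other):
        # Product rule: each argument receives the cotangent scaled by
        # the other argument's primal value.
        return Rev(self.primal * other.primal,
                   lambda ct: self.backprop(ct * other.primal)
                            + other.backprop(ct * self.primal))

def grad(f, x):
    """Gradient of a scalar function f at x: seed the output cotangent
    with 1.0 and propagate it backwards to the input."""
    return f(Rev(x, lambda ct: ct)).backprop(1.0)

# f(x) = x^3  =>  f'(x) = 3x^2, so f'(2) = 12.
```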

> Expected results until the end of the project
No further results are expected, with the following exceptions:
- papers 4, 5 and 6 above are likely to be accepted for publication;
- I am teaching a 2-hour tutorial about the results of this Marie Curie project as part of the thematic month on Logic and Interactions at CIRM (https://conferences.cirm-math.fr/2686.html) from 31 January to 4 February 2022 in Marseille.