
A Reduction Theory for Codes and Lattices in Cryptography

Periodic Reporting for period 2 - ARTICULATE (A Reduction Theory for Codes and Lattices in Cryptography)

Reporting period: 2022-07-01 to 2023-12-31

Cryptography is the science and practice of securing data and communications, that is, guaranteeing their authentication and confidentiality. It is constantly used under the hood of our digital infrastructure: on credit cards during payments, or at each connection to a website from a smartphone or computer.

This security backbone is currently undergoing a complete overhaul, because the protocols in use today are based on mathematical problems that may soon be solved by quantum computing (for example, the problem of factoring large numbers). While quantum computers are not yet mature enough to actually break these existing cryptosystems, they might soon be, and we must be ready by then. Such a global transition takes care and time, which is why it is being dealt with now, even though we do not expect quantum attacks to be relevant for another decade.

This project is specifically interested in two mathematical objects that can provide quantum-safe cryptography, namely linear codes and Euclidean lattices. It includes research on the cryptanalysis (mathematical attacks) of the underlying problems and cryptosystems, first to make sure that a devastating attack has not been missed, but also to cost known attacks as precisely as possible, so that we can parametrize those cryptosystems. The project also considers foundational questions on the hardness of code and lattice problems, mixing mathematical and complexity-theoretic perspectives. Finally, it also considers the invention of better ways to construct cryptosystems based on those mathematical objects.

One specificity of this project is to look jointly at both codes and lattices, in the hope of transferring ideas between two mathematical objects that share similar principles. Indeed, linear codes and Euclidean lattices are both regular grids of points, but in different types of spaces, and both are used outside of cryptography for a similar purpose: error correction. This approach aims at gaining more confidence that no stone was left unturned when it comes to breaking those schemes. But it might also open new avenues in the construction of such cryptosystems.
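To make the analogy concrete, here is a minimal sketch (our illustration, not material from the project): a Euclidean lattice is the set of all integer combinations of a real basis, while a binary linear code is the set of all F_2 combinations of the rows of a generator matrix; the two constructions mirror each other over different spaces.

```python
import itertools
import numpy as np

# Lattice basis (rows): the lattice is all integer combinations x @ B.
B = np.array([[2.0, 1.0],
              [0.0, 3.0]])
# Code generator (rows): the code is all combinations x @ G modulo 2.
G = np.array([[1, 0, 1, 1],
              [0, 1, 1, 0]])

# Enumerate lattice points with small coefficients, and all codewords.
lattice_points = [np.array(x) @ B
                  for x in itertools.product(range(-2, 3), repeat=2)]
codewords = {tuple(np.array(x) @ G % 2)
             for x in itertools.product((0, 1), repeat=2)}

print(len(lattice_points), "lattice points;", sorted(codewords))
```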
The first result was a practical cryptanalysis landmark for lattice-based cryptography, obtained by porting the best known algorithm to modern Graphics Processing Units, in particular exploiting their Tensor Cores; while these computation cores were designed with Machine Learning applications in mind, they excel more generally at low-precision linear algebra. With care, we were able to rephrase the core routine of lattice attacks in linear algebra terms; this allowed us to solve the lattice challenge up to dimension 180, improving the time and energy cost by a factor of 100 compared to earlier records.
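The reformulation can be sketched as follows (a simplified illustration under our own assumptions, not the project's actual GPU code): lattice sieving repeatedly searches for pairs of vectors whose inner product is large relative to their norms, since subtracting one from the other then yields a shorter vector, and all pairwise inner products of a batch amount to a single matrix product, exactly the low-precision operation Tensor Cores accelerate.

```python
import numpy as np

rng = np.random.default_rng(0)
# A batch of sieve vectors, stored at low precision as on Tensor Cores.
V = rng.standard_normal((1024, 64)).astype(np.float16)

# All pairwise inner products in one matrix product.
Gram = (V @ V.T).astype(np.float32)
norms = np.diag(Gram).copy()
np.fill_diagonal(Gram, 0.0)

# A pair (v, w) reduces when 2|<v, w>| exceeds the smaller squared norm:
# then v - sign(<v, w>) * w is shorter than the longer of the two.
i, j = np.unravel_index(np.argmax(np.abs(Gram)), Gram.shape)
if 2 * abs(Gram[i, j]) > min(norms[i], norms[j]):
    shorter = V[i] - np.sign(Gram[i, j]) * V[j]
    print("reduced pair", (i, j), "new squared norm", float(shorter @ shorter))
```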

The project has also started the work of transferring techniques back and forth between codes and lattices. First with an adaptation of the famous LLL algorithm to codes, which, while providing only a minor improvement to code cryptanalysis, also brought more geometric insight into attacking computational problems on linear codes. We also looked at the notion of smoothing for codes and lattices, and performed a systematic comparison of the techniques from both fields. It turned out that many approaches had been developed, but not all combinations had been considered. Our systematic approach considering both codes and lattices led to improved smoothing bounds for both objects, in particular reaching optimal bounds for random codes and random lattices.
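For reference, the quantity at stake on the lattice side is the standard smoothing parameter of Micciancio and Regev, quoted here as background rather than as the project's exact formalism; the bounds above control it through the dual lattice:

```latex
% Smoothing parameter of a lattice \Lambda with dual \Lambda^{*}:
% the least Gaussian width s beyond which a Gaussian over \Lambda
% is within \varepsilon of uniform modulo the lattice.
\eta_{\varepsilon}(\Lambda) \;=\; \min\Big\{\, s > 0 \;:\;
  \sum_{w \,\in\, \Lambda^{*} \setminus \{0\}} e^{-\pi s^{2} \|w\|^{2}}
  \,\le\, \varepsilon \,\Big\}
```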

But the bulk of the work has been devoted to a direction that was completely unforeseen at the beginning of the project. Indeed, comparing how code-based and lattice-based cryptosystems were designed, it appeared that there was no clear equivalent of the McEliece cryptosystem in the world of lattices, or more precisely, that past attempts suffered from non-lattice attacks. Abstracting away the key notion at hand, it became clear that the Lattice Isomorphism Problem should provide such a translation of McEliece to lattices, and we provided a framework for doing so based on any lattice of interest. This framework allows the use of remarkable lattices in the design of lattice-based cryptosystems, for example lattices with better packing properties, which should lead to more efficient or compact cryptosystems. This work was followed up by a concrete design of a signature scheme, HAWK, which shines by its simplicity compared to usual lattice-based signature schemes, and notably outperforms alternative post-quantum signature schemes. This scheme was recently submitted to a new call for standardization from NIST.
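For context, in one standard formulation (stated here as background, not as the project's precise framework), the Lattice Isomorphism Problem asks to recover a hidden isometry between two lattices given by bases:

```latex
% Two bases B_1, B_2 (columns are basis vectors) generate isomorphic
% lattices iff they differ by a rotation O and a change of basis U:
B_2 \;=\; O \, B_1 \, U, \qquad
O \in \mathcal{O}_n(\mathbb{R}) \ \text{(orthogonal)}, \quad
U \in \mathrm{GL}_n(\mathbb{Z}) \ \text{(unimodular)}.
```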

While the Lattice Isomorphism Problem (LIP) had been of interest before our cryptographic applications, the context and typical instances differ from those of prior works. This means that we have some historical basis for understanding the cryptanalysis, but that much work remains to be done to make sure that the chosen instances of LIP are indeed secure. So the project also took this direction, in particular studying the notion of the genus of a lattice, to understand how useful this easily computable information can be. Fortunately, it does not seem very informative, revealing just a few bits. Further ideas from the cryptanalysis of linear codes have also been imported, in particular the notion of the hull, and we showed that it can have a significant effect on security. To avoid such concerns, a certain class of lattices (called unimodular) is to be preferred, as their hull reveals no further information.
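The hull notion transfers directly between the two worlds (a standard definition, stated here for illustration): for a code C it is C ∩ C⊥, and for a lattice it is the intersection with its dual, which also explains why unimodular lattices are safe in this respect.

```latex
% Hull of a lattice \Lambda, by analogy with the hull C \cap C^{\perp} of a code:
\mathrm{Hull}(\Lambda) \;=\; \Lambda \cap \Lambda^{*}, \qquad
\Lambda^{*} \;=\; \{\, x \in \mathrm{span}(\Lambda) \;:\;
  \langle x, v \rangle \in \mathbb{Z} \ \ \forall v \in \Lambda \,\}.
% A unimodular lattice satisfies \Lambda = \Lambda^{*}, hence
% \mathrm{Hull}(\Lambda) = \Lambda: the hull carries no extra information.
```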
In all the directions discussed above, the state of the art was significantly advanced.

First, we have demonstrated that modern hardware (GPUs with Tensor Cores) can be exploited to its full power for the cryptanalysis of lattices. However, we are reaching the memory limit of a single machine, and we therefore do not expect further results in that direction during this project.

We also had successes in transferring techniques between lattices and codes, with a translation of the LLL algorithm from lattices to codes, and a two-way exchange on the topic of smoothing bounds. So far we do not see a significant cryptanalytic impact of this LLL algorithm for codes emerging, and we are instead exploring other cryptanalytic techniques to be transferred. We expect either to find improved attacks on codes or lattices, or to build a sense of completion of the respective states of the art by showing that indeed, all known stones have been turned.

Regarding the smoothing bounds, a surprising development has occurred. While this work was mostly motivated by theoretical interest, the new smoothing bounds turned out to elucidate a burning question in the cryptanalysis of lattices. Indeed, there were recent claims of improved attacks against the schemes KYBER and DILITHIUM, currently being standardized. But those attacks were based on some heuristic reasoning that had been neither checked experimentally nor studied theoretically. Our new smoothing bound demonstrated that this heuristic cannot hold in the regime of interest, and we also refuted it experimentally. Much work remains in that direction: despite the heuristic being wrong in certain regimes, the approach remains sensible, but modeling its performance is trickier than what the heuristic proposed. The current cost of this attack is unknown, which is an uncomfortable situation for the ongoing standardization. We hope to resolve the situation with further work in this project.

Finally, the successful foundation of cryptographic schemes on the Lattice Isomorphism Problem (LIP), and the concrete proposal HAWK that followed, are also opening many questions, and we certainly hope to tackle some of them during the remainder of the project. In particular, while the question of the genus of unstructured lattices is well understood, HAWK actually uses algebraically structured lattices, and it would seem natural to refine genus theory to such cases (module lattices over Complex Multiplication number fields).