Periodic Reporting for period 2 - FGC (Fine-Grained Cryptography)
Reporting period: 2023-07-01 to 2024-12-31
In public-key cryptography, the encrypting side uses a publicly known key to encrypt its information, while the decrypting side uses a different, secret key. Public-key cryptography is extremely useful in many applications, but it comes with two downsides: it is a slow process, and finding hard problems on which to base it is much more difficult than finding hard problems for private-key cryptography.
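As a minimal illustration of this asymmetry, the toy sketch below uses textbook RSA with deliberately tiny, insecure numbers (our own example, not a scheme from the project): anyone holding the public key can encrypt, while only the holder of the secret key can decrypt.

```python
# Toy public-key example: textbook RSA with tiny, insecure parameters.
p, q = 61, 53                        # secret primes
n = p * q                            # public modulus
e = 17                               # public encryption exponent
d = pow(e, -1, (p - 1) * (q - 1))    # secret decryption exponent (Python 3.8+)

message = 42
ciphertext = pow(message, e, n)      # anyone can do this with (n, e)
recovered = pow(ciphertext, d, n)    # only the holder of d can undo it

assert recovered == message
```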
The kind of computational hardness we need for public-key cryptography is hard to come by, and in 45 years only a handful of candidate hard problems have been found. This is an important problem for society, as all forms of communication and electronic commerce crucially rely on public-key encryption. For such applications to remain viable in the medium and long term, it is imperative to develop a wide array of alternative encryption methods, to provide fallbacks in case current methods turn out to be insecure, an event that could potentially occur on any given day.
The FGC (Fine-Grained Cryptography) project aims, as one of its central goals, to find new hard problems for public-key cryptography. Traditionally, cryptography has been based on problems for which there is a conjectured exponential complexity gap between the easy and hard directions, where complexity can be considered a measure of the computational power needed to solve a problem.
The project explores a new avenue, where the complexity gap is not exponential, but "only" polynomial. For instance, if the complexity faced by the “good guy” is 128 raised to the power of two, the “bad guy” also faces 128 raised to some power. If that power is sufficiently high, the scheme is as secure as one with an exponential complexity gap: 128 raised to the power of 20, for example, equals 2 raised to the power of 140 (since 128 equals 2 raised to the power of 7) and is thus comparable to the exponential-gap case.
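The arithmetic behind this comparison can be checked directly; the numbers below are the illustrative figures from the paragraph above, not the parameters of an actual scheme.

```python
honest_cost = 128 ** 2       # work done by the "good guy"
attacker_cost = 128 ** 20    # work faced by the "bad guy"

print(honest_cost)                   # 16384
print(attacker_cost == 2 ** 140)     # True: 128 = 2**7, so 128**20 = 2**140
```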
We also started exploring generalized learning models with a single hidden layer, which have been extensively studied in the context of deep learning and statistical physics but less so from the cryptographic perspective.
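A minimal sketch of what such a single-hidden-layer model looks like, under our own illustrative choices (ReLU activation, Gaussian inputs, sign labels; none of these specifics are taken from the project's work):

```python
import numpy as np

rng = np.random.default_rng(0)
n_features, n_hidden, n_samples = 20, 5, 1000

W = rng.standard_normal((n_hidden, n_features))   # hidden-layer weights (secret)
a = rng.standard_normal(n_hidden)                 # output weights (secret)

X = rng.standard_normal((n_samples, n_features))  # observed inputs
hidden = np.maximum(X @ W.T, 0.0)                 # the single hidden layer (ReLU)
y = np.sign(hidden @ a)                           # observed labels

# The learning problem: predict y on fresh inputs, or recover (W, a),
# given only the sample (X, y).
```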
Finally, among other directions, we have been looking into fast cryptographic protocols that could be obtained by “scaling down” a previously proposed protocol relying on problems related to permuted Reed-Muller codes. The latter approach builds on the fact that such protocols sometimes deal with very large databases, which makes them naturally well-suited to fine-grained security.
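For intuition, the sketch below generates a small Reed-Muller codeword (the evaluation table of a low-degree Boolean polynomial) and hides it behind a secret permutation of its coordinates; the parameters and the setup are purely illustrative and are not taken from the protocol in question.

```python
import itertools, random

m, r = 4, 2  # 4 Boolean variables, degree at most 2

# A random polynomial of degree <= r: a random subset of the low-degree monomials.
monomials = [s for k in range(r + 1) for s in itertools.combinations(range(m), k)]
poly = [s for s in monomials if random.random() < 0.5]

def evaluate(x):
    # Evaluate the chosen polynomial at x in {0,1}^m, arithmetic modulo 2.
    return sum(all(x[i] for i in s) for s in poly) % 2

points = list(itertools.product([0, 1], repeat=m))
codeword = [evaluate(x) for x in points]            # a Reed-Muller codeword
secret_perm = random.sample(range(len(points)), len(points))
scrambled = [codeword[i] for i in secret_perm]      # the permuted view an observer gets
```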
In tandem with the above, we are exploring promising interdisciplinary avenues of investigation aligned with the project’s goals, specifically using problems emerging from the study of spin glasses in statistical physics. We are studying whether such average-case constraint satisfaction problems enjoy a cryptographic property called “collision resistance”, with an eye towards further developments in the direction of public-key encryption.
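To make the notion concrete, here is a toy example of what collision resistance asks of a compressing function; the function below is a random sparse XOR map of our own choosing and has nothing to do with the spin-glass constructions under study.

```python
import random
from itertools import product

n, m = 12, 6                                      # compress 12 bits to 6
random.seed(0)
subsets = [random.sample(range(n), 3) for _ in range(m)]

def h(x):
    # Each output bit is the XOR of three input bits.
    return tuple(sum(x[i] for i in s) % 2 for s in subsets)

# Collisions (x != x' with h(x) == h(x')) must exist by counting; the function
# is collision resistant if no efficient algorithm can actually find one.
seen = {}
for x in product([0, 1], repeat=n):
    y = h(x)
    if y in seen:
        print("collision:", seen[y], x)
        break
    seen[y] = x
```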
Success in the above direction could potentially open up a wide array of hard-on-average problems that could be utilized for cryptographic purposes down the line. It could also have implications in the other direction, namely in the understanding of physical systems with large ensembles of particles.
Going back to fine-grained cryptography, we recently discovered new ways to “plant” solutions, in a potentially hard-on-average way, in one of the central problems of fine-grained complexity theory: the orthogonal vectors problem. We view this development as central to the FGC project, as it goes along the main avenue we set out to explore and potentially opens up exciting new applications, such as fine-grained secure digital signatures and public-key encryption.
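The sketch below states the orthogonal vectors problem and shows the most naive way to plant a solution; making such planting indistinguishable from a truly random instance, which is the crux of the new results, requires far more care and is not captured here.

```python
import random

n, d = 200, 16  # illustrative sizes

def random_vec():
    return [random.randint(0, 1) for _ in range(d)]

A = [random_vec() for _ in range(n)]
B = [random_vec() for _ in range(n)]

# Naive planting: force one pair to be orthogonal by construction.
a_planted = random_vec()
b_planted = [1 - bit for bit in a_planted]   # 1 exactly where a_planted is 0
A[random.randrange(n)] = a_planted
B[random.randrange(n)] = b_planted

def has_orthogonal_pair(A, B):
    # Quadratic-time search; fine-grained hardness conjectures say one cannot
    # do substantially better in the worst case when d is large enough.
    return any(all(x * y == 0 for x, y in zip(a, b)) for a in A for b in B)

print(has_orthogonal_pair(A, B))  # True, thanks to the planted pair
```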
In addition to the above, the project studied the following topics: locally testable tree codes, with the ultimate goal of building “tree PCPs”, an incremental way of proving the correctness of an evolving computation; the composition of multi-round zero-knowledge protocols; efficiently proving the integrity of inference of Large Language Models with minimal overhead for the prover; absolute commitments, which allow the strategy of an agent to depend on the commitments made by the other agents; and the complexity of the average-case k-SUM problem.
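As one concrete illustration from this list, the average-case k-SUM problem can be stated in a few lines; the parameters below are arbitrary choices of ours.

```python
import itertools, random

n, k, M = 30, 3, 10**6
numbers = [random.randrange(M) for _ in range(n)]   # a random instance

def ksum_exists(numbers, k, M):
    # Does some k-subset sum to 0 modulo M? Brute force takes roughly n**k time.
    return any(sum(combo) % M == 0 for combo in itertools.combinations(numbers, k))

print(ksum_exists(numbers, k, M))
```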
We intend to further pursue the directions outlined above, deepening the results obtained thus far.