Cryptography with Low Complexity

Periodic Reporting for period 3 - CLC (Cryptography with Low Complexity)

Reporting period: 2018-05-01 to 2019-10-31

The efficiency of cryptographic constructions is a fundamental question. Theoretically, it is important to understand how many computational resources are needed to guarantee strong notions of security.
Practically, highly efficient schemes are always desirable for real-world applications including ones which provide privacy to individuals.
More generally, the possibility of cryptography with low complexity has wide applications for problems in computational complexity, combinatorial optimization, and computational learning theory.

In this project, we aim to understand the minimal computational resources needed to perform basic cryptographic tasks. In a nutshell, we focus on three main objectives.
First, we would like to get a better understanding of the cryptographic hardness of random local functions. Such functions can be computed by highly-efficient circuits and their cryptographic hardness provides a strong and clean
formulation for the conjectured average-case hardness of constraint satisfaction problems—a fundamental subject which lies at the core of the theory of computer science.
Our second objective is to harness our insights into the hardness of local functions to improve the efficiency of basic cryptographic building blocks such as pseudorandom functions.
Finally, our third objective addresses the power of garbled circuits, an important cryptographic tool for secure computation. The goal is to expand our theoretical understanding of garbled circuits, study their limitations, and improve their efficiency.

The project bridges different areas of computer science, such as random combinatorial structures, cryptography, and circuit complexity. It is expected to impact central problems in cryptography while enriching the general landscape of theoretical computer science.
Overall, we have made good progress on all three research goals, which has resulted in a relatively large number of publications (4 journal papers, 6 conference papers, and a book chapter) in leading venues. We continue with a brief summary of the achievements.
Objective 1: Obtain a better understanding of the cryptographic hardness of Random Local Functions.
• In recent work (FOCS’17), we related the hardness of Random Local Functions to an exciting new conjecture from complexity theory (Gap-ETH), and gave a new transformation from one-wayness to pseudorandomness that can be applied to random local functions while preserving exponential hardness.
• In a joint work with Lovett (STOC’16), we study the security of Random Local Functions against linear attacks and a new class of algebraic attacks in the high-end regime in which the output length is polynomial in the input length. We fully characterize which predicates provide security against such attacks, and refute previous conjectures.
• In a joint work with Ishai and Kushilevitz (Journal of Cryptology), we show that random local functions are resilient to some form of structured leakage, and employ this fact to obtain new low-complexity one-way functions with optimal locality.
• We also published a survey (Special Issue of Springer’s Computational Complexity) that describes the state of the art of Cryptographic Hardness of Random Local Functions and suggests new directions for research and open questions.
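To make the object of study concrete, the following sketch evaluates a Goldreich-style random local function: a random d-uniform hypergraph is fixed once, and every output bit applies one fixed d-ary predicate to the d input bits it selects, so each output bit can be computed by a constant-size circuit. The particular predicate (the "XOR-AND" predicate x1⊕x2⊕x3⊕(x4∧x5)) and the parameters below are illustrative choices, not the specific ones analyzed in the papers above.

```python
import random

def sample_graph(n, m, d, rng):
    """For each of the m outputs, choose an ordered d-tuple of distinct input indices."""
    return [rng.sample(range(n), d) for _ in range(m)]

def xor_and_predicate(bits):
    """P(x1..x5) = x1 XOR x2 XOR x3 XOR (x4 AND x5) -- a commonly studied candidate predicate."""
    x1, x2, x3, x4, x5 = bits
    return x1 ^ x2 ^ x3 ^ (x4 & x5)

def local_function(x, graph):
    """Apply the predicate to the d input bits selected by each hyperedge."""
    return [xor_and_predicate([x[i] for i in edge]) for edge in graph]

rng = random.Random(0)
n, m, d = 16, 32, 5          # toy parameters; the interesting regime has m polynomial in n
graph = sample_graph(n, m, d, rng)   # public, chosen once at random
x = [rng.randrange(2) for _ in range(n)]
y = local_function(x, graph)
```

The conjectured hardness of inverting (or distinguishing) such functions, over a random graph and random input, is exactly the average-case CSP hardness discussed above.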

Objective 2: Study applications of Locally Computable Cryptography.
• Fast Secure Arithmetic Computation. In a joint work with Damgard, Ishai, Nielsen, and Zichron (CRYPTO ’17), we proposed an arithmetic variant of random local functions, and showed that it can be used to obtain the first secure protocol for arithmetic computation with constant computational overhead.
• Fast hash functions. In a recent ITCS paper (with Haramaty, Ishai, Kushilevitz, Vaikuntanathan), we use the intractability of sparse linear codes to obtain the first constructions of low-complexity collision resistant hash functions. The resulting functions have linear shrinkage and constant locality, leading to linear-size circuit implementation.
• Fast Pseudorandom Functions. In a joint work with Raykov (TCC 2016-B), we construct low-complexity pseudorandom functions (PRFs) based on the hardness of random local functions. This includes weak PRFs that can be computed in linear time on a RAM machine with logarithmic word size, or by a depth-3 circuit with unbounded fan-in AND and OR gates (an AC0 circuit). We also obtain standard PRFs that can be computed by a quasilinear-size circuit or by a constant-depth circuit with unbounded fan-in AND, OR, and Majority gates (TC0).
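As a toy illustration of the "constant locality, linear shrinkage" regime from the hash-function result above, the sketch below compresses n bits to n/2 bits by multiplying with a sparse random matrix over GF(2), so each output bit is the XOR of only d input bits. This linear map alone is of course not collision resistant (collisions can be read off its kernel); the actual construction combines such a sparse code with additional nonlinear local ingredients that are omitted here.

```python
import random

def sparse_matrix(m, n, d, rng):
    """An m x n binary matrix with exactly d ones per row, stored as index lists."""
    return [rng.sample(range(n), d) for _ in range(m)]

def compress(x, rows):
    """y_i = XOR of the d input bits selected by row i (sparse matrix-vector product over GF(2))."""
    out = []
    for row in rows:
        b = 0
        for i in row:
            b ^= x[i]
        out.append(b)
    return out

rng = random.Random(1)
n, d = 64, 3
m = n // 2                       # linear shrinkage: 64 bits down to 32
rows = sparse_matrix(m, n, d, rng)
x = [rng.randrange(2) for _ in range(n)]
digest = compress(x, rows)
```

Because every output bit touches d = 3 inputs, the whole map has a linear-size circuit implementation, which is the efficiency feature the ITCS construction achieves.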

Objective 3: Obtain a better understanding of the complexity of Garbled Circuits.
• Arithmetic Cryptography. In a joint work with Avron and Brzuska (JACM’17), we study the possibility of computing cryptographic primitives in a fully black-box arithmetic model over a finite field F. We show that most cryptographic tasks can be implemented in this model. However, in some cases (e.g. garbled circuits) a larger communication complexity is required. This reveals a qualitative difference between the standard Boolean model and the arithmetic model, and explains, in retrospect, some of the limitations of previous constructions.
• Conditional Disclosure of Secrets. In a joint work with Arkis, Raykov, and Vasudevan (CRYPTO ’17), we study a weak variant of garbled circuits known as Conditional Disclosure of Secrets (CDS). We establish several positive and negative results regarding the complexity of CDS, including closure properties, amplification theorems, amortization results, and lower bounds and separations.
• From Private Simultaneous Messages to Zero-Information Arthur–Merlin Protocols and Back (with Raykov, published in Journal of Cryptology). We relate Zero-Information Arthur–Merlin Protocols (a new information-theoretic variant of zero-knowledge proofs) to standard information-theoretic primitives like Private-Simultaneous Message Protocols (a variant of Garbled Circuits) and Conditional Disclosure of Secrets. The new relation also allows us to derive new protocols with improved complexity.
• On the Relationship Between Statistical Zero-Knowledge and Statistical Randomized Encodings (with Raykov, published in CRYPTO 2016). We study information-theoretic randomized encodings (aka garbled circuits) from a computational-complexity point of view, and compare them to standard complexity classes such as SZK and NISZK. We show that variants of randomized encodings (RE) are (non-uniformly) equivalent to non-interactive statistical zero-knowledge and point to barriers for proving equivalence between SZK and RE.
• Incompressible Functions, Relative-Error Extractors, and the Power of Nondeterministic Reductions (with Artemenko, Shaltiel and Yang, published in a Special Issue of Springer’s Computational Complexity). This paper presents the first constructions of incompressible functions and related objects. These results are used to derive a new lower bound on the online communication of garbled circuits.
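For intuition about the CDS primitive studied above, here is a textbook one-round CDS protocol for the equality predicate (an illustrative example, not one of the constructions from the cited papers). Alice holds x, Bob holds y; both know the secret and share randomness; each sends a single message to a referee, who recovers the secret exactly when x = y and otherwise sees a perfectly masked value.

```python
import random

P = 2**61 - 1   # a prime modulus; the field size is an illustrative choice

def cds_equality(x, y, secret, rng):
    """One-round CDS for the predicate x == y over GF(P).

    (r, w) is the randomness shared by Alice and Bob; the referee
    sees only the two messages (m1, m2).
    """
    r = rng.randrange(P)            # shared uniform field elements
    w = rng.randrange(P)
    m1 = (r * x + w) % P            # Alice's message
    m2 = (r * y + w + secret) % P   # Bob's message
    return m1, m2

def referee(m1, m2):
    """m2 - m1 = r*(y - x) + secret: equals the secret when x == y,
    and is uniformly masked by r*(y - x) when x != y."""
    return (m2 - m1) % P
```

Correctness holds for every choice of randomness, while privacy is information-theoretic: when x ≠ y, the pair (m1, m2) is uniform and independent of the secret, since w masks m1 and r masks the difference.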
Due to the nature of theoretical research, it is hard to make predictions.
Still, we expect to make progress with respect to all three goals.
Concretely, for Goal 1, current techniques yield only collections of local pseudorandom generators with high stretch.
We hope to remove the need for collections and to obtain a *single* function implementation.
Another concrete goal is to further develop the connection between local cryptography and hardness of approximation.
For Goal 2, we hope to find further applications of local functions. Concretely, the current secure arithmetic protocol achieves only a weak form of privacy in the semi-honest setting.
We believe that it should be possible to extend it to the more realistic malicious setting.
For the third goal, we plan to further explore the communication complexity of information-theoretic garbled circuits and their variants.
We believe that one can improve both the existing lower bounds and the upper bounds.