
Randomness Extractors: Constructions and Applications

Periodic Reporting for period 2 - EXT (Randomness Extractors: Constructions and Applications)

Reporting period: 2022-03-01 to 2023-08-31

A randomness extractor is an algorithm that produces, or "extracts", truly random bits from an imperfect source of randomness. Extractors can be traced back to the work of von Neumann in the 1950s, and the field was established in the late 1980s and early 1990s. Extractors were invented to bridge the gap between the usefulness of randomness in algorithms and the fact that ideal randomness is hard to come by through physical means. Hence, what is called for is an algorithm that transforms imperfect randomness into truly random bits, used as a piece of software between the physical mechanism that produces the imperfect randomness and the algorithms that consume it.
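To give a concrete, if simplistic, taste of extraction, the sketch below implements von Neumann's classical procedure in Python. It assumes the source produces independent coin flips with a fixed but unknown bias, and it is included purely as a textbook illustration; it is not one of the constructions studied in this project.

```python
import random

def von_neumann_extract(bits):
    """von Neumann's extractor: read the biased bit stream in non-overlapping
    pairs; output 0 for the pair (0, 1), output 1 for (1, 0), and discard the
    equal pairs (0, 0) and (1, 1). Since Pr[(0, 1)] = Pr[(1, 0)] for independent
    flips of any fixed bias, the output bits are perfectly unbiased."""
    out = []
    for i in range(0, len(bits) - 1, 2):
        a, b = bits[i], bits[i + 1]
        if a != b:
            out.append(a)
    return out

# A heavily biased source: each bit is 1 with probability 0.9.
biased = [1 if random.random() < 0.9 else 0 for _ in range(100_000)]
unbiased = von_neumann_extract(biased)
print(sum(unbiased) / len(unbiased))  # empirically close to 0.5
```

The catch is that this procedure relies on the source bits being independent, whereas modern extractors aim to handle sources with far weaker structural guarantees.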

Although this was their original motivation, randomness extractors have since found dozens of applications in computer science. The current ERC project focuses on the problem of constructing randomness extractors and, more importantly, on applications to fundamental problems, including the derandomization of space-bounded computation and the construction of locally correctable codes and variants thereof. We briefly describe these objectives next.

To get a taste of what randomness extractors are, without getting into technical details, consider the simplified task of devising an algorithm that is given as input a sample from two random variables, one uniform and the other possibly depending arbitrarily on the former. The goal is to "merge" the two random variables, despite their dependencies, into a single truly random variable. Solving this particular task would result in better randomness extractors.
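As a toy illustration of why the dependencies make this subtle (our own example, not taken from the project's work): if the second variable simply copies the first, then a careless combining rule such as bitwise XOR outputs a constant, even though the first variable is perfectly uniform. A genuine merging procedure must therefore be designed to withstand any such dependency.

```python
import random

def naive_merge(x_bits, y_bits):
    """A naive 'merger': combine the two samples by bitwise XOR."""
    return [a ^ b for a, b in zip(x_bits, y_bits)]

n = 16
x = [random.randrange(2) for _ in range(n)]  # the truly uniform variable

# An adversarial second variable that may depend arbitrarily on the first;
# here it simply copies it.
y = list(x)

print(naive_merge(x, y))  # always the all-zeros string: no randomness survives
```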

The problem of derandomizing space-bounded algorithms is one of the central problems in complexity theory. The goal is to deterministically simulate a randomized algorithm with low overhead in space. The most useful type of extractor was invented for that particular purpose. An objective of this project is to make progress on this fundamental problem, either by exploiting further insights into extractors or otherwise.

Coding theory addresses the problem of communication over noisy channels. An important property of a code is the ability to decode any single bit without reading the entire received message. Surprisingly, extractors play a role in the design of such codes. A second important type of code is the tree code, which is used in more dynamic settings. In this ERC project we wish to improve the known constructions, either by using extractors in more sophisticated ways or otherwise.
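To illustrate local decoding with a standard textbook example (the classical Hadamard code, written here in Python; it is not one of the codes constructed in this project), note that any single message bit can be recovered from a mildly corrupted codeword by reading just two of its positions per attempt.

```python
import random

def inner_mod2(x, a):
    """Inner product mod 2 between the bit list x and the bits of the integer a."""
    return sum(x[j] * ((a >> j) & 1) for j in range(len(x))) % 2

def hadamard_encode(x):
    """Hadamard code: one codeword bit <x, a> mod 2 for every n-bit mask a."""
    return [inner_mod2(x, a) for a in range(2 ** len(x))]

def locally_decode_bit(codeword, n, i, trials=31):
    """Recover message bit i using only two queries per trial: for a random
    mask a, <x, a> XOR <x, a ^ e_i> = x_i, so the XOR of the two queried
    positions is correct whenever neither of them was corrupted."""
    votes = 0
    for _ in range(trials):
        a = random.randrange(2 ** n)
        b = a ^ (1 << i)  # flip the i-th bit of the mask
        votes += codeword[a] ^ codeword[b]
    return 1 if 2 * votes > trials else 0

x = [1, 0, 1, 1, 0, 1, 0, 0]
cw = hadamard_encode(x)
for pos in random.sample(range(len(cw)), len(cw) // 20):  # corrupt ~5% of positions
    cw[pos] ^= 1
print([locally_decode_bit(cw, len(x), i) for i in range(len(x))])  # recovers x w.h.p.
```

The Hadamard code has an extremely poor rate (the codeword length is exponential in the message length), which is precisely the kind of trade-off that better locally correctable codes seek to improve.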

To date, the most successful results obtained as part of the current ERC project concern each of these applications of randomness extractors, as we elaborate below, starting with space-bounded derandomization.

The state-of-the-art derandomization result by Saks and Zhou, obtained in the mid-90s, deterministically simulates a randomized algorithm running in space S using space S^(3/2). The algorithm is based on Nisan's seminal pseudorandom generator (PRG), which likewise saw no improvement for about 30 years. In 2018 we introduced a novel relaxation of a PRG dubbed a “weighted PRG” (WPRG), and constructed WPRGs with better parameters than Nisan's PRG. Further, we launched a research program for improving the Saks-Zhou (SZ) algorithm based on WPRGs. In a CCC 2021 paper, my coauthors and I obtained a significantly simplified construction, matching the parameters of the 2018 construction (BCG), using spectral techniques. More recently, in a joint work with my coauthors, we obtained the first polynomial improvement over the SZ algorithm in 25 years, albeit in a regime of parameters that, while interesting and natural, is less relevant for the purpose of derandomization. This work will appear in STOC 2023.
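For reference, the Saks-Zhou result referred to above is usually stated as follows; the notation is the standard complexity-theoretic one and is not taken from this report.

```latex
% Saks–Zhou (mid-1990s): any randomized algorithm using space S = S(n) >= log n
% can be simulated deterministically using space on the order of S^{3/2}.
\[
  \mathsf{BPSPACE}\big(S(n)\big) \subseteq \mathsf{DSPACE}\big(S(n)^{3/2}\big).
\]
```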

In a sequence of joint works with my PhD student Tal Yankovitz (CCC 2021, ICALP 2022, FOCS 2022), we developed tools for the construction of locally correctable codes (LCCs), improving upon the distance amplification procedure for these codes and introducing a novel rate amplification procedure. In particular, our third paper significantly improves upon the state-of-the-art relaxed LCCs by introducing an alternative to tensoring, which is one of the most prevalent techniques in the field.

Tree codes are fundamental combinatorial objects that were introduced by Schulman in his seminal work from the mid-1990s, which launched the field of interactive coding. The holy grail is to obtain explicit constant-distance tree codes with a constant number of colors - an open problem that withstood any progress for nearly 25 years, until our STOC 2018 paper. In a STOC 2022 paper, we improved upon the previous best construction, obtaining the state-of-the-art tree code construction.

Extractors. As for the construction of extractors themselves, my coauthors and I introduced a new notion dubbed “seed-protecting extractors”, which unifies several well-studied types of extractors. On top of the insight it provides, the paper gives rise to a new route for constructing extractors.

The progress made so far beyond the state-of-the-art results in the literature, as part of the current ERC project, can be summarized as follows:

Space-bounded derandomization. In a joint work with my coauthors, we obtained the first polynomial improvement over the seminal Saks-Zhou algorithm in nearly 30 years. The result was obtained concurrently and independently by researchers from Harvard and MIT, based on an earlier manuscript we had published. This work will appear in STOC 2023.

Locally correctable codes. In a joint work with my PhD student Tal Yankovitz, which appeared in FOCS 2022, we significantly improved upon the state-of-the-art relaxed locally correctable codes by introducing an alternative to tensoring, which is one of the most prevalent techniques in the field.

Tree codes. In a STOC 2022 paper, joint with two of my students, we improved upon the previous best construction of tree codes. We obtained the state-of-the-art tree code constructions both in terms of the number of colors, improving upon an earlier result from STOC 2018, and in terms of the distance, improving exponentially upon a result from SODA 2016.

Much of this progress ended up being based on techniques that are not directly related to randomness extractors. For the results on space-bounded derandomization, we used spectral methods. This motivated us to initiate the study of problems in spectral graph theory, resulting in a sequence of papers (STOC 2021, ICALP 2022, STOC 2023). For our result on tree codes, we used a somewhat involved combinatorial approach. Finally, for our results on locally correctable codes, we used certain pseudorandom objects that are tailor-made for the task, rather than extractors, which we came to understand are too general a pseudorandom object for the task at hand.

Looking forward, we expect to continue developing the spectral methods used for space-bounded derandomization, with the goal of improving the Saks-Zhou algorithm. Moreover, we are considering using spectral methods for the construction of randomness extractors in novel ways. The technique we used to bypass the limitations of tensoring, or a variant thereof, may be applicable to many of the other applications of tensoring, and we aim to understand its potential. Lastly, the sequence of works we initiated on spectral methods has taken on a life of its own, and we wish to further investigate its potential with an eye towards the goals of this ERC project.