
Randomness and Computation

Final Report Summary - RAC (Randomness and Computation)

Randomized algorithms and protocols play an important role in many areas of Computer Science. A fundamental question in Complexity Theory is how much randomness (if any) is required to solve various computational problems. The holy grail of this area is to show that BPP=P, namely that every polynomial time randomized algorithm can be simulated by a polynomial time deterministic algorithm.

A key concept/tool in this research is a pseudorandom generator. This is a function G that receives an r bit string s (called "the seed") and outputs an n>r bit string (called a "pseudorandom string"), with the property that efficient observers cannot distinguish a uniformly chosen n bit string from a uniformly chosen pseudorandom string. The idea of pseudorandomness is that "randomness is in the eye of the beholder": a distribution that may be very far from random (by standard statistical measures, say a distribution supported on only 2^r strings) can look random to observers of bounded complexity.
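Stated a bit more formally (this is the standard textbook formulation, included here only for concreteness and not specific to this project): G mapping r-bit seeds to n-bit strings is a pseudorandom generator with error epsilon against a class of distinguishers if for every circuit D in that class

\[
\Bigl|\ \Pr_{x \sim U_n}\bigl[D(x)=1\bigr] \;-\; \Pr_{s \sim U_r}\bigl[D(G(s))=1\bigr]\ \Bigr| \;\le\; \varepsilon ,
\]

where U_k denotes the uniform distribution over k-bit strings and epsilon is the distinguishing error.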

A pseudorandom generator can be used to reduce the number of random bits used by a randomized algorithm from n bits to r bits (by using pseudorandom strings instead of random strings). If r is sufficiently small, one can often get rid of randomness altogether by enumerating over all 2^r choices of the seed. This motivates the design of efficient pseudorandom generators with small seed length r.
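As an illustration of this enumeration step, here is a minimal sketch in Python. The functions `algorithm` and `prg` are hypothetical placeholders standing for an arbitrary randomized decision algorithm and a pseudorandom generator; the sketch only illustrates the folklore technique described above and is not part of the project's results.

```python
from itertools import product

def derandomize(algorithm, prg, input_data, seed_length):
    """Deterministic simulation by enumerating all 2^r seeds.

    `algorithm(input_data, bits)` is a hypothetical randomized decision
    procedure that returns True/False given a string of "random" bits,
    and `prg(seed)` is a hypothetical pseudorandom generator stretching
    an r-bit seed into as many bits as `algorithm` consumes.
    """
    votes = 0
    for seed_tuple in product('01', repeat=seed_length):
        seed = ''.join(seed_tuple)
        if algorithm(input_data, prg(seed)):
            votes += 1
    # If the generator fools `algorithm`, the fraction of accepting seeds
    # is close to the algorithm's true acceptance probability, so a
    # majority vote over all 2^r pseudorandom strings recovers the answer
    # the randomized algorithm gives with high probability.
    return 2 * votes > 2 ** seed_length
```

The running time is dominated by the 2^r iterations, which is why a small seed length r is crucial for obtaining polynomial time deterministic simulations.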

It is known that the existence of pseudorandom generators (against polynomial time circuits) that give rise to derandomization of general poly-time randomized algorithms implies circuit lower bounds that seem beyond the reach of our current techniques. Constructing such generators unconditionally would therefore amount to proving lower bounds that currently seem out of reach; instead, a successful approach is the "hardness versus randomness" paradigm, in which one assumes circuit lower bounds and concludes the existence of efficient pseudorandom generators (and derandomization).
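A canonical instance of this paradigm, included here for context (it predates this project and is not one of its results), is the Impagliazzo-Wigderson theorem:

\[
\exists f \in \mathrm{E} = \mathrm{DTIME}\bigl(2^{O(n)}\bigr) \text{ that requires circuits of size } 2^{\Omega(n)}
\;\Longrightarrow\;
\mathrm{BPP} = \mathrm{P}.
\]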

In this research project we have made significant progress in understanding the power and limitations of this scientific approach:
- We give the first hardness versus randomness tradeoffs for NP-complete problems.
- We give the first hardness versus randomness tradeoffs for randomized algorithms with very large error probabilities.
- We construct pseudorandom generators (PRGs) that fool non-boolean distinguishers and can be used to reduce the randomness of sampling algorithms.
- We identify a barrier present in all current pseudorandom generator constructions (the so-called "hybrid argument" of Goldwasser and Micali; see the sketch after this list) and show that it can sometimes be bypassed. The results we obtain in this direction are quantitatively weak, but they demonstrate that the barrier is not insurmountable.
- We introduce new notions of pseudorandom generators (and other related objects) with "relative error" and show that these weaker objects (which suffice for some applications) can be constructed under significantly weaker unproven assumptions.
- We show that pseudorandom generators can be used to construct incompressible functions, with applications to cryptography.
- We show limitations of the hardness versus randomness approach, proving that pseudorandom generators (and hardness amplification) with low error cannot be achieved from standard assumptions using black-box techniques. This explains (in retrospect) certain limitations in past work in the area, and can point us to more suitable approaches for achieving our goals.
- We show that pseudorandom generators with additional properties can be used to construct optimal-rate binary list-decodable codes for computationally bounded channels, and we use this to improve on the previously known constructions.
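For context, here is the hybrid argument referred to above (a standard textbook sketch, not a contribution of this project). Suppose a distinguisher D tells G(U_r) apart from U_n with advantage epsilon. For 0 <= i <= n, let H_i be the distribution obtained by taking the first i bits of G(U_r) followed by n-i independent uniform bits, so that H_0 = U_n and H_n = G(U_r). By the triangle inequality,

\[
\varepsilon \;\le\; \Bigl|\Pr[D(H_n)=1]-\Pr[D(H_0)=1]\Bigr|
\;\le\; \sum_{i=0}^{n-1} \Bigl|\Pr[D(H_{i+1})=1]-\Pr[D(H_i)=1]\Bigr| ,
\]

so some consecutive pair of hybrids is distinguished with advantage at least epsilon/n, which yields a predictor for a single output bit of G. The factor-n loss in the error parameter incurred by this step is precisely the barrier mentioned above.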

In addition, our research also touches on other fundamental pseudorandom objects such as extractors, dispersers and codes. We will not go into details here (as this requires more technical jargon). The main focus of our research is to give new explicit constructions of these objects. We also develop new applications of these objects in Cryptography, Coding Theory and Data Storage.