Randomised Algorithms


In recent years, randomised algorithms and randomised complexity theory have become major subjects of study in the design of computer algorithms and in the theory of computation. For some computational problems, randomised or pseudo-randomised algorithms now appear to be more efficient than deterministic ones (in terms of hardware size, running time, circuit depth, or simplicity of description). Striking recent examples are the new randomised approximation algorithms for counting perfect matchings in certain classes of graphs, for several enumeration problems over Boolean algebras and finite fields, and for estimating the volume of convex bodies. Solutions to these problems have applications ranging from circuit design and coding theory to statistical mechanics and quantum field theory.
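To give a flavour of such randomised approximation algorithms (this sketch is illustrative only and not taken from the project itself), the volume of a convex body can be estimated by Monte Carlo sampling: draw points uniformly from a bounding box and count the fraction that fall inside the body. Here the body is the unit ball in three dimensions:

```python
import random

def estimate_ball_volume(samples=100_000, seed=0):
    """Monte Carlo estimate of the volume of the unit ball in R^3.

    Sample points uniformly from the bounding cube [-1, 1]^3 and count
    the fraction that land inside the ball; scaling by the cube's
    volume (2^3 = 8) gives the estimate.
    """
    rng = random.Random(seed)
    inside = 0
    for _ in range(samples):
        x = rng.uniform(-1.0, 1.0)
        y = rng.uniform(-1.0, 1.0)
        z = rng.uniform(-1.0, 1.0)
        if x * x + y * y + z * z <= 1.0:
            inside += 1
    return 8.0 * inside / samples

# The true volume is 4*pi/3 ≈ 4.189; the estimate converges as the
# number of samples grows, with error shrinking like 1/sqrt(samples).
```

In low dimensions this naive rejection method suffices, but its efficiency degrades exponentially with dimension; the celebrated randomised volume algorithms alluded to above avoid this by random walks inside the body.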

In this new and fast-growing area, we are confronted with a number of fundamental questions to which the classical theory of computation offers few answers: how much randomness is necessary to achieve a given level of efficiency in algorithms, and how efficiently certain classes of randomised algorithms can be simulated deterministically.

The Group will address the above-mentioned problems and concentrate on the design of efficient (both sequential and parallel) randomised algorithms for selected combinatorial, algebraic and geometric problems; the foundations of the randomised complexity of computational problems; randomised approximation problems; computation with limited randomness resources (de-randomisation methods); and computational learning theory.
Fundamental questions in the design of efficient randomised algorithms, and in the complexity of randomised computation, have been addressed with respect to time and space efficiency, computation with limited randomness resources, and the deterministic simulation of randomised computation.

Substantial progress has been made in all the main research areas, especially in the design of efficient randomised algorithms, randomised approximation algorithms, the foundations of randomised complexity theory, de-randomisation methods, and computational learning theory.

RAND's research should shed some light on the relative power of randomisation as a computational resource for designing efficient (both sequential and parallel) algorithms, and should facilitate the design of efficient pseudo-random generators for a number of selected algorithmic applications.
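The role of a pseudo-random generator is to stretch a short truly random seed into a long stream that "looks" random to the algorithm consuming it; this is the basic mechanism behind de-randomisation. The linear congruential generator below is a classical (and cryptographically weak) textbook example, shown only to illustrate the idea; the constants are the common Numerical Recipes parameters, not anything specific to this project:

```python
def lcg_bits(seed, n, a=1664525, c=1013904223, m=2**32):
    """Produce n pseudo-random bits from a linear congruential generator.

    The short seed is the only true randomness consumed; every later
    bit is computed deterministically from it, which is exactly what
    lets a deterministic simulation enumerate all seeds.
    """
    state = seed % m
    bits = []
    for _ in range(n):
        state = (a * state + c) % m
        bits.append((state >> 31) & 1)  # take the top bit of the state
    return bits
```

A randomised algorithm that needs n random bits but tolerates a pseudo-random substitute can thus be run deterministically by trying every possible short seed and taking a majority vote, trading randomness for running time.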


Rheinische Friedrich-Wilhelms-Universität Bonn
Römerstraße 164
53117 Bonn

Participants (4)

Ole Romers Vag, 1118
221 00 Lund
Sweden

University of Leeds
37 University Road
LS2 9JT Leeds
United Kingdom

University of Oxford
11 Keble Road
OX1 3QD Oxford
United Kingdom

Université de Paris XI (Université Paris-Sud)
Avenue Georges Clémenceau
91405 Orsay
France