CORDIS - EU research results
Content archived on 2024-06-18

Algorithms beyond the Worst Case

Final Report Summary - BEYONDWORSTCASE (Algorithms beyond the Worst Case)

Efficient algorithms that can handle large amounts of data are needed in many applications, for example in logistics and information retrieval. In the classical theory, an algorithm is considered efficient only if it performs well in the worst case, that is, on all possible inputs. This point of view has turned out to be too pessimistic in many situations: for many algorithms, worst-case inputs are rather contrived and hardly ever occur in practical applications. Hence, it is a common phenomenon that algorithms perform much better in practice than the theoretical analysis predicts. Even worse, for some important problems the classical theory favours algorithms that perform badly in practice over algorithms that work well, only because the latter have poor performance on artificial worst-case inputs.
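As a concrete illustration of this gap (our own example, not one from the report): quicksort with a naive first-element pivot rule needs quadratically many comparisons on an already-sorted input, yet far fewer on a typical random input. The sketch below counts comparisons for both cases.

```python
import random

def quicksort_comparisons(a):
    """Count the comparisons quicksort makes when it always pivots on the first element."""
    if len(a) <= 1:
        return 0
    pivot = a[0]
    less = [x for x in a[1:] if x < pivot]
    geq = [x for x in a[1:] if x >= pivot]
    # len(a) - 1 comparisons to partition, plus the two recursive calls.
    return len(a) - 1 + quicksort_comparisons(less) + quicksort_comparisons(geq)

n = 200
sorted_input = list(range(n))            # worst case for this pivot rule
random.seed(0)
random_input = random.sample(range(n), n)

worst = quicksort_comparisons(sorted_input)    # exactly n*(n-1)/2 = 19900
typical = quicksort_comparisons(random_input)  # roughly n*log(n), far fewer
```

On the sorted input every partition is maximally unbalanced, so the worst-case count grows quadratically, while the random input stays close to the n log n behaviour observed in practice.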

In this project, we deviate from the classical worst-case perspective and study the performance of algorithms on inputs that are determined to some extent by random influences. We have shown that in many situations this leads to more realistic theoretical results than worst-case analysis. We obtained, for example, explanations of why, for many problems, simple algorithms based on local search are successful in practice even though the classical theory shows that they are very bad in the worst case. We have also obtained novel results on clustering problems, which explain why certain simple clustering methods are popular in applications.
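The report does not name the clustering methods it studied, but Lloyd's k-means algorithm is a classic example of a simple local-search clustering method whose worst-case running time is bad while its practical behaviour is good. The following minimal sketch (the data and parameter choices are illustrative) alternates between assigning points to their nearest center and moving each center to the mean of its cluster:

```python
import random

def kmeans(points, k, iters=100, seed=1):
    """A minimal version of Lloyd's algorithm for k-means clustering of 2D points.

    Local search: alternately assign each point to its nearest center and
    move each center to the mean of its assigned points, until nothing moves.
    """
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        # Assignment step: each point joins the cluster of its nearest center.
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k),
                    key=lambda j: (p[0] - centers[j][0]) ** 2
                                  + (p[1] - centers[j][1]) ** 2)
            clusters[j].append(p)
        # Update step: move each center to the centroid of its cluster
        # (an empty cluster keeps its old center).
        new_centers = [
            (sum(p[0] for p in c) / len(c), sum(p[1] for p in c) / len(c))
            if c else centers[j]
            for j, c in enumerate(clusters)
        ]
        if new_centers == centers:   # local optimum: no center moved
            break
        centers = new_centers
    return centers, clusters

# Two well-separated groups of points; Lloyd's algorithm recovers them.
points = [(0.0, 0.0), (0.0, 1.0), (1.0, 0.0),
          (10.0, 10.0), (10.0, 11.0), (11.0, 10.0)]
centers, clusters = kmeans(points, k=2)
```

Each iteration only moves to a better clustering, so the method converges to a local optimum; on contrived inputs this can take exponentially many steps, yet on realistic (slightly perturbed) inputs it terminates quickly, which is the kind of discrepancy the project's analysis explains.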