Project description
Scalable algorithms for parallel computing
The digital revolution has profoundly changed science, engineering and everyday life. Computer applications process continuously increasing data volumes with ever more complex algorithms. This progress is in danger of reaching its limits: scalability of the programmes – their capacity to grow with their tasks – represents a major challenge. To solve bigger problems, many processors have to be used in parallel. The EU-funded ScAlBox project will work on algorithms and software libraries for basic software components that can be used in many ways and scale to data sets of any size and to millions of processors operating in parallel. Such components include searching, sorting, queue management, work distribution to parallel processors and communication between processes. They also include processing complex relations between large numbers of objects (graphs).
Objective
ScAlBox aims to provide basic algorithmic tools that can be used in a wide spectrum of applications and that scale orders of magnitude better than the state of the art with respect to input size or number of processors.
Over the last decades, we have witnessed the transition into the information age, with profound effects on science, technology, and our daily life. This transition is driven by a growing spectrum of computer applications that process larger and larger data sets using increasingly complex algorithms. However, the scalability challenge has emerged as a major roadblock to this progress:
An explosion in the amount of data to be processed (big data) coincides with the stagnating performance of individual processor cores. This widening performance gap can only be closed by using many processors in parallel. However, parallel algorithms have long been neglected by algorithm theory, while heuristic software development optimizes for existing machines and inputs but fails to give predictable scalability for future, larger data sets and more processors.
We want to overcome this roadblock by developing scalable solutions for the basic toolbox of algorithms and data structures that are needed in many applications (e.g. sorting, searching, queues, basic graph algorithms, collective communication, and load balancing). My goal is to provide algorithms and software libraries that scale to millions of processors and give hard performance guarantees for arbitrary inputs. This is challenging due to large gaps between theory and practice and because such algorithms have to integrate scalable fault tolerance and dynamic load balancing to an unprecedented extent. I am the right person to achieve this due to my extensive experience in parallel algorithms for irregular problems and my leading role in algorithm engineering that integrates modeling, design, theoretical analysis, implementation, and experimental evaluation.
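To make the scope of the basic toolbox concrete, the following is a minimal, self-contained sketch of one such primitive: a shared-memory parallel sort that sorts blocks in separate threads and then merges neighbouring runs in a logarithmic number of rounds. This is an illustration only, not code from the project; all names (parallel_sort, bound, p) are chosen here for the example, and the aspects ScAlBox actually targets (distributed memory, scalable communication, fault tolerance, performance guarantees for arbitrary inputs) go far beyond such a toy sketch.

```cpp
// Illustrative sketch only: a simple shared-memory parallel sort.
// Phase 1 sorts p blocks in parallel; phase 2 merges neighbouring
// sorted runs pairwise, halving the number of runs each round.
#include <algorithm>
#include <iostream>
#include <random>
#include <thread>
#include <vector>

void parallel_sort(std::vector<int>& a, unsigned p) {
    const size_t n = a.size();
    if (p < 2 || n < 2 * p) { std::sort(a.begin(), a.end()); return; }

    // Block boundaries: block i covers [bound[i], bound[i+1]).
    std::vector<size_t> bound(p + 1);
    for (unsigned i = 0; i <= p; ++i) bound[i] = n * i / p;

    // Phase 1: sort each block in its own thread.
    std::vector<std::thread> threads;
    for (unsigned i = 0; i < p; ++i)
        threads.emplace_back([&, i] {
            std::sort(a.begin() + bound[i], a.begin() + bound[i + 1]);
        });
    for (auto& t : threads) t.join();

    // Phase 2: merge neighbouring runs; the merges of one round run in parallel.
    for (unsigned step = 1; step < p; step *= 2) {
        std::vector<std::thread> mergers;
        for (unsigned i = 0; i + step < p; i += 2 * step) {
            unsigned mid = i + step;
            unsigned end = std::min(i + 2 * step, p);
            mergers.emplace_back([&, i, mid, end] {
                std::inplace_merge(a.begin() + bound[i],
                                   a.begin() + bound[mid],
                                   a.begin() + bound[end]);
            });
        }
        for (auto& t : mergers) t.join();
    }
}

int main() {
    std::vector<int> v(1'000'000);
    std::mt19937 rng(42);
    std::uniform_int_distribution<int> dist;
    for (auto& x : v) x = dist(rng);
    parallel_sort(v, std::thread::hardware_concurrency());
    std::cout << std::boolalpha << std::is_sorted(v.begin(), v.end()) << '\n';
}
```

A sketch like this compiles with g++ -std=c++17 -pthread. Note that the pairwise merging keeps all threads busy only in the early rounds and ends with a single sequential merge; this is exactly the kind of load-balancing and scalability limitation that algorithms aimed at millions of processors must avoid.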
Fields of science
CORDIS classifies projects with EuroSciVoc, a multilingual taxonomy of fields of science, through a semi-automatic process based on NLP techniques.
- natural sciences > computer and information sciences > data science > big data
- natural sciences > computer and information sciences > software > software development
- natural sciences > mathematics > pure mathematics > discrete mathematics > graph theory
- natural sciences > computer and information sciences > artificial intelligence > heuristic programming
Funding Scheme
ERC-ADG - Advanced Grant
Host institution
76131 Karlsruhe
Germany