Preprocessing to simplify problem instances is a universal algorithmic technique applied in almost every software implementation. Understanding and assessing preprocessing techniques is therefore of crucial practical importance in computer science. Kernelization is a notion developed in the area of parameterized complexity that provides the only known reasonable mathematical model for analyzing the preprocessing of NP-hard problems. Kernelization algorithms (or kernels) are polynomial-time procedures that produce equivalent instances of a given problem whose sizes are bounded by some problem-specific parameter associated with the inputs. Algorithms of this type have become a central research focus of the parameterized complexity community, with many papers on the topic appearing each year and an annual international workshop devoted entirely to them. Of particular interest are so-called meta-kernels: single preprocessing algorithms that apply to a multitude of problems simultaneously.
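To make the notion concrete, here is a sketch of one of the oldest textbook kernelizations, the Buss kernel for Vertex Cover; it is a standard classroom illustration, not part of this proposal. Given an instance (G, k), any vertex of degree greater than k must belong to every cover of size at most k, and once no such vertex remains, a yes-instance can have at most k² edges. The function name and representation (edges as a set of pairs) are our own choices for the sketch.

```python
def vc_kernel(edges, k):
    """Buss kernelization for Vertex Cover.

    Returns (kernel_edges, k_remaining, forced) where `forced` holds
    vertices that must be in any cover of size <= k, or None if the
    instance is provably a no-instance.
    """
    edges = {frozenset(e) for e in edges}
    forced = set()
    changed = True
    while changed and k >= 0:
        changed = False
        deg = {}
        for e in edges:
            for v in e:
                deg[v] = deg.get(v, 0) + 1
        # Reduction rule: a vertex of degree > k is in every small cover.
        high = [v for v, d in deg.items() if d > k]
        if high:
            v = high[0]
            forced.add(v)
            edges = {e for e in edges if v not in e}
            k -= 1
            changed = True
    # After the rule is exhausted, a yes-instance has at most k^2 edges.
    if k < 0 or len(edges) > k * k:
        return None
    return edges, k, forced
```

The output instance (the "kernel") is equivalent to the input but its size depends only on the parameter k, which is exactly the guarantee kernelization formalizes.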
The current range of applicability of meta-kernelization is rather limited, being restricted to certain subclasses of sparse graphs. This research proposal aims to remedy this situation by exploring new application domains for meta-kernelization. In doing so, we will develop new methods for obtaining such results and gain a better understanding of the limitations of current techniques. We propose to pursue three new directions:
1. The class of degenerate graphs, which contains all graph classes for which meta-kernelization results are currently known.
2. The class of claw-free graphs, a non-sparse graph class whose structure has recently been unraveled by the Chudnovsky-Seymour structure theory.
3. Meta-kernelization for structural parameterizations, which measure structural aspects of the input rather than the solution size used in standard parameterizations.
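For direction 1, recall that a graph is d-degenerate if every subgraph has a vertex of degree at most d; the degeneracy of a graph is the smallest such d, computable greedily by repeatedly deleting a minimum-degree vertex. The following sketch (our own illustration, not from the proposal) makes the definition operational:

```python
def degeneracy(adj):
    """Degeneracy of an undirected graph: repeatedly delete a
    minimum-degree vertex; the largest degree observed at deletion
    time is the degeneracy.

    adj: dict mapping each vertex to the set of its neighbours.
    """
    adj = {v: set(ns) for v, ns in adj.items()}  # work on a copy
    d = 0
    while adj:
        v = min(adj, key=lambda u: len(adj[u]))  # minimum-degree vertex
        d = max(d, len(adj[v]))
        for u in adj[v]:
            adj[u].discard(v)
        del adj[v]
    return d
```

For example, trees are 1-degenerate and planar graphs are 5-degenerate, which is one reason the class of degenerate graphs subsumes the sparse graph classes where meta-kernelization is currently understood.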