
Nonlinear Eigenproblems for Data Analysis

Final Report Summary - NOLEPRO (Nonlinear Eigenproblems for Data Analysis)

In machine learning and exploratory data analysis, a major goal is the development of methods for the automatic and efficient extraction of knowledge from data. This ability is key to further progress in science and engineering. A large class of data analysis methods is based on linear eigenproblems. While linear eigenproblems are well studied, and a large part of numerical linear algebra is dedicated to the efficient computation of eigenvectors of all kinds of structured matrices, they are limited in their modeling capabilities for data analysis and often serve only as a loose approximation of data analysis methods that are based on a class of combinatorial optimization problems.
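The classical example of such a loose linear relaxation is spectral graph partitioning: the combinatorial ratio-cut problem is relaxed to an eigenproblem of the graph Laplacian, and the sign pattern of the second eigenvector (the Fiedler vector) suggests a two-way cut. A minimal sketch on a toy graph (the graph and thresholding rule here are illustrative choices, not taken from the report):

```python
import numpy as np

# Toy graph: two triangles {0,1,2} and {3,4,5} joined by the single edge (2,3).
A = np.zeros((6, 6))
edges = [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5), (2, 3)]
for i, j in edges:
    A[i, j] = A[j, i] = 1.0

# Unnormalized graph Laplacian L = D - A.
L = np.diag(A.sum(axis=1)) - A

# The second-smallest eigenvector of L (Fiedler vector) is the linear
# relaxation of the ratio cut; thresholding its sign yields a 2-way partition.
eigvals, eigvecs = np.linalg.eigh(L)
fiedler = eigvecs[:, 1]
partition = fiedler > 0
print(partition)  # the two triangles receive opposite signs
```

On well-separated graphs like this one the relaxation recovers the correct cut, but in general the rounded solution can be far from the combinatorial optimum, which is precisely the gap NOLEPRO addresses.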

The objectives of NOLEPRO were to exploit the rich structure of nonlinear eigenproblems for data analysis in various applications, to advance the theory of nonlinear eigenproblems, and to develop a general framework for the computation of nonlinear eigenvectors.

These goals were achieved by proposing a variety of data analysis methods based on nonlinear eigenproblems for dimensionality reduction, clustering, community detection, graph matching, team formation, and the training of neural networks. In particular, NOLEPRO identified a large class of combinatorial optimization problems that admit an exact relaxation into a nonlinear eigenproblem, and proposed fast methods for the computation of nonlinear eigenvectors. The solutions to the combinatorial problems obtained via the methods developed in NOLEPRO outperform other relaxations, in particular those via linear eigenproblems, often by a large margin. NOLEPRO also developed efficient, scalable algorithms for the general computation of nonlinear eigenvectors. Moreover, we extended nonlinear Perron-Frobenius theory to order-preserving multi-homogeneous mappings, which allowed us to significantly generalize the existing spectral theory for nonnegative tensors. While the computation of the maximal nonlinear eigenvector is NP-hard in general, for the class of nonnegative tensors we were able to significantly enlarge the class of spectral problems for which the maximal eigenvector can be computed efficiently.
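To illustrate the kind of computation involved, the following is a minimal sketch of a classical higher-order power iteration for the Perron eigenpair of an entrywise-positive third-order tensor (a Ng-Qi-Zhou-type scheme, shown here for illustration; it is not the specific algorithm developed in NOLEPRO). For such positive tensors the iteration converges to the maximal H-eigenpair T x^{m-1} = λ x^{[m-1]} with x > 0:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 5, 3                      # dimension n, tensor order m = 3
T = rng.random((n, n, n))        # entrywise-positive tensor (hence irreducible)

def tensor_apply(T, x):
    # (T x^{m-1})_i = sum_{j,k} T[i,j,k] * x[j] * x[k]
    return np.einsum('ijk,j,k->i', T, x, x)

# Nonlinear power iteration: y = (T x^{m-1})^{1/(m-1)}, then normalize.
x = np.ones(n) / n
for _ in range(200):
    y = tensor_apply(T, x) ** (1.0 / (m - 1))
    x = y / y.sum()

# Componentwise "Rayleigh quotients"; all entries agree at convergence,
# and their common value is the maximal eigenvalue (spectral radius).
lam = tensor_apply(T, x) / x ** (m - 1)
print(lam)
```

The iteration is a direct nonlinear analogue of the matrix power method, which is what makes these computations scale to large problems; the Perron-Frobenius theory for multi-homogeneous mappings developed in NOLEPRO identifies broader classes of spectral problems for which such iterations provably converge.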