Complexity and Condition in Algebra and Numerics

Periodic Reporting for period 4 - COCAN (Complexity and Condition in Algebra and Numerics)

Reporting period: 2023-07-01 to 2024-12-31

This project connects three areas that are usually considered quite distant from each other: computational complexity, algebraic geometry, and numerics. Computational complexity is a foundational part of computer science: it provides formal models for investigating algorithms and tries to understand the inherent limits of efficient computation in a broad sense. Algebraic geometry is a highly developed branch of pure mathematics that studies the set of solutions of polynomial equations. Numerical mathematics is concerned with the development and analysis of efficient algorithms for the solution of mathematical problems such as those arising in the natural sciences and engineering.

In the last decade, it became clear that the fundamental questions of computational complexity (P vs NP) should be studied in algebraic settings, linking them to classic problems in algebraic geometry. This research direction, which goes under the name geometric complexity theory, led to surprising progress in computational invariant theory, which is also related to quantum information theory.

The project's goal was to explore these connections. An essential new ingredient was to tackle the arising algebraic computational problems by means of approximate numerical computations, taking into account the concept of numerical condition. A related goal of the proposal was to develop a theory of efficient and numerically stable algorithms for solving polynomial equations that reflects the properties of structured systems of equations. While various heuristics exist, a satisfactory theory so far only existed for unstructured systems over the complex numbers (the recent solution of Smale's 17th problem), which seriously limits its range of applications. In this framework, the quality of numerical algorithms is gauged by a probabilistic analysis that shows small average (or smoothed) running time. One of the main challenges consists in a probabilistic study of random structured polynomial systems. Another goal was to develop and analyze numerical algorithms for finding or describing the shape of the set of real solutions, e.g. in terms of its homology.
1. We initiated a systematic development of a theory of non-commutative optimization. It develops and analyzes algorithms for natural geodesically convex optimization problems on Riemannian manifolds that arise from the symmetries of non-commutative groups. These algorithms minimize the moment map (a non-commutative notion of the usual gradient) and test membership in null cones and moment polytopes (a vast class of polytopes, typically of exponential vertex and facet complexity, which arise from an a priori non-convex, non-linear setting). This setting captures a diverse set of problems in different areas of computer science, mathematics, and physics.
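To make the moment-map picture concrete, the following minimal sketch (ours, not one of the project's algorithms) shows first-order norm minimization in the simplest, commutative special case: a torus acting on R^n by coordinate scaling. The weight matrix, function names, and toy data are chosen purely for illustration.

```python
# Minimal illustrative sketch, assuming the commutative (torus) special case:
# a torus acts on R^n by scaling coordinate i with exponent vector w_i, i.e.
#   (x . v)_i = exp(w_i · x) * v_i   for additive coordinates x in R^d.
# The squared norm along the orbit, f(x) = sum_i v_i^2 * exp(2 w_i · x), is convex
# in x, and its gradient is (up to normalization) the moment map of the scaled
# vector. The infimum of the norm along the orbit is 0 exactly when v lies in the
# null cone; otherwise the minimizers are points where the moment map vanishes.
import numpy as np

def moment_map(v, W, x):
    """Gradient of f at x (up to a factor 2): sum_i (v_i * e^{w_i.x})^2 * w_i."""
    scaled_sq = (v * np.exp(W @ x)) ** 2        # squared entries of the scaled vector
    return scaled_sq @ W                        # weighted sum of the weight vectors

def norm_minimization(v, W, steps=500, lr=0.05):
    """Plain first-order descent on the convex norm function along the orbit."""
    x = np.zeros(W.shape[1])
    for _ in range(steps):
        x -= lr * moment_map(v, W, x)
    return x, np.linalg.norm(v * np.exp(W @ x))

# Toy weights (1, -1) for the action t.(v1, v2) = (t v1, t^{-1} v2).
W = np.array([[1.0], [-1.0]])
print(norm_minimization(np.array([3.0, 2.0]), W))   # converges to a balanced point
print(norm_minimization(np.array([3.0, 0.0]), W))   # norm tends to 0: a null-cone vector
```

In the genuinely non-commutative case the same descent takes place on a Riemannian symmetric space, which is where the geodesic first and second order methods discussed below come in.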

We designed and analyzed first and second order methods in this general framework. However, when restricted to the commutative case, these algorithms' guarantees do not match those of cutting-plane methods in the spirit of the ellipsoid algorithm or of interior point methods. For this reason, we developed an analogue of interior point methods in the geodesic framework of non-commutative group actions on Hadamard manifolds, generalizing the notion of self-concordance. However, the outcomes in the quantitative regime are less favorable than we had hoped for.

Surprisingly, during these investigations in computational invariant theory, a connection to statistics emerged and was investigated as part of the project.

We also investigated the related orbit closure intersection problems and obtained a satisfactory general result for the actions of commutative groups (tori). The aspect of numerical robustness led to a surprising new connection between computational complexity and number theory (the abc-conjecture). However, the complexity status of orbit problems in the general non-commutative case remains wide open.
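As a toy illustration of the commutative case (our simplification, not the general algorithm): for the one-dimensional torus acting by t · (a, b) = (t a, t⁻¹ b), the invariant ring is generated by the single polynomial ab, and since polynomial invariants separate closed orbits, orbit closure intersection reduces to comparing this one invariant.

```python
# Toy example, assuming the weight-(1,-1) torus action t.(a, b) = (t*a, b/t) on C^2.
# The invariant ring is generated by a*b, so the orbit closures of (a, b) and
# (c, d) intersect exactly when a*b == c*d (all null-cone points, a*b == 0, are
# equivalent to each other).
def orbit_closures_intersect(p, q, tol=1e-12):
    (a, b), (c, d) = p, q
    return abs(a * b - c * d) <= tol

print(orbit_closures_intersect((2.0, 3.0), (6.0, 1.0)))   # True: same invariant value 6
print(orbit_closures_intersect((2.0, 0.0), (0.0, 5.0)))   # True: both lie in the null cone
print(orbit_closures_intersect((2.0, 3.0), (1.0, 1.0)))   # False: invariant values 6 vs 1
```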

2. Smale's 17th problem asked whether n polynomial equations in n variables can be solved over the complex numbers in average polynomial time. This problem had been solved affirmatively around ten years ago. However, in this model, the input size is generously measured by the number of coefficients. Relying on the method of rigid homotopies, Bürgisser, Cucker, and Lairez showed that an analogue of Smale's 17th problem holds in a setting of structured polynomial systems. Solutions can be computed in average polynomial time when the input polynomials are given in the data structure of algebraic branching programs, which makes it possible to encode polynomials of large degree with few input parameters. The randomness comes from independent standard Gaussian coefficients. On the practical side, the freely available numerical algebraic geometry software "HomotopyContinuation.jl" by Breiding and Timme was further developed.
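The basic mechanism behind such homotopy methods can be sketched in a few lines (our toy Python sketch, not HomotopyContinuation.jl itself; the example system and the names are ours): deform an easy start system into the target system and follow each solution path with a predictor-corrector scheme.

```python
# Schematic predictor-corrector path tracking for a univariate toy example:
# deform the start system g(x) = x^3 - 1 (known roots: cube roots of unity) into
# the target f(x) = x^3 - 2x + 1 along H(x, t) = (1 - t)*gamma*g(x) + t*f(x).
import numpy as np

f  = lambda x: x**3 - 2*x + 1
df = lambda x: 3*x**2 - 2
g  = lambda x: x**3 - 1
dg = lambda x: 3*x**2
gamma = np.exp(2j * np.pi * 0.123)   # random complex "gamma trick" constant;
                                     # a generic choice avoids singular paths

def H(x, t):   return (1 - t) * gamma * g(x) + t * f(x)
def H_x(x, t): return (1 - t) * gamma * dg(x) + t * df(x)

def track(x, steps=400):
    dt = 1.0 / steps
    for k in range(steps):
        t = k * dt
        x = x - dt * (f(x) - gamma * g(x)) / H_x(x, t)   # Euler predictor: dx/dt = -H_t/H_x
        for _ in range(3):                               # Newton corrector at t + dt
            x = x - H(x, t + dt) / H_x(x, t + dt)
    return x

starts = [np.exp(2j * np.pi * k / 3) for k in range(3)]  # the roots of the start system
print([track(x0) for x0 in starts])   # should approximate the roots 1, 0.618..., -1.618...
```

Real trackers replace the fixed step size by an adaptive one governed by condition numbers, which is exactly where the complexity analysis enters.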

3. In the different direction of computing real solutions to polynomial equations, we established provably efficient and numerically stable algorithms for computing the topology of semialgebraic sets. These algorithms run in weak exponential time, while all previously known algorithms have doubly exponential complexity. The techniques and results are of relevance in high-dimensional data analysis.
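For flavor only, the snippet below (ours; the example set, grid resolution, and helper names are made up for illustration) estimates the number of connected components of a planar semialgebraic set by sampling a grid. The project's algorithms are fundamentally different: they use condition numbers to certify a sufficient resolution and compute all homology groups with rigorous guarantees.

```python
# Purely illustrative: estimate the 0-th Betti number (number of connected
# components) of a planar semialgebraic set by sampling a regular grid and
# joining adjacent sample points that satisfy the defining inequalities.

def inside(x, y):
    # Example set: the union of two disjoint disks, given by polynomial inequalities.
    return (x - 1) ** 2 + y ** 2 <= 0.25 or (x + 1) ** 2 + y ** 2 <= 0.25

def betti0_estimate(n=80, box=2.0):
    h = 2 * box / n                                     # grid spacing
    cells = {(i, j) for i in range(n) for j in range(n)
             if inside(-box + i * h, -box + j * h)}
    parent = {c: c for c in cells}
    def find(c):                                        # union-find with path compression
        while parent[c] != c:
            parent[c] = parent[parent[c]]
            c = parent[c]
        return c
    for (i, j) in cells:                                # merge horizontally/vertically adjacent cells
        for nb in ((i + 1, j), (i, j + 1)):
            if nb in cells:
                parent[find((i, j))] = find(nb)
    return len({find(c) for c in cells})

print(betti0_estimate())   # expected output: 2 (the two disks)
```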

4. A considerable amount of work went into an emerging theory that may be called probabilistic intersection theory. The general goal is to understand the real zero set of random polynomial systems. We proved two quantitative results on counting the real zeros of random polynomial systems. One of them connects to complexity theory: the estimate claimed by the real tau conjecture is shown to be true for random polynomials. A second result proves, for the first time, good upper bounds on the expected number of real zeros of random polynomial systems in terms of the number of monomials.
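To illustrate what "expected number of real zeros" means in the simplest univariate case, the experiment below (ours) estimates it for random Kostlan polynomials, for which the expected count is known to be exactly the square root of the degree (Edelman-Kostlan). The project's results concern far more general structured and fewnomial systems.

```python
# Monte Carlo estimate of the expected number of real zeros of a random Kostlan
# polynomial of degree n: independent Gaussian coefficients a_k with variance
# binomial(n, k). The classical answer is exactly sqrt(n).
import numpy as np
from math import comb, sqrt

def count_real_roots(coeffs, tol=1e-8):
    roots = np.roots(coeffs)
    return np.sum(np.abs(roots.imag) < tol)

def expected_real_zeros(n, trials=2000, rng=np.random.default_rng(0)):
    std = np.sqrt([comb(n, k) for k in range(n + 1)])
    return sum(count_real_roots(rng.normal(size=n + 1) * std)
               for _ in range(trials)) / trials

print(expected_real_zeros(25), sqrt(25))   # the estimate should be close to 5.0
```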
The publication "Towards a theory of non-commutative optimization" is a foundational paper for an entire new field at the interface of group actions and convex optimization, and it could well be the start of a whole new research area. This work will be published in expanded and revised form as a research monograph in the Annals of Mathematics Studies series.

The article "Rigid continuation paths II. Structured polynomial systems" is a milestone in our understanding of numerical algorithms in complex algebraic geometry. For the first time, a homotopy algorithm for computing a zero of a natural class of *structured* polynomial systems is presented whose typical efficiency is rigorously proven.

It is a classical fact that the intersection of subvarieties of a given complex variety can be described by the multiplication in a ring (cohomology or Chow ring). This is called Schubert calculus in the case of complex Grassmann manifolds. For counting real zeros, we defined the probabilistic intersection ring of Riemannian homogeneous spaces, whose multiplication mirrors the intersection of randomly moved submanifolds. However, this ring is typically infinite-dimensional (it is a Banach algebra). The elements of this ring are classes of zonoids (certain convex bodies) in the space of skew-symmetric forms on the tangent space of the homogeneous space. There is a close relation to integral geometry and convex valuations. This new structure bridges real algebraic geometry with convex geometry, mixed volumes, and the Alexandrov-Fenchel inequality. It appears to satisfy the properties of a Kähler package.
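Schematically, and following the description above (with normalizations omitted), the product of two classes of complementary dimension in this ring evaluates to an expected intersection count:

```latex
% X, Y: submanifolds of the homogeneous space M with dim X + dim Y = dim M;
% g is a uniformly (Haar) distributed isometry of M.
\[
  [X] \cdot [Y] \;\longmapsto\; \mathbb{E}_{g}\, \#\bigl( X \cap gY \bigr)
\]
```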
Figure: Norm minimization along a group orbit