Periodic Reporting for period 2 - CID (Computing with Infinite Data)
Reporting period: 2019-04-01 to 2023-03-31
More and more applications rely on computers, but in many situations computer programs produce inaccuracies due to, for example, rounding errors. Such inaccuracies can create vulnerabilities with unexpected consequences. For example, the first Ariane 5 launch (a rocket used by the European Space Agency) failed because of this kind of inaccuracy. We develop tools that can help resolve these inaccuracies. The underlying problem is a gap between the mathematical theory and its implementation in computer programs.
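As a simple illustration of such rounding errors (our own sketch, not code from the project), a few lines of Python show how ordinary double-precision arithmetic already deviates from the exact mathematical result:

```python
# Double-precision floating point cannot represent 0.1 exactly,
# so small rounding errors appear and accumulate.
print(0.1 + 0.2 == 0.3)   # False
print(0.1 + 0.2)          # 0.30000000000000004

# Summing 0.1 a million times drifts away from the exact value 100000.0.
total = sum(0.1 for _ in range(1_000_000))
print(total - 100_000.0)  # small but non-zero accumulated error
```

Tools based on infinite-precision data are meant to avoid exactly this kind of discrepancy.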
One way out of the problem is to formally prove that engineering applications function correctly: the autopilot of an aircraft, for example, must work flawlessly, since human lives depend on the technology being correct at all times. Software is usually tested against a variety of possible scenarios, but this does not guarantee that the software is completely safe: vulnerabilities can still exist and can have unintended consequences. Using infinite-precision data helps to develop tools that formally prove that programs function according to their respective requirements, which means that we can be sure that the software works correctly.
In another part of the investigation, we would like to understand which problems are inherently difficult or even impossible to solve with the help of computers, and how they fundamentally differ from easier problems. This knowledge is valuable for software developers, since they no longer need to spend their time searching for alternatives in such cases.
Work on all of these objectives has been very successful:
The logical system for specifying algorithms and extracting algorithms from proofs has been dramatically extended and improved: constructive reasoning is now seamlessly integrated with classical logic; an extension for specifying concurrent processes at a very high level is available and has proven practically useful, since it improves efficiency; computational garbage can be avoided completely; and the computational complexity of computation on infinite data can be controlled through precise estimates of look-ahead.
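The role of look-ahead can be illustrated with a minimal Python sketch (an illustration of the general idea only, not the project's extraction machinery; all function names below are our own illustrative helpers): a real number is represented by a function that, for any requested precision n, returns a rational approximation with error at most 2^-n; addition then only needs to query its arguments at precision n + 1, i.e. with a look-ahead of one, to answer at precision n.

```python
from fractions import Fraction

# A real number is represented by a function that, given n, returns a
# rational q with |q - x| <= 2**(-n)  (a simple Cauchy representation).

def real_from_fraction(q: Fraction):
    """An exact rational, viewed as an infinite-precision real."""
    return lambda n: q

def add(x, y):
    """x + y: querying both arguments at precision n + 1 (look-ahead 1)
    guarantees the result is within 2**(-n) of the true sum."""
    return lambda n: x(n + 1) + y(n + 1)

def sqrt2(n: int) -> Fraction:
    """An approximation of sqrt(2) with error <= 2**(-n), by bisection."""
    lo, hi = Fraction(1), Fraction(2)
    while hi - lo > Fraction(1, 2 ** n):
        mid = (lo + hi) / 2
        lo, hi = (mid, hi) if mid * mid <= 2 else (lo, mid)
    return lo

# Usage: sqrt(2) + 1/3 to 20 binary digits of precision.
z = add(sqrt2, real_from_fraction(Fraction(1, 3)))
print(float(z(20)))
```

Multiplication, in contrast, needs a look-ahead that depends on the size of its arguments, which is why precise look-ahead estimates are what controls the complexity of computations on infinite data.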
For several important types of differential equations appearing in physics and engineering, the intrinsic complexity of solving them has been determined.
On the applied side, existing software packages using infinite-precision data have been further extended and the interplay of the different packages has been facilitated.
Two measures of descriptive complexity were studied: (1) how complex it is to obtain a set from open sets using Boolean operations; (2) how complex it is to test membership in the set. Both measures are equivalent on countably-based spaces, but not in general, the reason probably being the mismatch between topological and sequential aspects of topological spaces.
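For orientation (a standard definition from descriptive set theory, not a new result of the project), the first measure refines the Borel hierarchy; on general, possibly non-metrizable spaces its levels are built from differences of open sets rather than from complements:

\[ \Sigma^0_1(X) = \{\, U \subseteq X : U \text{ open} \,\}, \qquad \Sigma^0_2(X) = \Bigl\{\, \bigcup_{n \in \mathbb{N}} (U_n \setminus V_n) : U_n, V_n \text{ open} \,\Bigr\}, \]

and higher levels are obtained by taking countable unions of differences of sets from lower levels.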
Much work has been done on computable problems. First of all, the computability of certain dynamical systems has been studied. A typical example of a "chaotic" attractor, arising in systems that exhibit very complex behaviour and sensitive dependence on initial conditions and that appear in many applications such as meteorology, is the Lorenz attractor. The computability of geometric Lorenz attractors and of their physical measures was established.
Computable problems are classified according to the space and/or time resources necessarily needed for their computation. The behaviour of dynamical systems was studied in this respect. Attractors can typically be computed in an efficient manner but, for some pathological parameters, the cost of computing them can be arbitrarily high. For Lorenz-like attractors the main sources contributing to their computational complexity have been identified. Among them, the flow passing near a saddle equilibrium point plays an important role in the overall complexity.
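For a concrete picture (a standard textbook illustration in ordinary floating point, not the rigorous methods developed in the project), the Lorenz system with the classical parameters sigma = 10, rho = 28, beta = 8/3 already shows why certified computation is delicate: two initial conditions differing by 10^-9 separate to macroscopic distance after a short integration time.

```python
import numpy as np

# Classical Lorenz system with standard parameters; a plain fixed-step RK4
# integrator, used only to illustrate sensitive dependence on initial data.

def lorenz(v, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = v
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def rk4_step(v, h):
    k1 = lorenz(v)
    k2 = lorenz(v + 0.5 * h * k1)
    k3 = lorenz(v + 0.5 * h * k2)
    k4 = lorenz(v + h * k3)
    return v + (h / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

a = np.array([1.0, 1.0, 1.0])
b = a + np.array([1e-9, 0.0, 0.0])    # perturb the x-coordinate by 10**-9
h, steps = 0.01, 3000                 # integrate up to t = 30
for _ in range(steps):
    a, b = rk4_step(a, h), rk4_step(b, h)
print("separation after t = 30:", np.linalg.norm(a - b))
```

Typically the separation printed at the end is of the order of the attractor's size, even though the initial difference was only 10^-9.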
Further highlights in this research are:
A restriction of the famous 3-body problem is computable in polynomial time on average. The 3-body problem features prominently in astronomy, e.g. when the motion of a spacecraft under the gravitational pull of, say, the Earth and the Moon is to be computed.
One-soliton and two-soliton solutions to the Korteweg-de Vries partial differential equation for shallow water waves are computable in polynomial time.
A variant of the (still open) Hilbert’s 16th problem was considered. Hilbert’s 16th problem asks for an upper bound on the number of limit cycles that a polynomial system of degree n has in the plane (a concrete system with exactly one limit cycle is sketched after this list). Instead, the problem of determining the exact number of periodic orbits (or limit cycles) over the plane was considered. As shown, this problem is uniformly non-computable in general. However, the exact number of periodic orbits can be computed if the polynomial systems are structurally stable and are considered over the unit ball.
A natural encoding of the space of divergence-free vector fields on the unit square in the Euclidean plane was constructed and used to show, first, that the mild solution of the Stokes-Dirichlet problem and, then, that a strong local solution to the non-linear incompressible Navier-Stokes initial value problem are uniformly computable, thus solving a problem of Pour-El and Richards.
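To make the notion of a limit cycle concrete (a standard textbook example, not a result of the project), consider the cubic polynomial system

\[ \dot x = -y + x\,(1 - x^2 - y^2), \qquad \dot y = x + y\,(1 - x^2 - y^2). \]

In polar coordinates it becomes \( \dot r = r(1 - r^2),\ \dot\theta = 1 \), so every trajectory except the origin spirals towards the unit circle \( x^2 + y^2 = 1 \): this degree-3 system has exactly one limit cycle. Hilbert’s 16th problem asks how many such cycles a polynomial system of a given degree can have; the project's result concerns whether their exact number can be computed from a description of the system.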
The computability result for geometric Lorenz attractors and their physical measures was an important achievement. Before this result, research on the Lorenz attractor was mainly based on non-rigorous (i.e. truncation-error-prone) numerical simulations, and an existence proof for the Lorenz attractor had long been elusive.
As mentioned, quasi-Polish spaces have so far played a prominent role in research on extending descriptive set theory to larger classes of spaces. A major discovery was the identification of four spaces which can serve as canonical examples of spaces failing to be quasi-Polish. This result has a strong impact on further research.
An axiomatic theory of randomness has been developed which takes a fundamental notion of “independence” as primitive, instead of following the standard approach of defining randomness via computability-theoretic definability criteria. The axioms allow a development of probability theory along novel lines, with probability measures being determined by their random elements.