Theoretical and Algorithmic Foundations for Future Proof Information and Inference Systems

Final Report Summary - FUTURE PROOF (Theoretical and Algorithmic Foundations for Future Proof Information and Inference Systems)

A critical technological challenge for emerging information systems is to acquire, analyze, and learn from the ever-increasing high-dimensional data produced by natural and man-made phenomena. Sampling, streaming, and recording in even the most basic applications now produce a data deluge that severely stresses the available analog-to-digital converters, digital communication and storage resources, and easily swamps the back-end processing and learning systems.

Surprisingly, while the ambient data dimension is large in many problems, the relevant information typically resides in a much lower-dimensional space. Viewed combinatorially and geometrically, natural constraints often cause data to cluster along low-dimensional structures, such as unions of subspaces or manifolds, that have few degrees of freedom relative to their ambient size. This powerful notion suggests the potential for highly efficient processing and learning methods that capture and exploit the inherent model, or the data’s “information level.”
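The idea that few degrees of freedom suffice can be made concrete with a toy sparse-recovery sketch. The example below is purely illustrative and not part of the project's results: a hand-crafted dictionary of six unit-norm atoms in a 3-dimensional measurement space, a 2-sparse coefficient vector, and greedy matching pursuit, which recovers the coefficients from only three measurements because the signal has only two active degrees of freedom.

```python
# Toy dictionary: 6 unit-norm atoms (columns) in R^3.  The ambient
# coefficient dimension is n = 6, but only m = 3 measurements are taken.
# All values here are hand-picked for illustration.
A = [
    [1.0, 0.0, 0.0, 0.6, 0.8, 0.0],
    [0.0, 1.0, 0.0, 0.8, 0.0, 0.6],
    [0.0, 0.0, 1.0, 0.0, 0.6, 0.8],
]
m, n = len(A), len(A[0])

def column(j):
    return [A[i][j] for i in range(m)]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

# Ground truth: a 2-sparse vector -- only 2 of 6 degrees of freedom are used.
x_true = [0.0, 5.0, 0.0, 0.0, -3.0, 0.0]
y = [dot(row, x_true) for row in A]   # 3 measurements of a 6-dim unknown

def matching_pursuit(y, k):
    """Greedy sparse recovery: repeatedly select the atom most correlated
    with the residual and subtract its contribution."""
    x = [0.0] * n
    r = list(y)
    for _ in range(k):
        corrs = [dot(column(j), r) for j in range(n)]
        j = max(range(n), key=lambda j: abs(corrs[j]))
        x[j] += corrs[j]   # valid update because atoms are unit-norm
        r = [r[i] - corrs[j] * A[i][j] for i in range(m)]
    return x

x_hat = matching_pursuit(y, k=2)
print(x_hat)   # close to [0, 5, 0, 0, -3, 0] up to floating-point error
```

In this crafted instance the two selected atoms happen to be orthogonal, so plain matching pursuit recovers the coefficients exactly; real recovery algorithms (orthogonal matching pursuit, convex relaxation, iterative thresholding) handle general incoherent dictionaries with provable guarantees.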

To this end, the ERC project FUTURE PROOF investigated three closely interrelated research and technology themes to develop:

1. Information scalable optimization and data acquisition that reduce the computations and the measurements required for stable solutions to inverse problems to the theoretical minimum.
2. Learning theory and methods for low-dimensional signal models that provably extract the low-dimensional model information from data and estimate key parameters from dimensionality-reduced measurements.
3. Future-proof technology pursuits that remove critical sampling and energy bottlenecks in analog-to-digital conversion (ADC) and attack the increasingly important problems of neural signal acquisition as well as computer-to-computer communications.

Our results support the emerging perspective that real progress on the data-to-information transition requires a coordinated effort based on combinatorial and geometric foundations that unify low-dimensional modeling frameworks with learning theory. A salient feature of our approach is computational thinking that best leverages our increasingly pervasive and sophisticated cyber-infrastructure.