
Exploring Duality for Future Data-driven Modelling

Periodic Reporting for period 4 - E-DUALITY (Exploring Duality for Future Data-driven Modelling)

Reporting period: 2023-04-01 to 2024-03-31

Future data-driven modelling is increasingly challenging because of the growing complexity of many systems, for instance in energy, environmental and climate modelling, traffic and transport, industrial processes, health and safety. It is therefore desirable to conceive new frameworks for tailoring models to the characteristics of the system and the data, for tasks such as regression, classification, clustering, dimensionality reduction, outlier detection and dynamical systems modelling. A key element here is a good understanding of the representations of models.

The aim of the E-DUALITY project is to explore and engineer the potential of duality principles for future data-driven modelling. Duality principles play an important role in mathematics, physics and optimization; within this project they enable the study of different representations of models. In support vector machines, for example, models can be represented in primal form by feature maps or in dual form by kernel functions. Depending on the dimensionality of the input space and the number of training data, one representation can be more suitable to employ than the other. Another recent example is conjugate feature duality in restricted kernel machines, which makes it possible to establish new and unexpected connections between kernel machines, neural networks and deep learning.
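
To make this choice of representation concrete, consider ridge regression, one of the simplest kernel-based models. The sketch below (illustrative only, not project code; variable names are ours) solves the same problem in the primal, through an explicit weight vector over d features, and in the dual, through coefficients over n training points and a linear kernel. The two predictions coincide, so the cheaper representation depends on whether n or d is larger.

    # Primal vs. dual representations for ridge regression (linear kernel).
    # The primal solve is a d x d system, the dual solve an n x n system.
    import numpy as np

    rng = np.random.default_rng(0)
    n, d, lam = 200, 5, 1e-2
    X = rng.standard_normal((n, d))
    y = X @ rng.standard_normal(d) + 0.1 * rng.standard_normal(n)

    # Primal: explicit weight vector w over the input features.
    w = np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

    # Dual: coefficients alpha over the training points, via the kernel matrix.
    K = X @ X.T                   # linear kernel k(x_i, x_j) = x_i^T x_j
    alpha = np.linalg.solve(K + lam * np.eye(n), y)

    # Both representations give the same prediction on a new point.
    x_new = rng.standard_normal(d)
    f_primal = x_new @ w          # uses the feature map directly
    f_dual = (X @ x_new) @ alpha  # uses kernel evaluations k(x_new, x_i)
    assert np.isclose(f_primal, f_dual)

Here the primal solve costs on the order of d^3 operations and the dual solve n^3, which is the trade-off mentioned above; with a nonlinear kernel the dual form remains available even when the feature map is infinite-dimensional.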

The overall objective of the project is to obtain a generically applicable framework with unifying insights that includes both parametric and kernel-based approaches and handles problems at different levels of system complexity.
We have worked on achieving a unifying framework between deep learning, neural networks and kernel machines. The most promising direction is to consider core models based on Restricted Kernel Machines or on Least Squares Support Vector Machines. These allow working either in a parametric way (e.g. with deep neural networks or convolutional feature maps) or in a kernel-based way through the dual representation. Because of their connection with Restricted Boltzmann Machines, Restricted Kernel Machines provide a setting for generative modelling; moreover, they are suitable for multi-view and tensor-based models, deep learning, latent space exploration, explainability and robustness. Remarkable new connections have been shown between self-attention in transformers and kernel singular value decomposition within the least squares support vector machine framework, with its primal and dual representations. These new settings, with solid foundations and a clear understanding of their primal and dual representations, can be exploited in a wide range of application fields.
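
At the heart of the restricted kernel machine construction is a simple inequality. As a sketch (the notation here is generic, not taken verbatim from the project publications), conjugate feature duality introduces hidden features h by bounding a quadratic error term from below with the Fenchel-Young inequality: for a regularization constant \lambda > 0 and any error vector e,

    \[
      \frac{1}{2\lambda}\, e^{\top} e \;\geq\; e^{\top} h \;-\; \frac{\lambda}{2}\, h^{\top} h
      \quad \text{for all } h, \qquad \text{with equality at } h = e/\lambda .
    \]

Substituting this bound into a least-squares objective yields an expression that is linear in e and quadratic in the hidden features h, which is what creates the bridge to Restricted Boltzmann Machine style energy functions.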


Recent publications:

Pandey A., Schreurs J., Suykens J.A.K., "Generative Restricted Kernel Machines: A Framework for Multi-view Generation and Disentangled Feature Learning", Neural Networks, vol. 135, Mar. 2021, pp. 177-191.

Pandey A., Fanuel M., Schreurs J., Suykens J.A.K., "Disentangled Representation Learning and Generation with Manifold Optimization", Neural Computation, vol. 34, no. 10, 2022, pp. 2009-2036.

Fanuel M., Schreurs J., Suykens J.A.K., "Nystrom landmark sampling and regularized Christoffel functions", Machine Learning, vol. 111, 2022, pp. 2213-2254.

Tonin F., Lambert A., Patrinos P., Suykens J.A.K., "Extending Kernel PCA through Dualization: Sparsity, Robustness and Fast Algorithms", ICML 2023.

He M., He F., Shi L., Huang X., Suykens J.A.K., "Learning with Asymmetric Kernels: Least Squares and Feature Interpretation", IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 45, no. 8, Aug. 2023, pp. 10044-10054.

Chen Y., Tao Q., Tonin F., Suykens J.A.K., "Primal-Attention: Self-attention through Asymmetric Kernel SVD in Primal Representation", NeurIPS 2023.

Tonin F., Tao Q., Patrinos P., Suykens J.A.K., "Deep Kernel Principal Component Analysis for Multi-level Feature Learning", Neural Networks, vol. 170, Feb. 2024, pp. 578-595.

Tao Q., Tonin F., Patrinos P., Suykens J.A.K., "Tensor-based Multi-view Spectral Clustering via Shared Latent Space", Information Fusion, vol. 108, Aug. 2024, pp. 1-15.

Tao Q., Tonin F., Lambert A., Chen Y., Patrinos P., Suykens J.A.K., "Learning in Feature Spaces via Coupled Covariances: Asymmetric Kernel SVD and Nystrom method", ICML 2024.

Chen Y., Tao Q., Tonin F., Suykens J.A.K., "Self-Attention through Kernel-Eigen Pair Sparse Variational Gaussian Processes", ICML 2024.

Achten S., Tonin F., Patrinos P., Suykens J.A.K., "Unsupervised Neighborhood Propagation Kernel Layers for Semi-supervised Node Classification", in Proc. of the AAAI Conference on Artificial Intelligence (AAAI), Vancouver, Canada, Mar. 2024, pp. 10766-10774.

In the current state of the art, kernel-based models such as support vector machines are often set in opposition to neural networks and deep learning. Thanks to the new insights we obtained on synergies and a unifying framework between kernel machines, neural networks and deep learning, the results of this project go beyond this state of the art, and duality principles play a key role in achieving this. New and unexpected connections have been found through conjugate feature duality between kernel principal component analysis, least squares support vector machines, restricted Boltzmann machines and deep Boltzmann machines. Moreover, the results of this project show that self-attention in transformers can be considered in synergy with kernel-based models through duality. Hence, new solid foundations can be established for a wide range of tasks and applications in data-driven modelling.
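The primal/dual correspondence behind these connections can be illustrated in its simplest linear form. The sketch below (illustrative only, not project code; variable names are ours) shows for uncentred linear PCA that the d x d covariance-type eigenproblem (primal) and the n x n Gram/kernel eigenproblem (dual) share their nonzero eigenvalues and recover the same principal-component scores; kernel principal component analysis generalizes the dual side to nonlinear kernels.

    # Primal vs. dual eigenproblems for (uncentred, linear) PCA.
    import numpy as np

    rng = np.random.default_rng(1)
    n, d = 50, 8
    X = rng.standard_normal((n, d))

    # Primal: d x d covariance-type matrix; dual: n x n Gram/kernel matrix.
    lp, Vp = np.linalg.eigh(X.T @ X)
    ld, Vd = np.linalg.eigh(X @ X.T)

    # The nonzero spectra coincide (eigh returns ascending eigenvalues).
    assert np.allclose(lp, ld[-d:])

    # Scores along the top component agree up to sign.
    s_primal = X @ Vp[:, -1]              # project data onto primal eigenvector
    s_dual = np.sqrt(ld[-1]) * Vd[:, -1]  # rescale the dual eigenvector
    assert np.allclose(np.abs(s_primal), np.abs(s_dual))

Whether the primal or the dual eigenproblem is preferable again depends on the number of data points versus the dimension of the feature space, exactly the kind of choice that the duality framework makes systematic.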
(Figure: Primal and dual model representations related to parametric and kernel-based models)