Periodic Reporting for period 4 - E-DUALITY (Exploring Duality for Future Data-driven Modelling)
Reporting period: 2023-04-01 to 2024-03-31
The aim of the E-DUALITY project is to explore and engineer the potential of duality principles for future data-driven modelling. Duality principles play an important role in mathematics, physics, and optimization. Within the context of this project, they make it possible to study different representations of models. For example, in support vector machines, models can be represented in primal and dual forms through feature maps or kernel functions, respectively. Depending on the dimensionality of the input space and the number of training data points, one representation can be more suitable to employ than the other. Another recent example is conjugate feature duality in restricted kernel machines, which establishes new and unexpected connections between kernel machines, neural networks, and deep learning.
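The primal/dual choice described above can be illustrated with a minimal sketch, using ridge regression with a linear feature map as a simple stand-in for the kernel-based models studied in the project (the variable names and the regularization value are illustrative assumptions, not taken from the project's software):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))   # 50 training points, 3 input dimensions
y = rng.normal(size=50)
lam = 0.1                      # regularization parameter (illustrative)

# Primal representation: explicit feature map Phi, solve a d x d system
# w = (Phi^T Phi + lam I)^{-1} Phi^T y
Phi = X  # linear feature map, for illustration
w = np.linalg.solve(Phi.T @ Phi + lam * np.eye(Phi.shape[1]), Phi.T @ y)
pred_primal = Phi @ w

# Dual representation: kernel matrix K = Phi Phi^T, solve an n x n system
# alpha = (K + lam I)^{-1} y
K = Phi @ Phi.T
alpha = np.linalg.solve(K + lam * np.eye(K.shape[0]), y)
pred_dual = K @ alpha

# Both representations yield identical predictions
print(np.allclose(pred_primal, pred_dual))
```

The primal solve scales with the feature dimension d, the dual solve with the number of training points n, which is why one representation can be preferable to the other depending on the problem.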
The overall objective of the project is to obtain a generically applicable framework with unifying insights that includes both parametric and kernel-based approaches and is applicable to problems with different system complexity levels.
Recent publications:
Pandey A., Schreurs J., Suykens J.A.K. ``Generative Restricted Kernel Machines: A Framework for Multi-view Generation and Disentangled Feature Learning'', Neural Networks, vol. 135, March 2021, pp. 177-191.
Pandey A., Fanuel M., Schreurs J., Suykens J.A.K. ``Disentangled Representation Learning and Generation with Manifold Optimization'', Neural Computation, vol. 34, no. 10, 2022, pp. 2009-2036.
Fanuel M., Schreurs J., Suykens J.A.K. ``Nyström landmark sampling and regularized Christoffel functions'', Machine Learning, vol. 111, 2022, pp. 2213-2254.
Tonin F., Lambert A., Patrinos P., Suykens J.A.K. ``Extending Kernel PCA through Dualization: Sparsity, Robustness and Fast Algorithms'', ICML 2023.
He M., He F., Shi L., Huang X., Suykens J.A.K. ``Learning with Asymmetric Kernels: Least Squares and Feature Interpretation'', IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 45, no. 8, Aug. 2023, pp. 10044-10054.
Chen Y., Tao Q., Tonin F., Suykens J.A.K. ``Primal-Attention: Self-attention through Asymmetric Kernel SVD in Primal Representation'', NeurIPS 2023.
Tonin F., Tao Q., Patrinos P., Suykens J.A.K. ``Deep Kernel Principal Component Analysis for Multi-level Feature Learning'', Neural Networks, vol. 170, Feb. 2024, pp. 578-595.
Tao Q., Tonin F., Patrinos P., Suykens J.A.K. ``Tensor-based Multi-view Spectral Clustering via Shared Latent Space'', Information Fusion, vol. 108, Aug. 2024, pp. 1-15.
Tao Q., Tonin F., Lambert A., Chen Y., Patrinos P., Suykens J.A.K. ``Learning in Feature Spaces via Coupled Covariances: Asymmetric Kernel SVD and Nyström Method'', ICML 2024.
Chen Y., Tao Q., Tonin F., Suykens J.A.K. ``Self-Attention through Kernel-Eigen Pair Sparse Variational Gaussian Processes'', ICML 2024.
Achten S., Tonin F., Patrinos P., Suykens J.A.K. ``Unsupervised Neighborhood Propagation Kernel Layers for Semi-supervised Node Classification'', in Proc. of the AAAI Conference on Artificial Intelligence (AAAI), Vancouver, Canada, Mar. 2024, pp. 10766-10774.