"Sparse models are at the core of many research domains where the sheer volume and high dimensionality of digital data require concise data descriptions for efficient information processing. A flagship application of sparsity is compressed sensing, which exploits sparsity to acquire data using limited resources. Besides sparsity, a key pillar of compressed sensing is the use of random low-dimensional projections.
Sparse and redundant representations standardly rely on overcomplete dictionaries of prototype signals called atoms. The foundational vision of this proposal is that deploying sparse models at large scale is only possible with a new generation of models that go beyond dictionaries, combining computational efficiency with the ability to provide sparse and structured data representations.
Further, I believe that the true impact of compressed sensing has been to unearth an extremely powerful yet counter-intuitive tool: random projections, which open new avenues in machine learning. I envision applications to data sizes and collection volumes that today's technologies cannot handle.
A particular challenge is to adapt the models to the data by learning from a training corpus. In line with frontier research on sparse decomposition algorithms, I will focus on obtaining provably good yet computationally efficient algorithms for learning sparse models from collections of training data, with geometric insight into the reasons for their success.
My research program is expected to impact the whole data processing chain, from the analog level (data acquisition) to high-level processing (mining, searching), where sparsity has been identified as a key factor in addressing the “curse of dimensionality”. Moreover, the theoretical and algorithmic framework I will develop will be directly applied to targeted audiovisual and biomedical applications."
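As an illustration of the compressed-sensing principle mentioned in the abstract (not of the proposal's own methods), the following minimal sketch recovers a sparse signal from far fewer random Gaussian measurements than its ambient dimension, using a textbook Orthogonal Matching Pursuit decoder. All dimensions, the seed, and the function name are illustrative choices, not taken from the proposal.

```python
import numpy as np

rng = np.random.default_rng(0)

n, m, k = 256, 80, 5                 # ambient dimension, measurements, sparsity
# A k-sparse signal x in R^n
x = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x[support] = rng.standard_normal(k)

# Random Gaussian measurement matrix: the "random low-dimensional projection"
A = rng.standard_normal((m, n)) / np.sqrt(m)
y = A @ x                            # m compressed measurements, m << n

def omp(A, y, k):
    """Orthogonal Matching Pursuit: greedily estimate a k-sparse x from y = A x."""
    residual = y.copy()
    S = []                           # estimated support
    for _ in range(k):
        # Select the atom (column of A) most correlated with the residual
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in S:
            S.append(j)
        # Least-squares fit of y on the currently selected atoms
        coef, *_ = np.linalg.lstsq(A[:, S], y, rcond=None)
        residual = y - A[:, S] @ coef
    x_hat = np.zeros(A.shape[1])
    x_hat[S] = coef
    return x_hat

x_hat = omp(A, y, k)
err = np.linalg.norm(x_hat - x) / np.linalg.norm(x)
```

With these dimensions (80 random measurements of a 5-sparse signal in dimension 256), recovery succeeds with overwhelming probability, which is the counter-intuitive power of random projections the abstract refers to.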