Project description
Shedding light on the black box of Bayesian algorithms for big data
Bayesian analysis, a method of statistical inference that uses probability to update beliefs about a model in light of observations, underpins many statistical and machine learning algorithms for big data. It helps us understand the processes behind complex problems, such as assessing climate change and tracking the spread of disease. However, Bayesian methods are reaching their limits in handling the explosion of available data, and attempts to speed up processing are largely 'black box' solutions. The EU-funded BigBayesUQ project is developing a theory for scalable Bayesian methods that makes it possible to quantify their performance, limitations and uncertainty. This will improve their accuracy and, in turn, their acceptance by a wide community of scientists and researchers.
Objective
Recent years have seen a rapid increase in available information. This has created an urgent need for fast statistical and machine learning methods that can scale up to big data sets. Standard approaches, including the now routinely used Bayesian methods, are becoming computationally infeasible, especially in complex models with many parameters and large data sizes. A variety of algorithms have been proposed to speed up these procedures, but these are typically black-box methods with very limited theoretical support. In fact, empirical evidence shows that such methods can perform poorly, which is especially concerning in real-world applications, e.g. in medicine.

In this project I shall open up the black box and provide a theory for scalable Bayesian methods, combining recent, state-of-the-art techniques from Bayesian nonparametrics, empirical process theory, and machine learning. I focus on two very important classes of scalable techniques: variational and distributed Bayes. I shall establish guarantees, but also limitations, of these procedures for estimating the parameter of interest and for quantifying the corresponding uncertainty, within a framework that will also be convincing outside the Bayesian paradigm. As a result, scalable Bayesian techniques will perform more accurately and gain better acceptance by a wider community of scientists and practitioners.

The proposed research, although motivated by real-world problems, is of a mathematical nature. In the analysis I consider mathematical models that are routinely used in various fields (e.g. high-dimensional linear and logistic regressions are the workhorses of econometrics and genetics). My theoretical results will provide principled new insights that can be used, for instance, in multiple specific applications I am involved in, including developing novel statistical methods for understanding fundamental questions in cosmology and the early detection of dementia using multiple data sources.
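To make the distributed ("divide-and-conquer") Bayes idea concrete, the following is a minimal, stdlib-only sketch — not the project's method — of a standard embarrassingly parallel scheme: the data are split across machines, each machine computes a posterior for its shard under a fractionated prior, and the shard posteriors are recombined. In the conjugate normal-mean toy model used here the recombination is exact; the project's theory concerns when and how well such schemes work in far less forgiving models.

```python
import random

def posterior(prior_mean, prior_prec, data, noise_var):
    """Normal-normal conjugate update; returns (posterior mean, precision)."""
    prec = prior_prec + len(data) / noise_var
    mean = (prior_prec * prior_mean + sum(data) / noise_var) / prec
    return mean, prec

random.seed(0)
data = [random.gauss(1.5, 1.0) for _ in range(600)]
mu0, prec0, sigma2 = 0.0, 1.0, 1.0  # prior mean/precision, known noise variance

# Full-data posterior, computed on a single machine.
full_mean, full_prec = posterior(mu0, prec0, data, sigma2)

# Distributed version: k shards, each using 1/k of the prior precision
# so that the prior is not counted k times when the pieces are combined.
k = 3
shards = [data[i::k] for i in range(k)]
parts = [posterior(mu0, prec0 / k, shard, sigma2) for shard in shards]

# Recombine: precisions add; means combine precision-weighted.
comb_prec = sum(p for _, p in parts)
comb_mean = sum(m * p for m, p in parts) / comb_prec

# In this conjugate Gaussian case the combination is exact.
assert abs(comb_mean - full_mean) < 1e-10
assert abs(comb_prec - full_prec) < 1e-10
```

The exactness here is special to the Gaussian setting; for general models the product of shard posteriors only approximates the full posterior, and quantifying that gap (and the resulting uncertainty statements) is precisely the kind of question the project addresses.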
Scientific field
- natural sciences / computer and information sciences / data science / big data
- natural sciences / mathematics / applied mathematics / statistics and probability / Bayesian statistics
- natural sciences / computer and information sciences / artificial intelligence / machine learning
- natural sciences / mathematics / applied mathematics / mathematical model
Programme(s)
- HORIZON.1.1 - European Research Council (ERC) Main Programme
Funding scheme
HORIZON-AG - HORIZON Action Grant Budget-Based
Host institution
20136 Milano
Italy