Some of the most important and exciting challenges of our ‘information age’ have driven the development of novel statistical methodology and algorithms designed for inference settings involving high dimensionality, graphical and network structures, inverse problems, ‘big data’, stochastic differential equations, diffusion processes, cosmic microwave background maps, brain tomography, and more.
While an abundance of algorithms is now available, a scientifically rigorous theory of uncertainty quantification and statistical decision making for such procedures has yet to be developed. Traditional approaches such as maximum likelihood estimation or parametric Bayesian inference cannot be applied naively in increasingly complex contemporary statistical models. Yet the construction of confidence statements and of critical values for hypothesis tests is of crucial importance for all applications of the statistical sciences to the modern world.
In this research we propose an objective, mathematically rigorous, and practical paradigm for uncertainty quantification in modern statistical inference problems, and illustrate how this approach can be used in some of the recently emerged areas of statistics. Our theory can validate both Bayesian and frequentist approaches to statistical inference, and can be expected to be optimal in an information-theoretic sense. It has potential impact on all areas of scientific theory building; on legal and medical practice; on public management of the internet, modern media, and other information structures; and on the foundations of the mathematical discipline of statistics itself.
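To make the notion of a ‘confidence statement’ concrete, the following minimal sketch contrasts the two standard forms of uncertainty quantification mentioned above: a frequentist confidence interval and a Bayesian credible interval for a mean. This is purely illustrative of the general idea, not the methodology proposed in this research; the Gaussian model, the conjugate prior, and all numerical values are assumptions chosen for the example.

```python
import math
import statistics

def frequentist_ci(data, z=1.96):
    """Approximate 95% confidence interval for the mean,
    using the normal approximation mean +/- z * s / sqrt(n)."""
    n = len(data)
    m = statistics.mean(data)
    se = statistics.stdev(data) / math.sqrt(n)
    return (m - z * se, m + z * se)

def bayesian_credible_interval(data, prior_mean=0.0, prior_var=100.0,
                               noise_var=1.0, z=1.96):
    """Approximate 95% credible interval for the mean under a conjugate
    Normal prior, assuming Normal(mu, noise_var) observations with
    known noise variance (an illustrative modelling assumption)."""
    n = len(data)
    post_var = 1.0 / (1.0 / prior_var + n / noise_var)
    post_mean = post_var * (prior_mean / prior_var + sum(data) / noise_var)
    half_width = z * math.sqrt(post_var)
    return (post_mean - half_width, post_mean + half_width)

# Hypothetical measurements, for illustration only.
data = [2.1, 1.8, 2.4, 2.0, 1.9, 2.2, 2.3, 1.7]
print("frequentist CI:", frequentist_ci(data))
print("Bayesian credible interval:", bayesian_credible_interval(data))
```

Both procedures output an interval, but the guarantees differ: the confidence interval covers the true mean with 95% probability under repeated sampling, while the credible interval contains 95% of the posterior probability given the prior. Reconciling and validating both types of statement is precisely the kind of question the paradigm above addresses.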
Field of science
- /natural sciences/computer and information sciences/data science/big data
- /natural sciences/mathematics/pure mathematics/mathematical analysis/differential equations
- /natural sciences/mathematics/applied mathematics/statistics and probability/bayesian statistics