Final Activity Report Summary - RKNIVEN-MC-IIF-06 (Maximum Entropy and Combinatorial Information Theory: Concepts and Analysis of Complex Systems)
This project concerned the concept of entropy, a measure of the disorder of a system and one of the most profound yet least understood discoveries of human knowledge, which underpins present-day statistical physics, thermodynamics, information theory and coding, optimisation and data extraction. The project re-examined the combinatorial, or probabilistic, basis of entropy, the second-oldest definition of entropy, given by Boltzmann in 1877, in which a probabilistic system composed of discrete entities is identified with its most probable realisation: the MaxProb principle. This was united with the generic formulation of statistical mechanics given by Jaynes in 1957, the maximum entropy (MaxEnt) principle, to develop an enlarged theory of probabilistic inference, enabling the analysis of complex probabilistic systems of any kind.
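Boltzmann's MaxProb principle can be illustrated with a minimal sketch (illustrative only, not code from the project): for N distinguishable entities allocated to s equiprobable categories, the multiplicity W = N!/(n_1!...n_s!) is maximised by the most even occupancy, which is identified with the system's most probable realisation.

```python
from math import factorial
from itertools import product

def multiplicity(ns):
    """Multinomial weight W = N! / (n_1! ... n_s!) of an occupancy vector."""
    W = factorial(sum(ns))
    for n in ns:
        W //= factorial(n)
    return W

# Enumerate all occupancies of N = 6 entities over s = 3 categories and
# pick the realisation with the largest multiplicity (MaxProb).
N, s = 6, 3
best = max(
    (ns for ns in product(range(N + 1), repeat=s) if sum(ns) == N),
    key=multiplicity,
)
print(best, multiplicity(best))  # most even occupancy (2, 2, 2), W = 90
```

For large N, (1/N) ln W approaches the Shannon entropy of the occupancy frequencies (by Stirling's approximation), which is the asymptotic limit that the project's non-asymptotic theory relaxes.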
The project involved both theoretical and applied components. The theoretical component included a detailed review of the combinatorial basis of entropy, with particular focus on the development of a new, non-asymptotic body of theory for the analysis of systems of small numbers of entities, with important applications in thermodynamics, quantum mechanics, nanoscience, ecology and information theory. Several new results were also derived as part of the theoretical component, including:
1. statistics for systems of particles in indistinguishable states;
2. statistics for systems in which the source, or prior, probabilities changed with sampling in accordance with the Polya distribution, thus involving non-independently and identically distributed (non-iid) sampling;
3. a combinatorial reappraisal of the Tsallis entropy function; and
4. a re-examination of the Fisher information function.
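The non-iid sampling scheme in item 2 can be sketched with the classic Polya urn (an illustrative example, not the project's own code): each draw changes the source probabilities, so draws are exchangeable but not independent.

```python
import random

def polya_draws(a, b, c, n, rng):
    """Polya urn: start with a red and b white balls; after each draw,
    return the ball plus c extra of the same colour. Returns a list of
    0/1 outcomes (1 = red). Each draw shifts the source probabilities,
    so the sample is non-iid."""
    red, white = a, b
    out = []
    for _ in range(n):
        is_red = rng.random() < red / (red + white)
        out.append(int(is_red))
        if is_red:
            red += c
        else:
            white += c
    return out

rng = random.Random(0)
sample = polya_draws(a=1, b=1, c=1, n=10, rng=rng)
print(sample)
```

With a = b = c = 1, the number of red draws after n trials is uniform on {0, ..., n}, the classic Polya/Laplace result; setting c = 0 recovers ordinary iid Bernoulli sampling.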
The analysis of systems in accordance with the Polya distribution was shown to explain the Acharya-Swamy intermediate statistic, which interpolates between the Bose-Einstein and Fermi-Dirac statistics of quantum mechanics. In the applied component, attention was paid to the application of MaxEnt or MaxProb to:
1. the prediction of the equilibrium position of an elastic solid, thereby bringing engineering mechanics into a thermodynamic framework, an idea originally conceived by Gibbs in 1875-78;
2. the prediction of the steady-state velocity distribution in turbulent fluid flow, a critical unsolved problem in fluid mechanics;
3. new methods for the analysis of environmental systems, including efforts to explain the maximum entropy production (MEP) principle, used to predict the steady state of complex systems such as the Earth's climate system;
4. applications to inverse modelling, involving a reversal of Bayesian inference; and
5. the application of finite-time limit theory to ecological systems.
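The MaxEnt calculation underlying such predictions can be sketched as follows (a minimal illustrative example, not taken from the project): maximise the Shannon entropy of a discrete distribution subject to a moment constraint, which yields the exponential (Gibbs) form p_i proportional to exp(-lam * x_i), with the Lagrange multiplier lam fixed by the constraint. Here the Brandeis dice problem is used: die faces 1..6 with a prescribed mean of 4.5.

```python
import numpy as np
from scipy.optimize import brentq

faces = np.arange(1, 7)  # outcomes x_i = 1, ..., 6

def mean_for(lam):
    """Mean of the MaxEnt distribution p_i ∝ exp(-lam * x_i)."""
    w = np.exp(-lam * faces)
    p = w / w.sum()
    return p @ faces

# Fix the multiplier lam by solving the constraint <x> = 4.5;
# mean_for is monotone decreasing in lam, so a bracketed root works.
target_mean = 4.5
lam = brentq(lambda l: mean_for(l) - target_mean, -5.0, 5.0)

w = np.exp(-lam * faces)
p = w / w.sum()
entropy = -(p * np.log(p)).sum()
print(p, entropy)
```

Because the prescribed mean (4.5) exceeds the uniform mean (3.5), lam comes out negative and the inferred probabilities increase monotonically with the face value, as expected.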
The project was therefore very broad in scope and led to new developments across a wide range of fields. It also included the teaching of a graduate course, Maximum Entropy Analysis, at the University of Copenhagen; the initiation and hosting of the Facets of Entropy workshop in Copenhagen on 24-26 October 2007, attended by researchers from 18 nations; visits to research institutions and presentations at conferences in Denmark, Germany, Italy, the United Kingdom, Slovakia and the United States of America; and, finally, the research supervision of two PhD students.