This proposal concerns the concept of entropy, a measure of the disorder of a system and one of the most profound yet least understood discoveries in human knowledge. As shown by Boltzmann, entropy rests on probabilistic (combinatorial) foundations; it now underpins present-day statistical physics, thermodynamics, information theory and encoding, optimisation and data extraction.
However, as shown by Jaynes and further developed by the researcher, there is scope to develop a far more powerful body of combinatorial information theory, based on 'raw' combinatorial concepts, for the analysis of complex probabilistic systems. The potential is especially strong for new methods of analysing many engineering and environmental systems, replacing a variety of empirical and semi-theoretical methods.
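The combinatorial basis referred to above can be made concrete with a short numerical sketch. For occupancy counts (n_1, ..., n_s) over s categories, the multiplicity is W = N!/(n_1! ... n_s!), and by Stirling's approximation (1/N) ln W converges to the Shannon entropy -sum p_i ln p_i as N grows. The distribution below is an arbitrary illustration, not one taken from the proposal:

```python
import math

def log_multiplicity(counts):
    """ln W for occupancy counts, via log-gamma to avoid factorial overflow."""
    n = sum(counts)
    return math.lgamma(n + 1) - sum(math.lgamma(c + 1) for c in counts)

def shannon_entropy(probs):
    """Shannon entropy -sum p ln p, natural logarithm."""
    return -sum(p * math.log(p) for p in probs if p > 0)

# Example distribution (hypothetical); scale it up to see convergence.
probs = [0.5, 0.3, 0.2]
for N in (100, 10_000, 1_000_000):
    counts = [round(p * N) for p in probs]
    print(N, log_multiplicity(counts) / N)
print("Shannon:", shannon_entropy(probs))
```

As N increases, the per-particle log-multiplicity approaches the Shannon entropy of the fractions, which is the combinatorial route from Boltzmann's counting to Jaynes's maximum-entropy formalism.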
In addition, in statistical physics a number of alternative entropy functions (e.g. Bose-Einstein, Fermi-Dirac, Rényi, Tsallis and Kaniadakis) have been introduced. These can also be interpreted in a combinatorial sense, and applied to systems with particular combinatorial properties.
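Two of the alternative entropy functions named above can be sketched for a discrete distribution; both depend on an order parameter q and recover the Shannon entropy in the limit q -> 1. This is a minimal illustration with an arbitrary distribution, not code from the proposal:

```python
import math

def renyi(probs, q):
    """Rényi entropy H_q = ln(sum p^q) / (1 - q), defined for q != 1."""
    return math.log(sum(p ** q for p in probs)) / (1 - q)

def tsallis(probs, q):
    """Tsallis entropy S_q = (1 - sum p^q) / (q - 1), defined for q != 1."""
    return (1 - sum(p ** q for p in probs)) / (q - 1)

def shannon(probs):
    """Shannon entropy -sum p ln p (the q -> 1 limit of both forms)."""
    return -sum(p * math.log(p) for p in probs if p > 0)

p = [0.5, 0.3, 0.2]
for q in (0.5, 0.999, 2.0):
    print(q, renyi(p, q), tsallis(p, q))
```

Near q = 1 both values approach shannon(p), while for q far from 1 they weight rare and common outcomes differently, which is what makes them suitable for systems with non-standard combinatorial properties.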
The aim of this proposal is to unite these ideas and so develop a new, highly interdisciplinary field of combinatorial information theory, in both concept and application. In concept, the proposal will examine the combinatorial basis of the entropy concept, its relation to other bases and to quantum mechanics; and explore the combinatorial basis of alternative entropy functions, their properties, their optimisation, and the derivation of new forms.
In application, the proposal will bring the concept of elasticity, in engineering mechanics, into a thermodynamic framework; extend the methods of Jaynes and Chiu for the prediction of velocity profiles in turbulent fluid mechanics and turbulent boundary layers; develop new methods to analyse and optimise energy and environmental systems; and examine the inverse modelling of physical systems. A philosophical component is also proposed.