## Final Report Summary - FLINT (Finite-Length Information Theory)

Shannon’s Information Theory establishes the fundamental limits of information processing systems. This notion of a fundamental limit in information processing parallels that of the speed of light in physics. Implicit in the mathematical proofs of the key results of Information Theory, however, is the fact that achieving these limits requires sequences of very long, or even infinite, duration. For example, achieving the capacity of a cellular wireless communication channel would require transmitting infinitely long messages, which in turn means waiting an infinite amount of time to decode a message and hear the voice of our correspondent. Furthermore, decoding a code of infinite length entails infinite complexity. Clearly, this is not feasible in practice.

Practical information processing systems have strict limitations in terms of length, induced by system constraints on delay and complexity. The vast majority of the Information Theory literature ignores these constraints and theoretical studies that provide a finite-length treatment of information processing are hence urgently needed. When finite lengths are employed, asymptotic techniques (laws of large numbers, large deviations) cannot be invoked and new techniques must be sought. A fundamental understanding of the impact of finite lengths is crucial to harvesting the potential gains in practice.
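The gap between asymptotic limits and finite-length performance can be made concrete with the well-known normal approximation to the maximal achievable rate, R*(n, ε) ≈ C − √(V/n)·Q⁻¹(ε), where C is the channel capacity and V the channel dispersion. The sketch below (an illustration of this standard approximation, not of the project's specific results) evaluates it for a binary symmetric channel; the function names and parameter choices are ours.

```python
import math

def q_inv(eps):
    """Inverse Gaussian tail function Q^{-1}(eps), computed by bisection."""
    lo, hi = -10.0, 10.0
    for _ in range(100):
        mid = (lo + hi) / 2
        # Q(x) = 0.5 * erfc(x / sqrt(2)) is decreasing in x
        if 0.5 * math.erfc(mid / math.sqrt(2)) > eps:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def h2(p):
    """Binary entropy function, in bits."""
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_normal_approx(p, n, eps):
    """Normal approximation to the maximal rate of a BSC(p) at
    blocklength n and error probability eps:
        R*(n, eps) ~ C - sqrt(V/n) * Q^{-1}(eps)."""
    C = 1 - h2(p)                                  # capacity (bits/channel use)
    V = p * (1 - p) * math.log2((1 - p) / p) ** 2  # channel dispersion
    return C - math.sqrt(V / n) * q_inv(eps)

# At short blocklengths the achievable rate falls well below capacity
p, eps = 0.11, 1e-3
for n in (100, 1000, 10000, 100000):
    print(f"n = {n:6d}: R*(n, eps) ~ {bsc_normal_approx(p, n, eps):.4f}")
print(f"capacity C = {1 - h2(p):.4f}")
```

The printed rates approach capacity only slowly, as 1/√n, which is precisely why asymptotic results can be misleading for delay-constrained systems.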

This project has contributed towards providing a unified framework for the study of finite-length Information Theory. Novel approaches to study the probability of error of finite-length source and channel codes have been developed during the project. In particular, an exact expression of the error probability has been derived, tight approximations have been found and new coding schemes have been proposed. The study has been based on information-spectrum methods combined with tight bounding techniques. The information-spectrum approach naturally fits the finite-length problem, as it characterises the mutual information as a random variable.
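The idea that the mutual information arises as the expectation of a random variable, the information density i(X;Y) = log P(Y|X)/P(Y), can be illustrated with a short simulation. The sketch below (our own illustrative example for a binary symmetric channel, not the project's actual method) samples the information density and compares its empirical mean with the channel capacity.

```python
import math
import random

def information_density_samples(p, num_samples=100000, seed=0):
    """Sample the information density i(X;Y) = log2 P(Y|X)/P(Y) for a
    BSC(p) with uniform input.  By symmetry P(Y) = 1/2, so each channel
    use yields one of only two values, depending on whether the channel
    flipped the bit."""
    rng = random.Random(seed)
    samples = []
    for _ in range(num_samples):
        flipped = rng.random() < p       # channel introduces an error
        cond = p if flipped else 1 - p   # P(y|x)
        samples.append(math.log2(cond / 0.5))
    return samples

p = 0.11
dens = information_density_samples(p)
mean = sum(dens) / len(dens)
# Capacity of BSC(p): C = 1 - h2(p)
C = 1 + p * math.log2(p) + (1 - p) * math.log2(1 - p)
print(f"empirical E[i(X;Y)] = {mean:.4f}, capacity = {C:.4f}")
```

In the asymptotic regime only the mean of this distribution matters; at finite lengths its full spectrum (in particular its variance) governs the error probability, which is why the information-spectrum viewpoint is a natural fit.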

This project has advanced the area and will contribute to Information Sciences and Systems disciplines where Information Theory is relevant. Therefore, the results of this project will be of benefit to areas such as communication theory (limits of communications), probability theory (limit theorems and large deviations), statistics (hypothesis testing), physics (thermodynamics, quantum information theory), computer science (complexity), mathematics (inequalities), economics (portfolio theory, gambling), bioinformatics and computational neuroscience.
