
An Information Theory of Simple Interaction

Periodic Reporting for period 4 - InfoInt (An Information Theory of Simple Interaction)

Reporting period: 2019-09-01 to 2021-08-31

This project is aimed at exploring a multitude of problems in information theory, inference, and related fields that naturally arise when two or more users communicate and interact over noisy channels, or observe noisy correlated data. Our main goals are to better understand the fundamental limits associated with such problems, and the various primitives required to efficiently exchange information or perform inference tasks, with an emphasis on low-complexity constructions. As such, some of the challenges we address admit practical applications and implications in real-world systems. For example, our interactive paradigm provides means of communicating much more efficiently, in terms of complexity, delay, and consumed power, relative to state-of-the-art communication protocols that do not employ interaction. Also, some of our results characterize the amount of information that needs to be exchanged in order to perform common distributed tasks, such as communication under various optimality criteria, hypothesis testing, and parameter estimation; in some cases we also provide efficient means of approaching these bounds. Our approach further facilitates the design of more accurate adaptive noisy search techniques, as well as efficient low-complexity synchronization methods that can be applied to improve the performance of positioning systems.
Below is a summary of the work and main results achieved during the project, in no particular order; illustrative code sketches accompanying several of the items appear after the list:

(1) A novel interactive communication protocol with noisy feedback over Gaussian channels, which attains close-to-optimal performance while maintaining very low complexity (two orders of magnitude below state-of-the-art systems); the classical feedback idea underlying such schemes is sketched after this list.

(2) An extension of the above ideas to multidimensional setups, resulting in improved bounds on the error performance of such systems.

(3) A new notion of two-way channel capacity for finite-state protocols. We described a novel simulation scheme based on universal source coding, channel coding, and concentration-of-measure techniques, which can simulate any finite-state protocol with accurate rate guarantees at any noise level, and in some cases optimally so. This is the first such result in the interactive setting; determining the interactive channel capacity for general protocols is notoriously difficult and remains wide open at any fixed noise level.

(4) Noisy synchronization between two terminals connected by a channel: We have completely characterized the fundamental limits under various optimality criteria, and provided a low-complexity construction achieving these limits in the vanishing error case.

(5) Introduced the graph information ratio, a new information-theoretic notion that quantifies the most efficient way of communicating an information source over a channel while maintaining some structural integrity of the source. We studied the information ratio in depth and derived many of its intriguing properties.

(6) In multiuser zero-error communication setups, we derived a new outer bound on the rate region of the binary adder multiple access channel using a novel variation of the VC dimension. We also studied the problem of efficiently broadcasting messages to two users under a zero-error criterion, and gave new inner bounds on the achievable rates via a novel notion we call the rho-capacity of a graph. Finally, we gave new inner and outer bounds for the zero-error capacity region of the two-way channel, using a variety of techniques.

(7) Relating feedback information theory to problems of adaptive search, we studied the problem of optimally searching for a target under a physically motivated model of measurement-dependent noise, where the noise level depends on the size of the probed area. We showed how to solve this problem optimally via feedback communication techniques, and also characterized the penalty incurred when the search is forced to be non-adaptive.

(8) We studied various information-theoretic problems in a "single-bit" setup, where the output of a channel needs to be compressed into one bit (or a few bits) that would be "most informative" if sent back to the transmitter. We provided a new upper bound on the performance of the best strategy for the mutual information measure, improving the best known bound in a wide regime. When "informative" is measured in terms of sequential predictability under quadratic loss, we showed that majority vote is essentially the optimal function at low noise but is not optimal at high noise; hence, no single Boolean function is simultaneously optimal at all noise levels. This also led us to study Boolean self-predicting functions and their properties, using Fourier analysis over the Boolean cube.

(9) Distributed interactive inference under communication constraints: We characterized the minimax quadratic risk that two remotely located agents can attain when they wish to estimate the correlation between their samples under a fixed communication budget. In particular, we used novel symmetric SDPI (strong data processing inequality) techniques to show that interaction does not help in this problem, and described a low-complexity protocol that asymptotically attains the best possible performance, via extreme value theory. We also analyzed a distributed hypothesis testing version of the problem, and provided new bounds on the Stein exponent that improve the state of the art, via novel diffusion-process-based techniques.

(10) Private simultaneous messages (PSM): In this problem, Alice and Bob hold random inputs and, using shared randomness, each send a single message to Charlie, who should compute a function of their inputs while learning nothing beyond the value of the function. We showed that the proof of the best-known lower bound for this problem (dating from the 1990s) was flawed, and provided a different proof that yields a correct lower bound, which is now the best known.

(11) Hypothesis testing under computation constraints: We provided new upper and lower bounds (tight in some regimes) on the minimax error probability in testing between two Bernoulli distributions using a finite-state machine, and showed that deterministic machines are worse than randomized ones by as much as a factor of 2 in the exponent.
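
To make some of the items above more concrete, a few illustrative code sketches follow; all parameter choices are for illustration only. First, relating to item (1): a minimal simulation of the classical Schalkwijk-Kailath iterative-refinement scheme over a Gaussian channel, assuming noiseless feedback. This textbook baseline conveys the low-complexity feedback idea; our protocol addresses the harder noisy-feedback setting.

```python
import numpy as np

# Minimal sketch of Schalkwijk-Kailath-style iterative refinement over a
# Gaussian channel with *noiseless* feedback (illustrative baseline only).
rng = np.random.default_rng(0)
P, N0, rounds = 1.0, 1.0, 25          # transmit power, noise variance, rounds
theta = rng.uniform(-1.0, 1.0)        # message point, known to the transmitter
theta_hat, var_err = 0.0, 1.0 / 3.0   # receiver estimate and its error variance

for _ in range(rounds):
    # The feedback link tells the transmitter the receiver's current error.
    x = np.sqrt(P / var_err) * (theta - theta_hat)    # power-normalized error
    y = x + rng.normal(scale=np.sqrt(N0))             # forward Gaussian channel
    theta_hat += np.sqrt(P * var_err) / (P + N0) * y  # MMSE refinement
    var_err *= N0 / (P + N0)                          # error shrinks geometrically

print(f"|theta - theta_hat| = {abs(theta - theta_hat):.2e}, "
      f"predicted std = {np.sqrt(var_err):.2e}")
```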
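
Relating to item (4), a standard low-complexity baseline for synchronization is delay estimation by cross-correlating the received signal against a known pilot sequence; the sketch below (pilot length and SNR are arbitrary choices) illustrates the problem setup, not our optimal construction.

```python
import numpy as np

# Baseline noisy synchronization: estimate an unknown delay by sliding a
# known +/-1 pilot over a Gaussian-noise-corrupted observation window.
rng = np.random.default_rng(4)
L, window, snr = 64, 512, 1.0
pilot = rng.choice([-1.0, 1.0], size=L)   # pilot known to both terminals
delay = rng.integers(window - L)          # unknown shift to be estimated

signal = np.zeros(window)
signal[delay:delay + L] = pilot
y = signal + rng.normal(scale=1 / np.sqrt(snr), size=window)

# Pick the offset where the observation best matches the pilot.
scores = np.correlate(y, pilot, mode="valid")
print("true delay:", delay, "estimate:", int(np.argmax(scores)))
```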
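
Relating to item (7), the following sketch illustrates adaptive search with measurement-dependent noise via Bayesian bisection on a discrete grid; the linear dependence of the flip probability on the probed-area size is an assumption made here for illustration, not our exact model.

```python
import numpy as np

# Adaptive noisy search: querying a set A returns "target in A?", flipped
# with probability p(A) that grows with the size of A (illustrative form).
rng = np.random.default_rng(1)
M = 1000                               # search resolution
target = rng.integers(M)
post = np.full(M, 1.0 / M)             # posterior over the target location

for _ in range(60):
    # Probe the smallest prefix holding ~half the posterior mass (bisection).
    k = int(np.searchsorted(np.cumsum(post), 0.5)) + 1
    q = k / M                          # normalized size of the probed area
    p_flip = 0.1 + 0.2 * q             # noise level depends on probed size
    truth = target < k
    answer = truth ^ (rng.random() < p_flip)
    # Bayes update of the posterior given the noisy answer.
    like_in, like_out = (1 - p_flip, p_flip) if answer else (p_flip, 1 - p_flip)
    post[:k] *= like_in
    post[k:] *= like_out
    post /= post.sum()

print("estimate:", int(np.argmax(post)), "target:", target)
```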
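
Relating to item (8), the mutual information quantity we bound can be computed exactly by brute force for small block lengths: take X^n uniform on {0,1}^n, pass it through a binary symmetric channel BSC(alpha) to obtain Y^n, and compare I(X^n; f(Y^n)) for the majority and single-coordinate (dictator) functions.

```python
import numpy as np
from itertools import product

def h2(p):
    """Binary entropy in bits."""
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

n, alpha = 5, 0.1                       # block length and BSC crossover prob.
inputs = list(product([0, 1], repeat=n))

def mutual_info(f):
    """I(X^n; f(Y^n)) for X^n uniform and Y^n = X^n through a BSC(alpha)."""
    p1_given_x = []
    for x in inputs:
        p = sum(f(y) * np.prod([alpha if yi != xi else 1 - alpha
                                for xi, yi in zip(x, y)])
                for y in inputs)
        p1_given_x.append(p)            # P(f(Y^n) = 1 | X^n = x)
    return h2(np.mean(p1_given_x)) - np.mean([h2(p) for p in p1_given_x])

majority = lambda y: int(sum(y) > n / 2)
dictator = lambda y: y[0]
print(f"I(X^n; Maj(Y^n)) = {mutual_info(majority):.4f}")
print(f"I(X^n; Y_1)      = {mutual_info(dictator):.4f}  (= 1 - h2(alpha))")
```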
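
Relating to item (9), a simple one-bit-per-sample baseline for distributed correlation estimation has one agent send the signs of its Gaussian samples, after which the other inverts the arcsine law E[sgn(X)sgn(Y)] = (2/pi) arcsin(rho); this is merely an illustration of the communication-constrained setting, not our asymptotically optimal protocol.

```python
import numpy as np

# One-bit-per-sample correlation estimation via the Gaussian arcsine law.
rng = np.random.default_rng(2)
rho, n = 0.6, 200_000
cov = [[1.0, rho], [rho, 1.0]]
x, y = rng.multivariate_normal([0, 0], cov, size=n).T

agree = np.mean(np.sign(x) == np.sign(y))   # fraction of matching sign bits
rho_hat = np.sin(np.pi * (agree - 0.5))     # invert the arcsine law
print(f"rho = {rho}, rho_hat = {rho_hat:.4f}")
```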
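
Relating to item (10), the textbook one-bit example below conveys the PSM model: with a shared random bit r, Alice sends x XOR r and Bob sends y XOR r, so Charlie recovers x XOR y while each message alone is a uniform bit. (Our contribution concerns lower bounds for general functions, not this toy protocol.)

```python
import secrets

# Tiny concrete instance of the private simultaneous messages (PSM) model.
def psm_xor(x: int, y: int) -> int:
    r = secrets.randbelow(2)      # shared random bit, hidden from Charlie
    m_alice = x ^ r               # Alice's one-bit message
    m_bob = y ^ r                 # Bob's one-bit message
    return m_alice ^ m_bob        # Charlie recovers x ^ y; each message
                                  # alone is uniform, so nothing else leaks

for x in (0, 1):
    for y in (0, 1):
        assert psm_xor(x, y) == x ^ y
print("PSM for XOR: correct, and each message is individually uniform.")
```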
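
Relating to item (11), a natural finite-state tester is a saturating counter that moves up on a 1, moves down on a 0, and decides by its final state; the sketch below (8 states, deterministic) is a baseline illustration, not our optimal machine.

```python
import numpy as np

# Hypothesis testing with a K-state saturating counter: decide between
# Bernoulli(p0) and Bernoulli(p1) from the machine's final state.
rng = np.random.default_rng(3)
K, n_samples, n_trials = 8, 500, 2000
p0, p1 = 0.4, 0.6

def run(p):
    s = K // 2
    for b in rng.random(n_samples) < p:
        s = min(K - 1, s + 1) if b else max(0, s - 1)
    return s >= K // 2            # decide "p1" in the upper half of states

err0 = np.mean([run(p0) for _ in range(n_trials)])        # false alarm
err1 = np.mean([not run(p1) for _ in range(n_trials)])    # miss
print(f"P(err|H0) = {err0:.4f}, P(err|H1) = {err1:.4f}")
```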

Finally, it should be noted that the project received a one-year extension, as well as an additional six-month extension due to delays caused by the Covid-19 pandemic. This extra time was very beneficial to the success of the project, as it enabled us to finalize multiple outstanding tasks and bring them to fruition.
We have made progress well beyond the state of the art in virtually all of the research items discussed above.