An Information Theory of Simple Interaction

Periodic Reporting for period 3 - InfoInt (An Information Theory of Simple Interaction)

Reporting period: 2018-03-01 to 2019-08-31

This project explores a range of problems in information theory, inference, and related fields that naturally arise when two or more users communicate and interact over noisy channels. Our main goals are to better understand the fundamental limits of such problems, and the various primitives required to efficiently exchange information or perform a task over a noisy channel, with an emphasis on low-complexity constructions. Many of the challenges we address therefore have practical applications and implications for real-world systems. For example, our interactive paradigm provides a means of communicating far more efficiently, in terms of complexity, delay, and consumed power, than state-of-the-art communication protocols that do not employ interaction. Our approach also facilitates the design of more accurate adaptive noisy search techniques, as well as efficient, low-complexity synchronization methods that can be applied to improve the performance of positioning systems.

The main results obtained so far are the following:
(1) A novel interactive communication protocol with noisy feedback over Gaussian channels that attains close-to-optimal performance while maintaining very low complexity (two orders of magnitude below state-of-the-art systems).
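To give a flavor of the kind of iterative feedback scheme this line of work builds on, here is a minimal sketch of the classical Schalkwijk-Kailath scheme for the scalar Gaussian channel. Note the hedge: this toy version assumes a *noiseless* feedback link (our protocol handles noisy feedback), and the function name and parameters below are illustrative, not taken from our actual construction. Each round, the transmitter learns the receiver's current estimate through feedback and sends a power-scaled version of the estimation error, so the error variance shrinks geometrically.

```python
import numpy as np

def schalkwijk_kailath(theta, n_rounds=15, snr=1.0, rng=None):
    """Toy Schalkwijk-Kailath scheme over an AWGN channel (unit noise
    variance) with noiseless feedback.  Each round the transmitter
    sends the receiver's current estimation error, scaled to power
    `snr`; the receiver applies an MMSE correction.  The error
    variance decays by a factor (1 + snr) per round."""
    rng = rng or np.random.default_rng(0)
    # Round 0: send the message point itself, scaled to power `snr`.
    y = np.sqrt(snr) * theta + rng.standard_normal()
    theta_hat = y / np.sqrt(snr)          # receiver's initial estimate
    err_var = 1.0 / snr                   # variance of (theta_hat - theta)
    for _ in range(n_rounds):
        # Transmitter (knowing theta_hat via feedback) sends the error,
        # scaled to meet the power constraint.
        g = np.sqrt(snr / err_var)
        y = g * (theta_hat - theta) + rng.standard_normal()
        # Receiver applies the MMSE correction to its estimate.
        c = g * err_var / (g**2 * err_var + 1.0)
        theta_hat -= c * y
        err_var /= (1.0 + snr)            # geometric decay of the error
    return theta_hat

theta = 0.3
print(abs(schalkwijk_kailath(theta) - theta))  # small after 15 rounds
```

The geometric decay of the error variance is what yields the doubly exponential error probability classically associated with feedback over Gaussian channels, and the per-round computation is a handful of arithmetic operations, which is the sense in which such schemes are low-complexity.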

(2) An extension of the above ideas to multidimensional setups, resulting in improved bounds on the error performance of such systems.

(3) A new notion of two-way channel capacity for Markovian protocols: we described a novel simulation scheme, based on universal source coding, channel coding, and concentration-of-measure techniques, that can simulate any Markovian protocol with accurate rate guarantees at any noise level. This is the first such result in the interactive setting -- the interactive channel capacity for general protocols is notoriously difficult and remains wide open at any fixed noise level.

(4) Noisy synchronization between two terminals connected by a channel: We have completely characterized the fundamental limits under various optimality criteria, and provided a low-complexity construction achieving these limits in the vanishing error case.

(5) Introduced the graph information ratio, a new information-theoretic notion that quantifies the most efficient way of communicating an information source over a channel while maintaining some structural integrity of the source. We have studied the information ratio in depth and derived many intriguing properties.

(6) In multiuser communication setups, we derived a new outer bound on the zero-error rate region of the binary adder multiple access channel, using a novel variation of VC dimension. We also studied the problem of efficiently broadcasting messages to two users under zero error, and gave new inner bounds on the achievable rates via a novel notion we call the rho-capacity of a graph.

(7) Relating feedback information theory to problems of adaptive search, we have studied the problem of optimally searching for a target under a physically motivated model of measurement dependent noise, where the noise level depends on the size of the probed area. We showed how to optimally solve this problem via feedback communication techniques, and also characterized the penalty incurred if the search is forced to be non-adaptive.
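The flavor of adaptive noisy search can be conveyed by the classical probabilistic bisection (Horstein-type) procedure sketched below. To be clear about what is assumed: this sketch uses a fixed query-noise level rather than our measurement-dependent noise model, and the function name and parameters are ours, chosen purely to illustrate how Bayesian updates steer noisy queries toward the target.

```python
import numpy as np

def noisy_bisection_search(target, n_points=1000, n_queries=100,
                           flip_prob=0.2, rng=None):
    """Probabilistic bisection: maintain a posterior over a grid of
    candidate target locations, always query the posterior median, and
    observe a noisy indicator of which side the target lies on (the
    answer is flipped with probability `flip_prob`)."""
    rng = rng or np.random.default_rng(1)
    grid = np.arange(n_points)
    posterior = np.full(n_points, 1.0 / n_points)
    for _ in range(n_queries):
        # Adaptive query point: the posterior median.
        q = grid[np.searchsorted(np.cumsum(posterior), 0.5)]
        # Noisy oracle answer to "is the target <= q?".
        truth = target <= q
        answer = truth if rng.random() > flip_prob else not truth
        # Bayes update: boost the side consistent with the answer.
        left = grid <= q
        likelihood = np.where(left == answer, 1.0 - flip_prob, flip_prob)
        posterior *= likelihood
        posterior /= posterior.sum()
    return int(grid[np.argmax(posterior)])

print(noisy_bisection_search(target=137))  # concentrates near the target
```

Querying the posterior median is exactly the point at which each noisy answer carries the most information, which is the link to feedback communication techniques; a non-adaptive strategy must fix its query points in advance and pays the penalty quantified in our results.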

(8) We studied two problems in which the output of a channel must be compressed into a single bit that is as informative as possible when sent back to the transmitter. When informativeness is measured by mutual information, we provided a new upper bound on the performance of the best strategy, improving the best known bound in a wide regime. When informativeness is measured by the ability to sequentially predict the output process under quadratic loss, we showed that majority vote is essentially optimal at low noise but not at high noise; hence, no single Boolean function is simultaneously optimal at all noise levels.
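As a toy illustration of the low-noise majority-vote phenomenon (a deliberately simplified setting, not the exact one in our results): if a single uniform bit is repeated three times over a BSC(p), the one-bit majority of the three outputs is more informative about the input bit than keeping a single output as-is. The small exact computation below compares the two.

```python
from math import log2

def h2(p):
    """Binary entropy in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

def info_about_input(err):
    """I(X; B) for a uniform input bit X and a one-bit summary B that
    disagrees with X with probability `err` (a binary symmetric
    relation), i.e. I(X; B) = 1 - h2(err)."""
    return 1.0 - h2(err)

p = 0.1                                    # BSC crossover probability
err_dictator = p                           # keep one output bit as-is
err_majority = 3 * p**2 * (1 - p) + p**3   # majority of three outputs
print(info_about_input(err_majority), info_about_input(err_dictator))
```

At p = 0.1 the majority bit errs with probability 0.028 versus 0.1 for a single output, so it retains strictly more mutual information; our result says that no such fixed one-bit rule can win at every noise level simultaneously.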

(9) As part of our effort to understand how to interactively approximate two-party functions via simpler functions (a natural generalization of Yao's communication complexity), we have begun to develop a theory of lossy matrix compression, in which a possibly random matrix must be represented efficiently with small worst-case row/column distortions.
We have made progress well beyond the state of the art in all the research items discussed above. By the end of the project, we expect further progress on most fronts of our research agenda, with the general aim of significantly broadening our understanding of the behavior of, and interplay between, the various information-theoretic and inference-theoretic elements that naturally come into play in the setups considered within the project. This includes developing ideas we have been working on, as well as pushing forward in other directions. For the latter, the following new challenges are planned to be in special focus in the near future:

(1) Provide a tighter characterization of the interactive channel capacity for Markovian protocols, and specifically a nontrivial upper bound on the capacity.

(2) Construct and analyze good/optimal/low-complexity interactive schemes for the two-way deterministic channel model.

(3) Study the limits of inference and low-complexity estimation in two-party and multi-party setups, as well as the gains associated with interaction, specifically in problems of distributed/interactive correlation estimation, hypothesis testing, prediction, and optimal guessing.

(4) Make progress on the problem of interaction in large networks with simple update rules, specifically on how fast information can reliably travel in such systems.