Periodic Reporting for period 3 - InfoInt (An Information Theory of Simple Interaction)
Reporting period: 2018-03-01 to 2019-08-31
(2) Extension of the above ideas to multidimensional setups, resulting in improved bounds on the error performance of such systems.
(3) A new notion of two-way channel capacity for Markovian protocols: we described a novel simulation scheme, based on universal source coding, channel coding, and concentration-of-measure techniques, that can simulate any Markovian protocol with accurate rate guarantees at any noise level. This is the first such result in the interactive setting -- the interactive channel capacity for general protocols is notoriously difficult and remains wide open at any fixed noise level.
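As a toy illustration of the object being simulated (a sketch only: it uses naive repetition coding with illustrative parameters, not the universal coding and concentration-of-measure machinery of the actual scheme), the following Python snippet runs a two-party Markovian protocol, in which each party's next bit depends only on the bit it last received, over a binary symmetric channel:

    import random

    P = 0.1    # BSC crossover probability (illustrative)
    REP = 15   # repetition length (illustrative; the actual scheme uses
               # universal source/channel coding, not repetition)

    def bsc(bit):
        """Flip a bit with probability P."""
        return bit ^ (random.random() < P)

    def send(bit):
        """Protect one protocol bit with a repetition code over the BSC."""
        votes = sum(bsc(bit) for _ in range(REP))
        return int(votes > REP // 2)   # majority decoding

    def run_protocol(alice_next, bob_next, rounds, noisy):
        """Markovian protocol: each party's next bit depends only on the
        bit it last received. Returns the full transcript."""
        transcript, a_in = [], 0       # agreed initial state
        for _ in range(rounds):
            a_out = alice_next[a_in]
            b_in = send(a_out) if noisy else a_out
            b_out = bob_next[b_in]
            a_in = send(b_out) if noisy else b_out
            transcript += [a_out, b_out]
        return transcript

    alice = {0: 1, 1: 0}   # hypothetical update rules
    bob   = {0: 0, 1: 1}
    clean = run_protocol(alice, bob, 1000, noisy=False)
    noisy = run_protocol(alice, bob, 1000, noisy=True)
    agree = sum(c == m for c, m in zip(clean, noisy)) / len(clean)
    print(f"transcript agreement over BSC({P}): {agree:.2%}")

The snippet also hints at why the problem is delicate: a single decoding error changes the Markov state and can corrupt the entire remaining transcript, so naive per-bit coding cannot yield rate guarantees at a fixed noise level.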
(4) Noisy synchronization between two terminals connected by a channel: We have completely characterized the fundamental limits under various optimality criteria, and provided a low-complexity construction achieving these limits in the vanishing error case.
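As a toy model of the problem (a sketch under simplifying assumptions: the offset is a bounded integer, the channel is a binary symmetric channel, and the code is a naive per-bit repetition code rather than the low-complexity construction mentioned above), the following snippet estimates the probability that the receiving terminal recovers the sender's clock offset:

    import random

    P, REP, M = 0.05, 11, 256   # crossover prob, repetitions, offset range

    def bsc(bit):
        return bit ^ (random.random() < P)

    def encode(offset):
        """Binary expansion of the offset, each bit repeated REP times."""
        bits = [(offset >> i) & 1 for i in range(M.bit_length() - 1)]
        return [b for b in bits for _ in range(REP)]

    def decode(rx):
        """Majority-decode each repetition block, reassemble the integer."""
        bits = [int(sum(rx[i*REP:(i+1)*REP]) > REP // 2)
                for i in range(len(rx) // REP)]
        return sum(b << i for i, b in enumerate(bits))

    trials, errors = 10000, 0
    for _ in range(trials):
        offset = random.randrange(M)
        rx = [bsc(b) for b in encode(offset)]
        errors += (decode(rx) != offset)
    print(f"sync error rate: {errors / trials:.4f}")  # vanishes as REP grows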
(5) Introduced the graph information ratio, a new information-theoretic notion that quantifies the most efficient way of communicating an information source over a channel while maintaining some structural integrity of the source. We have studied the information ratio in depth and derived many of its properties.
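The precise definition and the derived properties appear in the project's publications; as a sketch of the basic combinatorial primitive underlying graph-theoretic rate notions of this kind (under the assumption, by analogy with related capacity quantities, that achievable ratios are certified by graph homomorphisms between suitable graph products), the following snippet tests homomorphism existence by brute force:

    from itertools import product

    def homomorphism_exists(g, h):
        """True if there is a map f: V(g) -> V(h) sending every edge of g
        to an edge of h. Graphs are (num_vertices, edge list) pairs."""
        (ng, eg), (nh, eh) = g, h
        eh = {frozenset(e) for e in eh}
        for f in product(range(nh), repeat=ng):
            if all(frozenset((f[u], f[v])) in eh for u, v in eg):
                return True
        return False

    triangle = (3, [(0, 1), (1, 2), (0, 2)])
    pentagon = (5, [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)])
    print(homomorphism_exists(pentagon, triangle))  # True: C5 is 3-colorable
    print(homomorphism_exists(triangle, pentagon))  # False: C5 is triangle-free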
(6) In multiuser communication setups, we derived a new outer bound on the zero-error rate region of the binary adder multiple access channel, using a novel variation of the VC dimension. We also studied the problem of efficiently broadcasting messages to two users under a zero-error criterion, and gave new inner bounds on the achievable rates via a novel notion we call the rho-capacity of a graph.
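For concreteness, the zero-error requirement on the binary adder multiple access channel amounts to a simple combinatorial condition: a code pair (A, B) is uniquely decodable iff all componentwise sums a + b over pairs in A x B are distinct. The snippet below (an illustration of the defining condition only; the outer bound itself is derived analytically) checks this property by brute force:

    from itertools import product

    def uniquely_decodable(A, B):
        """Zero-error condition for the binary adder MAC: the receiver
        sees the componentwise integer sum, so all |A|*|B| sums must
        be distinct."""
        sums = [tuple(x + y for x, y in zip(a, b)) for a, b in product(A, B)]
        return len(sums) == len(set(sums))

    A = [(0, 0), (1, 1)]             # example code pair of blocklength 2
    B = [(0, 0), (0, 1), (1, 0)]
    print(uniquely_decodable(A, B))  # True: 6 message pairs, 6 distinct sums

    B_bad = [(0, 0), (1, 1)]
    print(uniquely_decodable(A, B_bad))  # False: (0,0)+(1,1) == (1,1)+(0,0)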
(7) Relating feedback information theory to problems of adaptive search, we have studied the problem of optimally searching for a target under a physically motivated model of measurement dependent noise, where the noise level depends on the size of the probed area. We showed how to optimally solve this problem via feedback communication techniques, and also characterized the penalty incurred if the search is forced to be non-adaptive.
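A minimal sketch of the search model (with an assumed noise profile and a heuristic query rule, not the optimal feedback-based strategy described above): the target lies in one of N cells, each query asks whether it lies in a chosen set S, and the answer passes through a binary symmetric channel whose crossover probability grows with |S|/N. The searcher maintains an exact Bayesian posterior:

    import random

    N = 1024                 # number of cells (illustrative)

    def noise(frac):         # assumed profile: noise grows with probed area
        return 0.05 + 0.2 * frac

    def search(target, queries=60):
        post = [1.0 / N] * N
        for _ in range(queries):
            # heuristic query: probe (roughly) half the posterior mass
            order = sorted(range(N), key=lambda i: -post[i])
            S, mass = set(), 0.0
            for i in order:
                if mass >= 0.5:
                    break
                S.add(i)
                mass += post[i]
            p = noise(len(S) / N)
            ans = (target in S) ^ (random.random() < p)  # noisy answer
            for i in range(N):                           # exact Bayes update
                post[i] *= (1 - p) if ((i in S) == ans) else p
            z = sum(post)
            post = [q / z for q in post]
        return max(range(N), key=lambda i: post[i])

    hits = 0
    for _ in range(100):
        t = random.randrange(N)
        hits += (search(t) == t)
    print(f"recovered target in {hits}/100 trials")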
(8) We studied two problems in which the output of a channel needs to be compressed into a single bit that is most informative when sent back to the transmitter. When informativeness is measured in terms of mutual information, we provided a new upper bound on the performance of the best strategy, improving on the best known bound in a wide regime. When informativeness is measured in terms of the ability to sequentially predict the output process under quadratic loss, we showed that majority vote is essentially optimal at low noise levels but not at high noise levels; hence, no single Boolean function is simultaneously optimal at all noise levels.
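To make the first quantity concrete: with X^n uniform on {0,1}^n and Y^n its image through a BSC(p), one compares I(X^n; f(Y^n)) across Boolean functions f. The brute-force snippet below (an evaluation of two specific candidate functions only; the new upper bound above constrains what any f can achieve) computes this mutual information for a dictator bit and for majority:

    from itertools import product
    from math import log2

    def mutual_info(f, n, p):
        """I(X^n; f(Y^n)) for X^n uniform and Y^n = X^n through BSC(p)."""
        joint = {}   # joint distribution of (X^n, B) with B = f(Y^n)
        for x in product((0, 1), repeat=n):
            for y in product((0, 1), repeat=n):
                flips = sum(a != b for a, b in zip(x, y))
                pr = (2 ** -n) * (p ** flips) * ((1 - p) ** (n - flips))
                joint[(x, f(y))] = joint.get((x, f(y)), 0.0) + pr
        pb = {b: sum(v for (x, bb), v in joint.items() if bb == b)
              for b in (0, 1)}
        px = 2 ** -n   # X^n is uniform
        return sum(v * log2(v / (px * pb[b]))
                   for (x, b), v in joint.items() if v > 0)

    n, p = 5, 0.1
    dictator = lambda y: y[0]
    majority = lambda y: int(sum(y) > n // 2)
    print(f"I(X^n; dictator(Y^n)) = {mutual_info(dictator, n, p):.4f}")
    print(f"I(X^n; majority(Y^n)) = {mutual_info(majority, n, p):.4f}")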
(9) As part of our effort to understand how to interactively approximate two-party functions via simpler functions (a natural generalization of Yao's communication complexity), we have begun to develop a theory of lossy matrix compression, in which a possibly random matrix needs to be represented efficiently with small worst-case row and column distortions.
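As a toy illustration of the distortion criterion (truncated SVD is used here only as a generic baseline under assumed l1 row/column distortion measures, not as the compression scheme under development), the snippet below reports worst-case row and column errors of a low-rank representation:

    import numpy as np

    rng = np.random.default_rng(0)
    M = rng.standard_normal((50, 40))   # a random matrix (illustrative)

    def rank_k_approx(M, k):
        """Best rank-k approximation in Frobenius norm, via truncated SVD."""
        U, s, Vt = np.linalg.svd(M, full_matrices=False)
        return (U[:, :k] * s[:k]) @ Vt[:k, :]

    for k in (5, 10, 20):
        E = M - rank_k_approx(M, k)
        row_dist = np.abs(E).sum(axis=1).max()  # worst-case row distortion
        col_dist = np.abs(E).sum(axis=0).max()  # worst-case column distortion
        print(f"rank {k}: max row dist {row_dist:.2f}, "
              f"max col dist {col_dist:.2f}")

The point of the worst-case criterion is visible here: a representation that is good on average may still leave individual rows or columns badly distorted, which is what the theory is designed to control.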