
Analysis of Boolean Functions for Algorithms and Complexity

Final Report Summary - TCSTURKEY (Analysis of Boolean Functions for Algorithms and Complexity)

Researcher: Dr. Ryan O'Donnell
Associate Professor, Dept. of Computer Science, Carnegie Mellon Univ., Pittsburgh USA

Project Coordinator and Host Institution:
Dr. A. C. Cem Say
Professor, Dept. of Computer Engineering, Bogazici Univ., Istanbul Turkey

The scientific focus of the project is in the field Theoretical Computer Science. This is the mathematical branch of Computer Science/Engineering research dealing with the theoretical capabilities and limitations of computation, not specifically tied to any particular technological model. Researcher Ryan O'Donnell's particular area of speciality involves bringing tools from probability and discrete analysis to the study of computation; Project Coordinator Cem Say's particular area of speciality is in theoretical models for computational complexity; especially, models based on quantum mechanics. Thus much of the research focus of the project has involved the study of quantum computational complexity and probabilistic algorithms. We discuss both the project research and the transfer of knowledge outcomes below.


Quantum complexity research. The first component of the quantum research was a collaboration between O'Donnell and Say concerning the computational complexity of time travel. This may sound fanciful. However: a) so-called "closed timelike curves (CTCs)" are apparently consistent with the laws of physics; b) the topic has previously been studied by renowned physicists (e.g. David Deutsch) and computer scientists (e.g. Scott Aaronson); c) research on the model provides insights into the relative powers of quantum and classical computation; d) even with the "time travel" motivation removed, the resulting questions concerning the computational complexity of Markov chains and quantum channels have impact for basic classical and quantum complexity theory. Building on previous work by Say and his former Ph.D. student Yakaryilmaz, O'Donnell and Say were able to show that the algorithmic benefit conferred by any fixed number w of "time-traveling quantum bits" is no greater than that conferred by a single time-traveling classical bit.

A second component of the quantum research was joint work between O'Donnell and his Ph.D. student John Wright. It concerned the following fundamental problem in quantum theory: Suppose a physical experiment produces a particle in some unknown d-dimensional quantum state. The experimenter wishes to know some information about this state; perhaps complete information (this is called the quantum tomography problem) or perhaps only some partial information (say, just about the probabilities p1, . . . , pd; this is called the spectrum testing problem). We assume the experimenter can prepare n copies of the particle and then measure the result. The fundamental question is: How large does n need to be, as a function of d, to reliably answer various questions? O'Donnell and Wright completed an extensive investigation into the spectrum testing problem, establishing several new sharp upper and lower bounds. It is hoped that the upper bounds provided will have impact on the practical problem of quantum tomography.

Algorithmic statistics and information theory. The project pursued several objectives in this area; we highlight here only one. We consider one of the most basic problems in statistics: Suppose we are given access to samples from an unknown probability distribution p1, . . . , pd on {1, 2, . . . , d}. How many samples n are required to reliably determine various properties of this distribution? (In fact, this is precisely the classical analogue of the quantum spectrum testing problem discussed above.) For many problems of this nature, the answer is — unfortunately — that n must be quite large compared to d.

For example, to estimate the entropy of the probability distribution, roughly n ≈ d samples are required. However, recent algorithmic research has shown that this sample complexity can be dramatically lowered, to n ≈ log² d, if the experimenter is also allowed to make "distribution queries"; i.e. to request the value of pi for any i. These algorithms may have impact for many "Big Data" applications in computer science. During the project, O'Donnell, Say, and Bogazici students Cafer Caferov and Baris Kaya collaborated to show that these algorithms are best possible; i.e. that roughly n ≈ log² d samples/queries are always required.
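The gap between the two access models can be illustrated with a small sketch. This is an illustrative comparison only, not the project's algorithm or its bounds: the function names plugin_entropy and query_entropy are our own, and the query-based estimator shown is the naive Monte Carlo average of -log2(pi) (which works because H(p) is exactly the expectation of -log2(pi) over i drawn from p), not the optimal procedure from the literature.

```python
import math
import random

# Illustrative sketch: estimating the Shannon entropy H(p) of a
# distribution p on {0, ..., d-1} under two access models.
# (1) Samples only: the "plug-in" estimator computes the entropy of the
#     empirical distribution; it is badly biased when n is small
#     compared to d.
# (2) Samples plus "distribution queries" (the value p[i] for any i):
#     average -log2(p[i]) over the sampled points i, using the identity
#     H(p) = E_{i ~ p}[-log2 p_i].

def plugin_entropy(samples, d):
    """Plug-in entropy estimate (in bits) from samples alone."""
    counts = [0] * d
    for s in samples:
        counts[s] += 1
    n = len(samples)
    return -sum((c / n) * math.log2(c / n) for c in counts if c > 0)

def query_entropy(samples, p):
    """Entropy estimate using distribution queries p[i] at sampled points."""
    return sum(-math.log2(p[s]) for s in samples) / len(samples)

random.seed(0)
d = 1024
p = [1.0 / d] * d                      # uniform distribution: H(p) = 10 bits
samples = random.choices(range(d), weights=p, k=200)

print(plugin_entropy(samples, d))      # biased low: 200 samples << d = 1024
print(query_entropy(samples, p))       # exactly 10.0 for the uniform p
```

With only 200 samples from a 1024-element uniform distribution, the plug-in estimate cannot exceed log2(200) ≈ 7.6 bits, while the query-based estimate is exact here; this is the phenomenon the query model exploits.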

Besides the aforementioned works, several other research projects — on topics in information theory, mathematical biology, algorithms for optimization tasks, among others — are ongoing or completed. O'Donnell has completed (or nearly completed) eight separate research works during the course of the project; three with the Project Coordinator and his students, two others with European visitors to Bogazici, and the other three with his Ph.D. students. Several have already been published (in the FSTTCS 2014, ITW 2015, and STOC 2015 conferences), and the rest are being submitted for publication.

Another major component of the project is to achieve transfer of knowledge, both from the Project Coordinator and other Bogazici researchers to the Researcher, and from the Researcher to students at Bogazici, in Turkey, and in Europe. In the first direction, O'Donnell has benefited from the quantum computation expertise of Dr. Say and from the mathematical and probabilistic expertise of Dr. Atilla Yilmaz from the Bogazici University Mathematics Department. We also summarize transfer of knowledge outcomes in the other direction.

Teaching and student mentoring. In the fall 2014 semester, O'Donnell taught CmpE 587: Introduction to Research in Theoretical Computing, a variant of O'Donnell's 2013 Carnegie Mellon graduate course A Theorist's Toolkit.
In the spring 2014 semester, O'Donnell and Say organized a reading seminar for Dr. Say's group, to study quantum algorithms in general and the recent Ph.D. thesis of Aleksandr Belovs in particular. These two activities impacted approximately 15 students, bringing them up to speed on the forefront of research in theoretical computer science. In addition, O'Donnell spent time mentoring and advising several Bogazici students; indeed, this led to a joint research paper with one master's student (Cafer Caferov) and one undergraduate (Baris Kaya).

Outreach to undergraduate students in Turkey and in Europe. Over the summer of 2014, O'Donnell gave two week-long, 15-hour "summer school" courses on the topic of Analysis of Boolean Functions, a mathematical tool heavily used in the theory of computation. These took place at Turkey's Nesin Mathematics Village and at KTH's Swedish Summer School in Computer Science in Stockholm. Approximately 50 undergraduate and graduate students from Turkey and Europe were impacted by these lectures.

International Theoretical Computer Science seminar series. O'Donnell organized a series of lectures at Bogazici University and at the Istanbul Center for Mathematical Sciences, with internationally renowned European researchers: Sangxia Huang (KTH Stockholm); Dr. Guy Kindler (Hebrew University); Dr. Ilias Diakonikolas (Edinburgh University); Dr. Rahul Santhanam (Edinburgh University). In addition, O'Donnell facilitated an invited lecture by his Carnegie Mellon University colleague, the noted Human Computation researcher/entrepreneur Luis von Ahn. On Sept. 29, 2014, von Ahn inaugurated the 2014-2015 Bogazici Lecture Series by speaking to people from all departments of the university on the topic of Massive Education for the Future.
