CORDIS - EU research results

Sensitivity, Stability, and Computation

Periodic Reporting for period 4 - SensStabComp (Sensitivity, Stability, and Computation)

Reporting period: 2023-12-01 to 2025-07-31

We studied noise stability and noise sensitivity of mathematical systems and models, with special emphasis on noisy intermediate-scale quantum (NISQ) computers.
The notion of noise sensitivity can be explained in terms of voting rules: how likely is it that mistakes in counting the votes will affect the outcome of the election? For example, the US electoral method is more noise-sensitive than the popular-vote method.
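The voting comparison can be illustrated with a toy Monte Carlo simulation (a sketch, not taken from the project's papers: the equal-size states, the 5% miscount rate, and all function names are illustrative assumptions). It estimates how often independent vote-counting errors change the winner under a flat popular vote versus a two-tier majority-of-state-majorities rule.

```python
import random

def flip(votes, eps, rng):
    """Each recorded vote is miscounted (sign-flipped) independently with prob eps."""
    return [-v if rng.random() < eps else v for v in votes]

def popular(votes, n_states, m):
    """Simple popular vote: majority over all n_states * m ballots."""
    return 1 if sum(votes) > 0 else -1

def electoral(votes, n_states, m):
    """Two-tier 'electoral' rule: majority of per-state majorities
    (equal-size states; m and n_states are odd, so there are no ties)."""
    states = [1 if sum(votes[i * m:(i + 1) * m]) > 0 else -1
              for i in range(n_states)]
    return 1 if sum(states) > 0 else -1

def flip_probability(rule, n_states=15, m=15, eps=0.05, trials=20000, seed=0):
    """Estimate the probability that counting errors change the outcome."""
    rng = random.Random(seed)  # same seed => both rules see identical elections
    changed = 0
    for _ in range(trials):
        votes = [rng.choice([-1, 1]) for _ in range(n_states * m)]
        changed += rule(votes, n_states, m) != rule(flip(votes, eps, rng),
                                                    n_states, m)
    return changed / trials

p_pop = flip_probability(popular)
p_ele = flip_probability(electoral)
print(f"popular vote flip prob:   {p_pop:.3f}")
print(f"electoral rule flip prob: {p_ele:.3f}")
```

Under these assumptions the two-tier rule flips noticeably more often than the flat majority, matching the claim that the electoral method is the more noise-sensitive of the two.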
Fourier analysis turned out to be an important mathematical tool for the study of noise stability and sensitivity. The rule of thumb is that "high frequencies" are sensitive and "low frequencies" are stable.
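The rule of thumb can be made precise with the standard Fourier-Walsh expansion of a Boolean function (a classical identity, stated here for context rather than taken from the report):

```latex
f(x) = \sum_{S \subseteq [n]} \hat{f}(S)\,\chi_S(x),
\qquad \chi_S(x) = \prod_{i \in S} x_i ,
```

and if $y$ is obtained from $x$ by flipping each coordinate independently with probability $\varepsilon$ (so $\rho = 1 - 2\varepsilon$), the noise stability of $f$ is

```latex
\mathrm{Stab}_\rho(f) = \mathbb{E}\bigl[f(x)\,f(y)\bigr]
= \sum_{S \subseteq [n]} \rho^{|S|}\,\hat{f}(S)^2 .
```

Since $\rho^{|S|}$ decays geometrically in $|S|$, Fourier weight on large sets ("high frequencies") is destroyed by noise, while weight on small sets ("low frequencies") survives.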
We addressed a variety of mathematical problems in this direction and also tried to attack well-known open problems, both with these methods and with others.
We also substantially extended Fourier methods for the study of noise stability and noise sensitivity. This had broad implications for solving problems in combinatorics and at the interface of probability theory, combinatorics, and the theory of computing.
Of particular importance was the development by Noam Lifshitz with several collaborators of novel "hypercontractive" inequalities that found applications in many fields.

The PI continued to develop his theory regarding the impossibility of quantum computers. This theory, which is based on analysis of noise stability and sensitivity,
asserts that building large scale quantum computers and even reaching certain early milestones toward this goal are fundamentally impossible.
These early milestones refer to quantum computers at small and intermediate scales, with at most a few hundred "qubits".
(Such quantum computers are called NISQ computers where NISQ stands for "Noisy Intermediate Scale Quantum".)

A major challenge to the PI's theory regarding quantum computers came with recent experimental claims: First, the 2019 announcement by a team from Google regarding achieving "quantum supremacy" via random quantum circuits.
This was followed by a 2020 announcement of "quantum supremacy" using a photonic system by a team from USTC.
These experimental assertions (and subsequent claims) have led us to an extensive study of mathematical and statistical aspects of NISQ computers in general and of those two systems by Google and USTC in particular.
Regarding the Google announcement, the PI developed with Rinott and Shoham statistical tools that enabled careful examination of the claims.
The PI's earlier work with Kindler (2014) casts serious doubt on the claims of the USTC group.

A few sentences about the importance for society. Let us start with the many mathematical problems we tried to solve:
we believe that tackling open problems in mathematics, at times problems that are very easy to state and very hard to solve, can serve as an objective benchmark for improving technical methods that can ultimately lead to practical implications.
The frustrating pursuit of solutions, sometimes over many decades, reflects the human spirit and human culture, and we hope it can lower barriers between individuals and between nations.

The question of whether quantum computers are possible is a vastly important scientific and technological question, with huge implications for society.
It is also important to put particular empirical claims under scrutiny, and insights and tools developed for this study may have wider implications.
Developing statistical tools to study the noisy quantum states of NISQ computers, and the samples coming from them, is important even (or perhaps especially) if quantum computers are possible; the approach and tools developed in our study seem unique and valuable.
I will mention eight directions and some noteworthy achievements of the research.

I. Quantum computers - the argument against quantum computers

We gave a computational complexity argument against the feasibility of quantum computers. We identified a very low complexity class of probability distributions described by noisy intermediate-scale quantum computers,
and explained why it allows neither good-quality quantum error correction nor a demonstration of "quantum supremacy." We also derived some general principles governing the behavior of noisy quantum systems.

II. Statistical analysis of data coming from noisy intermediate quantum computers

The claim of quantum supremacy presented by Google's team in 2019 consists of demonstrating the ability of a quantum circuit to generate, albeit with considerable noise, bitstrings from a distribution that is considered hard to simulate on classical computers. Verifying that the generated data is indeed from the claimed distribution and assessing the circuit's noise level and its fidelity is a purely statistical undertaking. We study statistical aspects involved in demonstrating quantum supremacy, different approaches to testing the distributions generated by the quantum computer, and various noise models.
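One such statistical question is how to estimate the circuit's fidelity from the sampled bitstrings. The sketch below (a toy model, not the Rinott-Shoham implementation; the synthetic Porter-Thomas-like "ideal" distribution, the global mixture model $f \cdot p + (1-f) \cdot \mathrm{uniform}$, and the grid-search MLE are all illustrative assumptions) compares the linear cross-entropy benchmark (XEB) with a maximum-likelihood estimate of the same fidelity.

```python
import math
import random

def make_ideal_dist(n, rng):
    """Toy 'ideal' circuit distribution on 2**n bitstrings:
    exponential (Porter-Thomas-like) weights, normalized."""
    w = [rng.expovariate(1.0) for _ in range(2 ** n)]
    total = sum(w)
    return [x / total for x in w]

def sample_noisy(p, fidelity, trials, rng):
    """Draw samples from f*p + (1-f)*uniform (a global depolarizing toy model)."""
    N = len(p)
    n_ideal = sum(rng.random() < fidelity for _ in range(trials))
    ideal = rng.choices(range(N), weights=p, k=n_ideal)
    uniform = [rng.randrange(N) for _ in range(trials - n_ideal)]
    return ideal + uniform

def xeb_fidelity(p, samples):
    """Linear cross-entropy benchmark: 2^n * E[p(s)] - 1
    (calibrated against the Porter-Thomas normalization)."""
    return len(p) * sum(p[s] for s in samples) / len(samples) - 1

def mle_fidelity(p, samples, grid=200):
    """Maximum-likelihood fit of f in the mixture f*p + (1-f)/2^n,
    by a simple grid search over [0, 1]."""
    N = len(p)
    def loglik(f):
        return sum(math.log(f * p[s] + (1 - f) / N) for s in samples)
    return max((i / grid for i in range(grid + 1)), key=loglik)

rng = random.Random(1)
p = make_ideal_dist(10, rng)
samples = sample_noisy(p, fidelity=0.2, trials=4000, rng=rng)
print(f"XEB estimate: {xeb_fidelity(p, samples):.3f}")
print(f"MLE estimate: {mle_fidelity(p, samples):.3f}")
```

Both estimators should land near the true fidelity of 0.2 here; a design point worth noting is that the MLE uses only the likelihood of the mixture model, whereas the XEB estimator additionally relies on the Porter-Thomas normalization of the ideal distribution.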

III. Hypercontractivity and applications

In the paper "Global Hypercontractivity and its Applications" by Keevash, Lifshitz, Long, and Minzer,
and in several subsequent papers, a major new theory of hypercontractive inequalities was developed, yielding substantial progress on a large variety of problems in probabilistic and extremal combinatorics.
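For context, the classical hypercontractive (Bonami-Beckner) inequality that these "global" versions refine states that for $1 \le p \le q$ and the noise operator $T_\rho$ on the Boolean cube,

```latex
\|T_\rho f\|_q \le \|f\|_p
\qquad \text{whenever} \quad \rho \le \sqrt{\frac{p-1}{q-1}} ,
```

where $T_\rho f = \sum_{S} \rho^{|S|} \hat{f}(S)\,\chi_S$ and the norms are taken with respect to the uniform measure on $\{-1,1\}^n$. The global versions extend such bounds to settings (for example, biased product measures) where the classical inequality fails in general, under a "globalness" condition on $f$.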

IV. Social choice theory

Maskin conjectured that for three or more candidates, a certain relaxation of Arrow's famous condition allows only Borda's rule. Gabriel Gendler found a counterexample to Maskin's conjecture for three candidates, and proved that the assertion of the conjecture holds true when there are four or more candidates.
The PI and Lifshitz established sharp connections between the Shapley value - a power measure for voting rules - and the "sharp threshold" behavior.

V. Foundations of physics and computation, and related philosophical aspects of the project

Both noise and computation are related to foundational questions in physics ranging from the question of "What Nature computes?" to paradoxes about black holes. Galina Weinstein, a senior researcher in our project, explored foundational and historical questions in physics and, in particular, relations with black holes. In a paper from 2022, the PI argued that a world devoid of quantum computers supports the possibility of free will.

There was substantial research in three additional directions: VI. the study of models of statistical physics, VII. theoretical computer science, and VIII. Gaussian models in discrete geometry.
1. New powerful hypercontractive inequalities for global functions were found, leading to various applications in extremal combinatorics, additive combinatorics, theoretical computer science, representation theory, probability theory, and the study of models of statistical physics.
Fourier methods were also applied to social choice theory.

2. The PI further developed his theory explaining the impossibility of quantum computers and studied its physical consequences and connections with foundational questions in physics and the theory of computing.
If correct, this represents progress well beyond the state of common scientific understanding.

3. We developed statistical tools for data stemming from noisy intermediate scale quantum computers, and carefully examined statistical claims for "quantum supremacy" and other experimental claims for NISQ computers.
Our analysis has profound consequences for assessing where we stand today in experimental quantum computing.
(Figure captions: the levels of noise, computation ability, and noise sensitivity; the advantage of MLE compared to XEB for fidelity estimation; comparing the empirical distribution with simulation in the Google supremacy experiment; a lecture in Lahore, Pakistan.)