
Extremal Sparse Graphs and Graph Limits

Periodic Reporting for period 1 - ExtSpGraphLim (Extremal Sparse Graphs and Graph Limits)

Reporting period: 2017-09-01 to 2019-08-31

The project Extremal Sparse Graphs and Graph Limits (ExtSpGraphLim - REP-747430-1) focused on understanding large networks through quantitative measures. The problems studied in this project lie at the intersection of mathematics, statistical physics and computer science. The general problem of the area is to approximate a certain quantity of a network based on (partial) information about the network (the mathematical term for a network is a graph).

The information we get might be the whole graph, but in real-life applications it is more likely that we only get some local statistics of the graph. For instance, we may only get the degree distribution of the network, where the degree of a node is simply the number of its neighbours in the network. So it might occur that we are only given the information that 40% of the nodes have 3 neighbours, 30% of the nodes have 4 neighbours and another 30% of the nodes have 5 neighbours. Our task is to give a meaningful approximation of a prescribed quantity of the network based on such limited information. It might also occur that we can explore the network from some random vertices to some depth: for instance, we can follow links on the world wide web starting from some random websites to get more accurate (but still statistical) information about the network. Sometimes the graph (network) is not finite, but an explicitly given infinite graph; for instance, the infinite grid might be our network. This scenario is particularly common in statistical physics.

All these questions have a completely precise scientific counterpart. For instance, giving an efficient approximation algorithm for a graph parameter given the whole graph is probably one of the most classical problems in computer science. The mathematical language of graph limit theory can handle local statistics and infinite graphs. When the information about the graph is reduced to its degree distribution, the questions naturally land in the area of extremal graph theory, because the best approximation we can give is the minimum and maximum value of the quantity under the given constraints.
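
As a small illustration of such local statistics (on a hypothetical toy network, purely for exposition), the degree distribution can be read off directly from an adjacency list; a minimal Python sketch:

    from collections import Counter

    def degree_distribution(adjacency):
        """Return the fraction of nodes having each degree."""
        degrees = [len(neighbours) for neighbours in adjacency.values()]
        n = len(degrees)
        return {d: count / n for d, count in sorted(Counter(degrees).items())}

    # Hypothetical toy network: node -> set of neighbouring nodes
    network = {
        1: {2, 3, 4},
        2: {1, 3, 5},
        3: {1, 2, 4, 5},
        4: {1, 3, 5},
        5: {2, 3, 4},
    }
    print(degree_distribution(network))  # {3: 0.8, 4: 0.2}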

The importance of understanding large networks can hardly be overstated, as they surround us everywhere, be it the internet, networks of people (Facebook) or the neural network of a brain. Our focus was to develop tools for investigating the above-mentioned problems from a mathematical point of view, not to study particular networks. In short, our goal was to improve on existing ideas, be it the zero-free regions of graph polynomials, the understanding of a statistical-physics heuristic called the Bethe approximation, or advancing tools such as the use of graph covers. These questions also lead to purely mathematical questions such as Sidorenko's conjecture.
We submitted the following papers (with many more in preparation):

1. (with B. Szegedy) On Sidorenko's conjecture for determinants and Gaussian Markov random fields

2. (with F. Bencs) Note on the zero-free region of the hard-core model

3. (with M. Borbényi) On degree-constrained subgraphs and orientations

4. (with A. Imolay) Covers, factors and orientations

The first paper resolves a determinantal version of Sidorenko's conjecture, and it also builds a bridge between homomorphism numbers (important statistics of networks) and Gaussian Markov random fields, a very important concept in statistics and probability theory.
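
For context, the classical form of Sidorenko's conjecture states that for every bipartite graph H with e(H) edges and every graph G,

    t(H, G) \ge t(K_2, G)^{e(H)}, \qquad \text{where} \qquad t(H, G) = \frac{\hom(H, G)}{v(G)^{v(H)}}

is the homomorphism density of H in G; the paper establishes a determinantal analogue of this inequality.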

In the second paper we extended the zero-free region of the hard-core model on bounded-degree graphs. The hard-core model is one of the most fundamental models in statistical physics. It also has the special feature that many other models, such as the monomer-dimer model or the Widom-Rowlinson model, can be reduced to it. Establishing a zero-free region for such a model directly means that we can efficiently approximate many important quantities in certain parameter regimes of the model.
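
Concretely, for a graph G the hard-core model at activity \lambda is governed by the partition function (the independence polynomial)

    Z_G(\lambda) = \sum_{\substack{I \subseteq V(G) \\ I \text{ independent}}} \lambda^{|I|},

and a zero-free region is a domain in the complex plane on which Z_G(\lambda) \ne 0 for every graph G of a given maximum degree; on such domains quantities like \log Z_G(\lambda) can be approximated efficiently.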

In the third paper we study a tool called the gauge transformation. This is a remarkable tool that enables one to transform a computational problem into another one. It also provides a surprising path to the statistical-physics heuristic called the Bethe approximation. In this paper we used this tool to study subgraph counting problems together with the counting of orientations. We gained new proofs of, and insights into, old mathematical results.

In some sense the fourth paper was motivated by the third. We wanted to give simpler proofs for certain facts in the third paper that do not use the deep machinery of gauge transformations. This led to an interesting elementary method that seems very powerful. The method is essentially a double-counting argument providing unusual recursion formulas for certain natural graph parameters. It also turned out to be related to the method of graph covers, a method that was earlier developed to study the Bethe approximation. I found it extremely satisfying how all these ideas fit together.
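
To give a flavour of the graph-cover method, here is a minimal Python sketch (with a hypothetical edge-list input format) of the simplest cover construction, a random 2-lift: every vertex is doubled and every edge is lifted either in parallel or crossed, so the resulting graph covers the original one 2-to-1. This is only the generic construction, not the specific covers analysed in the paper.

    import random

    def random_two_lift(edges):
        """Return the edge list of a random 2-lift of the input graph.

        Each vertex v becomes two copies (v, 0) and (v, 1); each edge
        (u, v) is lifted either 'in parallel' or 'crossed' at random.
        Projecting (v, i) -> v maps the lifted edges 2-to-1 onto the
        original edges, i.e. the lift covers the base graph.
        """
        lifted = []
        for u, v in edges:
            if random.random() < 0.5:  # parallel lift
                lifted += [((u, 0), (v, 0)), ((u, 1), (v, 1))]
            else:                      # crossed lift
                lifted += [((u, 0), (v, 1)), ((u, 1), (v, 0))]
        return lifted

    # Toy example: a random 2-lift of the triangle on vertices 1, 2, 3
    print(random_two_lift([(1, 2), (2, 3), (1, 3)]))
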
There are many results that go well beyond the state of the art. Already the first paper establishes an interesting special case of Sidorenko's conjecture and at the same time builds a bridge between homomorphism numbers, Gaussian Markov random fields, logarithmic convergence and an upper bound on the number of spanning trees. It also reveals an interesting phase transition for regular graphs of large girth. The second paper considerably extends our understanding of the zero-free region of the hard-core model. The third paper not only simplifies the machinery of gauge transformations, but also explains old mathematical results about the number of Eulerian orientations. I am also confident that the new formulas obtained in the fourth paper will have many further applications.