CORDIS - EU research results

Connectivity in Complex Networks of interacting stochastic nonlinear systems. Applications in neuroscience

Final Report Summary - COCONET (Connectivity in Complex Networks of interacting stochastic nonlinear systems. Applications in neuroscience)

This report describes the activities carried out during my Marie Curie Fellowship in the Department of Mathematics and Statistics of the University of Melbourne, followed by the last year spent at GIPSAlab (CNRS) in Grenoble.

As a preamble, I should note that I arrived in Melbourne in August 2010 and that the Marie Curie Fellowship started
on March 1st, 2011. The work done during the six months prior to the effective beginning of the fellowship concerned the research proposed in the CoCoNet project in August 2009, and is therefore part of this report.

The common theme of the different activities I have developed is connectivity in complex networks, with applications to neural networks. These activities have led to the publications detailed below.

Pooling networks.

I am collaborating with Dr M. McDonnell, from the University of South Australia, Adelaide. Part of my time when I arrived was devoted to finishing a paper published at the end of 2010. This work concerns the possibility of applying stochastic pooling networks to nanoscale measurement devices, motivated by the fact that at the nanoscale electronic noise is non-negligible. We have therefore studied how the redundancy naturally provided by stochastic pooling networks can be exploited in the processing of information.

The collaboration has now turned to problems more closely related to the CoCoNet research project. We have begun a study of embedded pooling networks. By this we mean a stochastic pooling network embedded in a larger network, the activity of which may be considered as noise by the pooling network; the embedding thus provides the stochasticity of the pooling net. In the model adopted, the embedding net is a sparsely, randomly connected network of low-firing-rate pyramidal neurons and interneurons. The activity of this network is driven by thalamic inputs in the form of Poisson processes. The pooling network is composed of a set of identical pyramidal neurons. The common input to these neurons is provided by the simultaneous excitation of identical synapses in each neuron. There are a certain number of these synapses, each possessing its own weight or synaptic efficacy, and each being present on every neuron of the pooling net. The noise synapses are synapses between the neurons of the pooling net and neurons of the embedding network. As output of the pooling network, we look at the total number of spikes emitted by the neurons of the pooling net in a short time interval. We have shown that the correlation coefficient between the signal synapse weights and the output space-rate code improves with the size of the pooling net and is optimal for moderate signal variance. This first result suggests that parallel redundancy and background synaptic noise enable analog synaptic weights to be read out by a space-rate population code with a very small time lag. Aspects of this work have led to two conference communications. We are currently writing a paper on the subject, to be submitted hopefully during 2013. Interestingly, we are now also collaborating on this subject with M. S. To, from the Australian National University, Canberra, a PhD candidate working in experimental and theoretical neuroscience.
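
A minimal numerical sketch of this readout principle is given below. It is a deliberate caricature of the model just described: each pooling unit is reduced to a Bernoulli "fire / no fire" element with a sigmoidal response, and the embedding network is replaced by independent Gaussian background noise; all names and parameter values are illustrative. The point is only to show the trend we observed, namely that the correlation between the analog synaptic weights and the pooled spike count grows with the size of the pool.

    # Caricature of the pooling-network readout (illustrative, not the spiking model above).
    import numpy as np

    rng = np.random.default_rng(0)

    def pooled_readout(weights, pool_size, noise_std=1.0):
        """Pooled spike count in one short window, for each stimulated signal synapse."""
        counts = np.empty(len(weights))
        for j, w in enumerate(weights):
            # independent background "synaptic noise" for every unit (stands in for the embedding net)
            noise = noise_std * rng.standard_normal(pool_size)
            fire_prob = 1.0 / (1.0 + np.exp(-(w + noise)))         # sigmoidal response
            counts[j] = (rng.random(pool_size) < fire_prob).sum()  # space-rate code
        return counts

    weights = rng.normal(0.0, 1.0, size=50)          # analog synaptic efficacies
    for N in (1, 10, 100, 1000):
        r = np.corrcoef(weights, pooled_readout(weights, pool_size=N))[0, 1]
        print(f"pool size {N:4d}: corr(weights, pooled spike count) = {r:.3f}")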

Multivariate fractal signals.

In many applications (neuroscience, networks, economics) there is a need for new models of multivariate fractal signals. The multivariate aspect reflects the progress made in measurement in these fields, where it is nowadays common to have at hand several measurements of the same system recorded simultaneously from different sensors. For example, in neuroscience, functional Magnetic Resonance Imaging produces sets of about 100 signals recorded simultaneously, which allows the processing of information to be followed across the whole brain. These kinds of multivariate signals, another example of which is given by currency exchange rates, show strong irregularities at many scales and are well modeled by fractal (and multifractal) signals. Many models exist for monovariate signals but few for multivariate ones.

Over the last two years I have collaborated with J.F. Coeurjolly on the multivariate extension of fractional Brownian motion. Since my arrival in Australia, I have finished some theoretical work on the analysis of the process and worked extensively on its identification. This has led to three papers, two already published and one in press.
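
For reference, the covariance of a univariate fractional Brownian motion with Hurst index H, and one admissible parameterization of the cross-covariance in the multivariate extension studied here (written up to the exact sign conventions used in the published papers; rho_ij is the correlation at time 1 and eta_ij an asymmetry parameter), read:

    \mathbb{E}\,[B_H(t)\,B_H(s)] \;=\; \tfrac{\sigma^2}{2}\left(|t|^{2H}+|s|^{2H}-|t-s|^{2H}\right),

    \mathbb{E}\,[X_i(s)\,X_j(t)] \;=\; \tfrac{\sigma_i\sigma_j}{2}\left(w_{ij}(-s)+w_{ij}(t)-w_{ij}(t-s)\right),
    \qquad
    w_{ij}(h) \;=\; \left(\rho_{ij}-\eta_{ij}\,\mathrm{sign}(h)\right)|h|^{H_i+H_j},
    \quad H_i+H_j\neq 1.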

I have also initiated some work here at the Department of Mathematics and Statistics with Dr. Jones and Decrouez on the use of the so-called crossing tree for the estimation of the spectrum of multifractal signals. This work is at an early stage, but we have encouraging results. The interest of the technique is that the crossing tree is an adaptive tool. These first results have been submitted for publication in a conference.

Directed information theory.

I have finalized the work on the connection between directed information theory and Granger causality. In a first step, the connection was established for Gaussian processes in particular, and this led to the submission of a first paper. However, one definition used in this first work appeared too strong, and I have worked on a weakened version of it. This weakened definition essentially concerns what is called instantaneous coupling between signals. This type of coupling can occur in practice due to essentially two possible ingredients: phenomena between two signals that are faster than the time constant of the measurement devices may appear instantaneous; and two signals may appear instantaneously related if they are linked via a third party which is neither measured nor known.

It appears that two definitions of instantaneous coupling may be given. The link between Granger causality and directed information theory then depends on which definition is adopted. All this is the subject of the last paper, submitted in April 2012. Furthermore, we have written a review paper on the subject which appeared recently in the journal Entropy. Finally, I was invited to the first workshop on directed information theory, held in May 2012 in Munich, to present my work on this subject.
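
For completeness, the central quantity here is Massey's directed information from X^N = (X_1, ..., X_N) to Y^N, recalled below next to the ordinary mutual information: at time n, only the past and present X^n of the putative cause is conditioned upon, which is exactly what gives the measure its directional, Granger-like flavor. The precise equivalence with Granger causality, and its weakening in the presence of instantaneous coupling, is what the papers mentioned above establish.

    I(X^N \rightarrow Y^N) \;=\; \sum_{n=1}^{N} I\!\left(X^{n};\, Y_n \,\middle|\, Y^{n-1}\right)
    \qquad\text{versus}\qquad
    I(X^N ; Y^N) \;=\; \sum_{n=1}^{N} I\!\left(X^{N};\, Y_n \,\middle|\, Y^{n-1}\right).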

Linear Granger causality and Partial Directed Coherence.

In the case of linear models, Granger causality inference essentially relies on the identification of multivariate ARMA models, with all the problems that this type of inference raises (notably the choice of the model orders). In the linear case, once the models have been identified, a frequency-domain interpretation can be provided by the so-called Partial Directed Coherence.
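
As a reminder, in its original parametric form (the one the estimator described below seeks to avoid), the partial directed coherence from signal j to signal i at normalized frequency f is defined from the coefficients A_k of a vector autoregressive model:

    x(t) \;=\; \sum_{k=1}^{p} A_k\, x(t-k) + \varepsilon(t),
    \qquad
    \bar{A}(f) \;=\; I - \sum_{k=1}^{p} A_k\, e^{-2i\pi f k},
    \qquad
    \pi_{ij}(f) \;=\; \frac{\bar{A}_{ij}(f)}{\sqrt{\sum_{k} \left|\bar{A}_{kj}(f)\right|^{2}}}.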

I have developed a new estimator of the partial directed coherence which does not require the estimation of a model order and which is very fast. It relies on spectral estimation and uses a spectral factorization method (a Newton-Raphson based algorithm) which converges quadratically. We found that this idea has been used once recently but was, in our opinion, poorly exploited (it served to build a Granger causality measure in the time domain). Our algorithm, by contrast, allows the transfer of energy between the different signals to be studied in all frequency bands. Prior to publication, we need to study the statistical significance of the estimates, which is a difficult task; we are currently working on this. Furthermore, we have used the technique on data provided by S. Crochet, Univ. Lyon, measured by intracranial electrodes in the rat. For the different sleep states, our technique allows us to see which parts of the brain communicate with which others. Neuroscientists are currently investigating the results of these analyses.

The method has been submitted to Biological Cybernetics and is still under review at the date of this report. The paper can be accessed on the arXiv website under the reference arXiv:1311.6345.

Granger causality and nonlinear regression using kernel methods.

One of the goals of CoCoNet is to study the use of reproducing kernel Hilbert space techniques for the study of connectivity in networks. I have developed an original approach to testing Granger causality using a particular form of nonlinear regression based on kernel methods. The proposal relies on a Bayesian view of kernel methods and is called the Gaussian process regression approach. In this regression approach, the regression function is sought in a functional space, and a Bayesian point of view is adopted, giving this function a Gaussian process prior whose covariance function is the kernel. The Bayesian solution is exactly the same as the usual regularized machine-learning solution. However, a free by-product of the Bayesian approach is an evaluation of the model (its marginal likelihood, or evidence), which is used in the testing procedure. This method was presented as a lecture at the last international conference on signal processing (ICASSP 2012).
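
A minimal sketch of the evidence-based test is given below, under illustrative choices (scikit-learn's Gaussian process regressor, an RBF plus white-noise kernel, and a small lag order p); it is not the exact procedure of the ICASSP paper. Two models for the present of y are fitted, one on the past of y alone and one on the joint past of (x, y), and the difference of their log marginal likelihoods is used as the causality score.

    # Illustrative sketch of evidence-based Granger testing with GP regression.
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel

    def lagged(z, p):
        """Matrix whose rows are (z[t-1], ..., z[t-p]) for t = p .. len(z)-1."""
        return np.column_stack([z[p - k: len(z) - k] for k in range(1, p + 1)])

    def gp_evidence(X, y):
        """Log marginal likelihood of a GP regression of y on X."""
        kernel = RBF(length_scale=1.0) + WhiteKernel(noise_level=1.0)
        gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)
        return gpr.log_marginal_likelihood_value_

    def gp_granger_score(x, y, p=3):
        """Evidence gain when the past of x is added to the past of y."""
        target = y[p:]
        past_y = lagged(y, p)
        past_xy = np.hstack([lagged(y, p), lagged(x, p)])
        return gp_evidence(past_xy, target) - gp_evidence(past_y, target)

    # Toy example: x drives y nonlinearly, not the other way round.
    rng = np.random.default_rng(1)
    x = rng.standard_normal(400)
    y = np.tanh(np.roll(x, 1)) + 0.3 * rng.standard_normal(400)
    print("x -> y score:", gp_granger_score(x, y))   # expected clearly positive
    print("y -> x score:", gp_granger_score(y, x))   # expected near zero or negative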

From February 2012 to August 2012, I worked with a Master's student from Grenoble Institute of Technology on a direct use of regression with kernel machines for Granger causality testing.
In this work, which was published at the IEEE workshop on Machine Learning for Signal Processing 2012, we apply kernel regression to test both dynamical Granger causality and instantaneous coupling. The whole procedure is described, from the optimization of the kernel parameters using cross-validation to the testing procedure, including statistical significance (based on an approximate surrogate-data approach) and multiple-testing corrections.
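
The significance assessment can be sketched as follows, using circular time shifts of the candidate cause as a simple stand-in for the approximate surrogate scheme of the paper (the actual surrogate generation and the multiple-testing correction are described in the publication).

    # Surrogate-based significance for a generic causality score (illustrative scheme).
    import numpy as np

    def surrogate_pvalue(score, x, y, n_surrogates=99, min_shift=20, rng=None):
        """One-sided p-value of score(x, y) against time-shifted surrogates of x."""
        rng = np.random.default_rng() if rng is None else rng
        observed = score(x, y)
        shifts = rng.integers(min_shift, len(x) - min_shift, size=n_surrogates)
        null = np.array([score(np.roll(x, s), y) for s in shifts])
        return (1 + np.sum(null >= observed)) / (n_surrogates + 1)

    # e.g., p = surrogate_pvalue(gp_granger_score, x, y)  # reusing the sketch above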


Toolbox

During the last year, all the code developed in the previous three items (directed information theory, linear Granger causality, kernel methods) has been put into a toolbox, in both Python and Matlab, available at the address
http://www.gipsa-lab.grenoble-inp.fr/~guillaume.becq/src/dinfo_v1p0.zip
This toolbox was written with Guillaume Becq, a CNRS engineer, who translated the Matlab code into Python and collected all the different codes into a user-friendly toolbox. It is one of the main outcomes of CoCoNet.





Sparsification for kernel independence measures

Another approach to using kernels to assess Granger causality is to define conditional independence measures based on embeddings of probability measures in an RKHS. Such measures are very appealing because they basically evaluate covariances of nonlinear transforms of the data; they are therefore quite easily implemented. This idea has been developed over the last ten years by several people in the machine learning community. However, for practical applications, these measures suffer from an important problem: they rely on a so-called Gram matrix whose size is that of the data set analysed. Thus, when the data set is large, or when it comes to defining a time-varying measure, an approximation of the Gram matrix must be used. Usual approximations are based, for example, on low-rank factorizations.
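
The prototype of such a measure is the (unconditional) Hilbert-Schmidt independence criterion, which reduces to a simple expression involving the two Gram matrices; a biased empirical estimate is sketched below with a Gaussian kernel (the conditional variants used for causality assessment are built from the same ingredients).

    # Biased empirical HSIC estimate between two samples x, y of shape (n, d).
    import numpy as np

    def rbf_gram(x, bandwidth=1.0):
        """Gram matrix of a Gaussian (RBF) kernel on a set of vectors."""
        sq = np.sum((x[:, None, :] - x[None, :, :]) ** 2, axis=-1)
        return np.exp(-sq / (2.0 * bandwidth ** 2))

    def hsic(x, y, bandwidth=1.0):
        """Biased HSIC estimate: (1/n^2) trace(K H L H), H the centering matrix."""
        n = len(x)
        K, L = rbf_gram(x, bandwidth), rbf_gram(y, bandwidth)
        H = np.eye(n) - np.ones((n, n)) / n
        return np.trace(K @ H @ L @ H) / n ** 2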

With the goal of obtaining an on-line measure of causality (for example, to study the exchange of information between two parts of the brain as time and the environment evolve), we need to design an on-line version of the independence measure, and hence to implicitly answer the question of approximating the Gram matrix (since the size of the data set grows linearly in on-line processing). For this problem we have imported an idea used in on-line kernel filtering: build from the data a dictionary of informative samples, and test each newly arriving sample to judge whether it is pertinent to include it in the dictionary. Based on this strategy, we have designed a recursive version of the kernel-based independence measure. Furthermore, we have provided an interpretation of the sparsification technique in terms of vector quantization of the original data. This work is in progress, but the first results have been accepted for publication at the next IEEE ICASSP in Vancouver. Our goal now is to understand the method theoretically and to link it with the information bottleneck method, which constitutes a means for defining approximate sufficient statistics.
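
The dictionary construction borrowed from on-line kernel filtering can be summarized by a simple coherence-type rule, sketched below with an illustrative threshold mu0: a newly arriving sample enters the dictionary only if its largest kernel similarity to the current dictionary elements stays below mu0; otherwise it is regarded as redundant and quantized onto the existing elements (the recursive update of the independence measure itself is not shown).

    # Coherence-based dictionary sparsification (illustrative threshold and kernel).
    import numpy as np

    def rbf(a, b, bandwidth=1.0):
        return np.exp(-np.sum((a - b) ** 2) / (2.0 * bandwidth ** 2))

    def build_dictionary(stream, mu0=0.9, bandwidth=1.0):
        """Return the retained samples; the rest are quantized onto the dictionary."""
        dictionary = []
        for sample in stream:
            coherence = max((rbf(sample, d, bandwidth) for d in dictionary), default=0.0)
            if coherence < mu0:          # sample brings new information: keep it
                dictionary.append(sample)
        return np.array(dictionary)

    # e.g. on a 1-D stream:
    rng = np.random.default_rng(2)
    stream = rng.standard_normal((1000, 1))
    D = build_dictionary(stream, mu0=0.95)
    print(len(D), "dictionary elements retained out of", len(stream))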

Others.

I have a collaboration on nonlinear filtering on manifolds and on signals indexed by groups with the group of J. Manton, Electrical Engineering Dept, University of Melbourne. A short conference paper concerning particle filtering on manifolds was published at the selective conference MNST in 2012. We have since pursued our collaboration in another direction. Stochastic processes indexed by groups have been studied for a long time, the first and fundamental results dating back to the early sixties. However, it appears that nothing had been developed concerning generative models for such processes. We have worked with S. Said on the theoretical development of second-order partial differential equations on compact Lie groups whose input is a white noise indexed on the group. We have studied the properties of the solutions in terms of their Sobolev regularity. A paper on the subject will be published soon. It is rather theoretical but opens the way to useful models of processes, for example diffusions on the sphere.
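
To fix ideas, a stripped-down instance of this kind of model (the published construction is more general) is the elliptic equation below on a compact Lie group G of dimension d, with Laplace-Beltrami operator Delta, a constant c > 0 and a spatial white noise W; expanding the solution on the eigenfunctions of Delta and invoking Weyl's law for the growth of the eigenvalues, lambda_k of order k^{2/d}, gives its Sobolev regularity.

    (c - \Delta)\,X \;=\; W,
    \qquad
    X \;=\; \sum_{k} \frac{\langle W, e_k \rangle}{c + \lambda_k}\, e_k,
    \qquad \Delta e_k = -\lambda_k e_k,

    \mathbb{E}\,\|X\|_{H^s(G)}^2 \;=\; \sum_k \frac{(1+\lambda_k)^s}{(c+\lambda_k)^2} \;<\; \infty
    \quad\Longleftrightarrow\quad s \;<\; 2 - \tfrac{d}{2},

so that, in this simplified setting, X belongs almost surely to the Sobolev space H^s(G) for every s < 2 - d/2.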


Research organisation:

As announced in Annex B of CoCoNet in 2009, I have also spent time creating an interdisciplinary group at GIPSAlab which federates researchers of the lab working on the brain. This group, entitled 'cerveau@gipsa' or 'brain@gipsa', is intended to be one of the interdisciplinary groups of the new GIPSA. The group started in February 2014. For the moment, the members meet once a month for a seminar. It is expected that members will collaborate and that this group will act as a fertilizer for cross-disciplinary research.