Content archived on 2024-06-18

Understanding the interaction between timescales of single neurons, networks and the environment


Long timescales support neuron activity and the amazing human brain

Physical actions are the result of interactions between current external stimuli and a lifelong history. Memory is a crucial part of a fully functional nervous system, giving rise to a phenomenal range of features such as adaptability, plasticity and survival mechanisms.

Fundamental Research

Neurons in the brain are connected in vast networks, and this connectivity endows properties that are not found in individual neurons. Memory, for example, has been studied extensively and is the result of the presence of long timescales at the network level. If the memory of a stimulus is beneficial long after the stimulus has ended, long timescales are needed. A large body of research has pointed to the network level as the source of these timescales. Individual neurons, however, do have biophysical processes with long timescales. These processes have usually been studied using detailed models that make it hard to connect them to the network level. The EU-funded project Multiple timescales, under the Marie Curie Action “Career Integration Grants”, worked to understand the dynamical and computational implications of connecting many single neurons, each endowed with multiple timescales.

Simple instructions and a complex model

As Professor Omri Barak, project coordinator, explains, “the command ‘Take a left turn at the next junction’, while seemingly easy to interpret, involves integration of information across at least two different timescales.” The sensory information of the word ‘take’ has to be retained for about a second, until the sentence is over. The meaning of the sentence, on the other hand, has to be retained for about a minute, until the junction is actually reached. This example presents two timescales out of a large range relevant for daily functioning alone. “We proposed a novel model that provides a very good approximation of single neuron excitability over prolonged timescales, while remaining relatively simple,” outlines Prof. Barak. These results have been presented at international conferences and published in the peer-reviewed Journal of Neuroscience.
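The published model is more detailed, but the basic idea of a single neuron carrying more than one intrinsic timescale can be sketched with a toy rate model: a fast activity variable coupled to a slow adaptation variable, so that a brief stimulus leaves a trace long after it ends. All parameters and equations below are illustrative assumptions for this sketch, not the project's actual model.

```python
import numpy as np

def simulate(stimulus, dt=1.0, tau_fast=10.0, tau_slow=1000.0, g=1.0):
    """Toy two-timescale neuron (illustrative, not the published model):
    x is a fast activity variable, a is a slow adaptation variable."""
    x, a = 0.0, 0.0
    xs, adapts = [], []
    for s in stimulus:
        x += dt / tau_fast * (-x + s - g * a)  # fast dynamics (~10 time steps)
        a += dt / tau_slow * (-a + x)          # slow adaptation (~1000 time steps)
        xs.append(x)
        adapts.append(a)
    return np.array(xs), np.array(adapts)

# A brief pulse: the fast variable tracks it, then relaxes quickly,
# while the slow variable retains a trace long after the pulse ends.
stim = np.zeros(5000)
stim[100:200] = 1.0
x, a = simulate(stim)
```

The point of the toy model is that the memory of the pulse lives in the slow variable: hundreds of steps after the stimulus, the fast variable has relaxed but the adaptation variable still carries information about the past input.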
Furthermore, he continues, “I gained a deeper understanding of the dynamics of trained recurrent neural networks without excitability dynamics.” Even our simplest actions, like raising a hand, involve millions of neurons in the brain connected in a complex network. New analysis sheds light on the interplay between the simplicity of the network’s output and the complexity of its internal dynamics. As a model, the team relies on artificial neural networks: engineering systems inspired by the brain’s structure.

Human level performance from artificial neural networks

Recently these networks have achieved human level performance in areas such as image and speech recognition. The initial connectivity of the network is random, and training sculpts the connectivity to obtain the desired output. How training affects the dynamics of these artificial networks was previously unknown. Multiple timescales showed how the demands imposed on the network’s output translate to modifications of the internal dynamics. “We found that a small number of dynamical modes are recruited to support a desired outcome.” Also presented at seminars and international conferences, the results have been published in Physical Review Letters and in an invited opinion paper in Current Opinion in Neurobiology.

Research progress continues into the future

Inroads continue into how trained recurrent neural networks produce sophisticated output, such as playing a game of tennis. Using a reverse engineering approach, Prof. Barak has started to analyse the effect of training networks in which individual neurons have slow excitability dynamics. “Specifically, we are now training recurrent neural networks that are controlling an agent navigating in a virtual environment. We showed that incorporating slow excitability dynamics can enhance the spatial memory of the agent and are investigating how this happens,” he outlines.

Prof. Barak describes how the Marie Curie funds were invaluable during the initial period, before other sources of funding were obtained. As the work evolved, being able to attend conferences as a team contributed to the formation of the lab. “Apart from making significant headway into how multiple timescales can result in top level brain function through neural networks, the career integration grant helped me, as the name implies, integrate into my new job as principal investigator,” concludes Prof. Barak.
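One common way to picture the finding that training recruits only a small number of dynamical modes is to view training as adding a low-rank component to an initially random connectivity matrix; the recruited modes then show up as outlier eigenvalues escaping the bulk spectrum of the random part. The toy calculation below illustrates that picture; the network size, gain and mode strength are assumptions made for the sketch, not figures from the published papers.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 500   # network size (illustrative)
g = 0.8   # gain of the random part; its eigenvalues fill a disk of radius ~g
J = g * rng.standard_normal((N, N)) / np.sqrt(N)  # "untrained" random connectivity

# Caricature of training: add a single rank-1 mode of strength lambda_.
m = rng.standard_normal(N)
m /= np.linalg.norm(m)
lambda_ = 2.0
W = J + lambda_ * np.outer(m, m)  # "trained" connectivity

eig_random = np.linalg.eigvals(J)
eig_trained = np.linalg.eigvals(W)

# The random spectrum stays inside a disk of radius ~g, while the trained
# matrix develops an outlier eigenvalue near lambda_: one recruited mode.
n_outliers = int(np.sum(np.abs(eig_trained) > 1.2))
```

In this caricature, a rank-1 change to the connectivity pulls a single eigenvalue out of the random bulk, which is one way to visualise how a trained output can recruit a small number of modes while leaving the rest of the internal dynamics largely unchanged.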

Keywords

Multiple timescales, network, neuron, long timescales, memory, artificial neural network
