Understanding the interaction between timescales of single neurons, networks and the environment

Final Report Summary - MULTIPLE TIMESCALES (Understanding the interaction between timescales of single neurons, networks and the environment)

Adaptation over multiple timescales is evident at every level of organization. Our actions are the result of the interactions between current external stimuli and a lifelong history. Likewise, the response of single neurons to a stimulus depends on a long stimulation history. This observation implies that a neural system is usually encountered in a different state each time it is stimulated or observed, and hence adaptation has profound implications both for decoding neural activity and for stimulating neural systems. The aim of this project was to understand the interaction between timescales from three sources: those present in individual neurons, those emerging from neural networks, and those presented to a network through the environment.

On the single-neuron front, our group refined existing models of single neurons through analysis of recent experimental data. Specifically, we developed a novel computational model of the long-term dynamics of excitability in single neurons. The model provides a compact description that fits the neural response to a wide range of stimuli.
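To make the idea of long-term excitability dynamics concrete, here is a minimal illustrative sketch, not the group's published model: a threshold unit whose excitability adapts over several hypothetical timescales, so its response to a fixed stimulus depends on the full stimulation history. All parameter values are invented for illustration.

```python
import numpy as np

dt = 1.0                               # time step (ms)
taus = np.array([50., 500., 5000.])    # hypothetical adaptation timescales (ms)
weights = np.array([0.5, 0.3, 0.2])    # contribution of each timescale to threshold
theta0 = 1.0                           # baseline firing threshold
a = np.zeros(3)                        # adaptation variables, one per timescale

def step(stimulus, a):
    # Effective threshold rises with accumulated adaptation
    theta = theta0 + weights @ a
    spike = stimulus > theta
    # Each adaptation variable decays exponentially on its own timescale
    a = a * np.exp(-dt / taus)
    if spike:
        a = a + 0.1                    # every spike increments all variables
    return spike, a

# Drive with a constant stimulus: the slow variables build up, so the
# firing rate early in the stimulation exceeds the rate late in it.
spikes = []
for t in range(20000):
    s, a = step(1.2, a)
    spikes.append(s)

early, late = sum(spikes[:2000]), sum(spikes[-2000:])
print("early spikes:", early, "late spikes:", late)
```

Because the slowest variable relaxes over thousands of steps, two identical stimuli delivered at different points in the history evoke different responses, which is the sense in which the system is "in a different state each time it is observed".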

On the network and environment front, we developed a deeper understanding of the dynamics of trained recurrent neural networks (without excitability dynamics). Even our simplest actions, like raising a hand, involve millions of neurons in the brain connected in a complex network. Our new analysis sheds light on the interplay between the simplicity of the network's output and the complexity of its internal dynamics. As a model, we rely on artificial neural networks: engineered systems, inspired by the brain’s structure, that have recently achieved human-level performance in areas such as image and speech recognition. The initial connectivity of the network is random, and training sculpts the connectivity to obtain the desired output. Until now, it was not known how training affects the dynamics of these artificial networks, let alone their biological counterparts. In this work, we showed how the demands imposed on the network's output translate into modifications of the internal dynamics. We found that a small number of dynamical modes are recruited to support the desired outcome.
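The picture of a few recruited dynamical modes can be sketched numerically. The following is an illustrative toy example, not the project's actual analysis: a random recurrent network whose connectivity carries a hypothetical rank-one "training" modification, with PCA applied to the resulting activity to count how many modes dominate. The gain, the rank-one strength, and all other parameters are assumptions chosen for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
N, T, dt = 200, 2000, 0.05
g = 0.9                                       # gain of the random connectivity

# Random baseline connectivity plus a rank-one component standing in for
# the low-dimensional structure that training adds (hypothetical strength).
J_rand = g * rng.normal(0.0, 1.0 / np.sqrt(N), (N, N))
u = rng.normal(size=(N, 1))
v = rng.normal(size=(N, 1))
J = J_rand + (1.2 / N) * u @ v.T

# Simulate rate dynamics: dx/dt = -x + J * tanh(x)
x = rng.normal(size=N)
traj = np.empty((T, N))
for t in range(T):
    x = x + dt * (-x + J @ np.tanh(x))
    traj[t] = x

# PCA on the activity: how many modes are needed for 95% of the variance?
X = traj - traj.mean(axis=0)
var = np.linalg.svd(X, compute_uv=False) ** 2
frac = np.cumsum(var) / var.sum()
n_modes = int(np.searchsorted(frac, 0.95) + 1)
print("modes for 95% of variance:", n_modes)
```

In this toy setting the activity of all 200 units concentrates in far fewer than 200 principal components, illustrating how a network's internal dynamics can be effectively low-dimensional even when its connectivity is large and disordered.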

The overall approach of using trained recurrent neural networks has received increasing attention in recent years. I wrote an invited opinion paper on this topic and was invited to present the approach at many international venues.

During the grant period, my research group became integrated into both the Rappaport Faculty of Medicine at the Technion and the interdisciplinary Network Biology Research Laboratories there. I have received external research grants from multiple sources and have supervised several graduate students. My teaching of undergraduate students was recognized with an excellence award, and the lab’s work was presented at many international meetings and conferences and published in peer-reviewed journals.