
Cognition in an Insect Brain

Periodic Reporting for period 3 - COGNIBRAINS (Cognition in an Insect Brain)

Reporting period: 2023-07-01 to 2024-12-31

There is a common perception that larger brains mediate higher cognitive capacity. Social insects, however, demonstrate that sophisticated cognition is possible with miniature brains. Honey bees display forms of higher-order learning that are unique among insects, mediated by a miniature brain of only 950 000 neurons. Yet results on cognitive processing in bees were mostly obtained from studies using free-flying honey bees trained to solve visual discriminations. Such experiments make it difficult to access the neural mechanisms underlying visual cognition, as they involve flying animals. Access to the neural bases of cognitive processing has therefore not been possible, even though the relatively low number of neurons in the bee brain should render this goal attainable.

Our project aims at filling this void by developing virtual-reality (VR) environments in which tethered honey bees walk stationary while immersed in a virtual landscape where their own movements define object displacements and expansions. This unique environment offers an exceptional opportunity to uncover the minimal circuits that mediate higher-order forms of cognitive processing in the brain of a behaving bee. We have recently shown that bees learn visual discriminations in this experimental context, which allows integrating behavioral, neurobiological and computational approaches to unravel the neural mechanisms underlying different levels of learning in the honey bee. The goal of this project is thus to combine behavioral recordings of bees learning simple and complex discriminations in a VR environment with access to their brain via multiphoton calcium imaging and multielectrode recordings of neural populations. In this way, the neural circuits and computations necessary for elemental and non-elemental visual learning will be determined along the visual circuit of the bee brain. Integrating all project levels will expand the currently available information on the neurobiology of insect learning and provide the first complete account of the mechanisms that underlie visual cognition in a miniature nervous system.
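To make the closed-loop principle concrete, the sketch below shows one way the bee's walking movements could drive object displacement and expansion in such a setup. It is a minimal illustration, not the project's actual software: the names, data structures and parameters are assumptions, and the only grounded elements are the described inputs (the tethered bee's forward and turning movements) and outputs (lateral displacement and expansion of virtual objects).

```python
import math
from dataclasses import dataclass

# Illustrative sketch (hypothetical names/parameters): the tethered bee's
# walking movements update a virtual viewpoint, so objects shift laterally
# and expand as the bee "approaches" them.

@dataclass
class VirtualObject:
    x: float        # position in the virtual arena (arbitrary units)
    y: float
    radius: float   # physical size of the object

@dataclass
class BeeState:
    x: float = 0.0
    y: float = 0.0
    heading: float = 0.0   # radians; 0 = facing along +x

def update_view(bee, d_forward, d_yaw, objects):
    """Advance the bee's virtual pose from one movement reading and return
    (azimuth, angular_size) per object: where it appears and how large it looks."""
    bee.heading += d_yaw
    bee.x += d_forward * math.cos(bee.heading)
    bee.y += d_forward * math.sin(bee.heading)
    view = []
    for obj in objects:
        dx, dy = obj.x - bee.x, obj.y - bee.y
        dist = math.hypot(dx, dy)
        azimuth = math.atan2(dy, dx) - bee.heading        # lateral displacement
        angular_size = 2 * math.atan2(obj.radius, dist)   # expansion on approach
        view.append((azimuth, angular_size))
    return view

# Example: the bee walks straight toward a target over three frames.
bee = BeeState()
target = VirtualObject(x=10.0, y=0.0, radius=1.0)
for step in range(3):
    az, size = update_view(bee, d_forward=2.0, d_yaw=0.0, objects=[target])[0]
    print(f"step {step}: azimuth {az:.2f} rad, angular size {size:.2f} rad")
# The angular size grows as the bee approaches: the 3D "expansion" cue.
# A 2D variant would hold distance fixed and update the azimuth only.
```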
Despite the difficulties imposed by the COVID pandemic, we rendered our VR system for honey bees fully operational. Starting from a 2D VR system in which visual objects could be displaced only laterally but not in three dimensions (3D), we moved to a naturalistic and immersive 3D VR. Work in the 2D VR led to a first publication showing that bees learn to discriminate visual stimuli in this artificial landscape. In the 3D VR, the study of visual learning was enriched by analyses of motor components as bees explored the virtual arena and exhibited motor patterns consistent with their learning performances. We showed that attentional processes play a fundamental role in learning visual stimuli and that a trade-off between speed and learning accuracy was visible in the VR. These results were published in two scientific articles and opened the door to our first analyses of the underlying neural activity: using an immediate early gene (IEG) approach, we detected for the first time the areas of the bee brain that are activated during associative visual learning. Using 2D and 3D VR, we compared two different forms of visual learning, one more restrictive (2D) and one allowing exploratory components (3D). We showed that the neural phenomena underlying these two forms of learning differ, as they lead to either excitatory (3D VR) or inhibitory (2D VR) neural activity in key structures of the visual circuit such as the optic lobes and the mushroom bodies.

Our current electrophysiological investigations focus on mushroom-body output neurons. The first data are very promising, as bees learn in our preparation, which combines neuron recording with electromyograms of the muscle that controls appetitive responding and video recordings of the animal’s performance. In the case of calcium-imaging recordings, our analyses were delayed by the late delivery of the multiphoton imaging microscope. We have since synchronized animal stimulation with image acquisition and coupled the VR system with the multiphoton microscope. To optimize the color stimulation delivered by our VR system, we focused on honey bee opsins and imaged the compound eyes and ocelli using fluorescence in situ hybridization. We have established the parameters for successful calcium imaging and developed a versatile pipeline for the analysis of calcium data. We are currently working to record from key visual areas of the bee brain.
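The contents of the analysis pipeline are not described here, so the following is only a sketch of one standard step that such pipelines commonly include: converting raw fluorescence traces to ΔF/F0, the relative fluorescence change used as a proxy for neural activity. The function name, baseline convention and simulated data are assumptions for illustration.

```python
import numpy as np

# Hypothetical sketch of a common calcium-analysis step (dF/F0 normalization);
# the project's actual pipeline is not public.

def delta_f_over_f(traces, baseline_frames=50):
    """traces: (n_rois, n_frames) raw fluorescence, one row per region of
    interest. F0 is estimated as the mean of the pre-stimulus frames."""
    f0 = traces[:, :baseline_frames].mean(axis=1, keepdims=True)
    return (traces - f0) / f0

# Example: 3 ROIs, 200 frames, with a simulated transient in ROI 0.
rng = np.random.default_rng(0)
raw = 100.0 + rng.normal(0.0, 1.0, size=(3, 200))
raw[0, 80:120] += 20.0                 # response after "stimulus onset"
dff = delta_f_over_f(raw)
print(dff[0, 80:120].mean())           # ~0.2, i.e. a 20% fluorescence rise
```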

In addition, we produced a review on conceptual learning by honey bees, which corresponds to the framework of the project, and an educational review on the merits of C.H. Turner, the first African American to study insect cognition, who was unfairly ostracized during his lifetime.

Lastly, despite the COVID pandemic, an intensive dissemination activity focusing on honey bees was developed to raise public awareness about the changes that we are imposing on our environment. The focus was set on schools, citizen and beekeeper associations, and extended beyond France. The actions started with the ERC project and expanded progressively towards questions that call for a reconsideration of the human role in an environment that we share with animal species whose cognitive capacities we largely ignore.
A major goal reached was the study of different forms of visual learning under distinct VR conditions. This breakthrough goes beyond the state of the art, as it allowed the comparison of learning forms of different complexity at both the behavioral and neural levels within common virtual scenarios. Under 2D and 3D VR conditions, we determined whether associative learning leads to changes in IEG expression in specific areas of the bee brain; this kind of analysis had never been performed for controlled associative-learning protocols. In both cases, bees learned to discriminate the virtual stimuli and retained the information learned. In the 3D VR, successful learners exhibited IEG up-regulation in the mushroom bodies, uncovering a privileged involvement of these brain regions in associative color learning. In the 2D VR, closed-loop conditions restricted stimulus control to lateral displacements, and successful learning led to a down-regulation of IEGs in the main regions of the visual circuit, the optic lobes and the mushroom bodies. This is consistent with an inhibitory trace that may relate to the motor patterns required to solve the discrimination task, which differ from those underlying path-finding in 3D VR scenarios allowing navigation and exploratory learning. Our work thus revealed for the first time that color-discrimination learning in VR induces a neural signature that is distributed along the sequential pathway of color processing and that varies with the nature of the VR and of the learning task.
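IEG up- and down-regulation of the kind reported here is typically quantified by relative expression from RT-qPCR data; whether the project used exactly this method is an assumption. As a minimal sketch, the standard 2^-ΔΔCt fold-change calculation looks like this, with all numbers invented for illustration:

```python
# Hedged sketch of the standard 2^-ddCt relative-expression calculation
# often used to quantify IEG regulation from qPCR cycle-threshold (Ct) values.

def relative_expression(ct_gene_test, ct_ref_test, ct_gene_ctrl, ct_ref_ctrl):
    """Fold change of a target gene in learners vs. controls,
    normalized to a reference (housekeeping) gene."""
    d_ct_test = ct_gene_test - ct_ref_test
    d_ct_ctrl = ct_gene_ctrl - ct_ref_ctrl
    return 2.0 ** -(d_ct_test - d_ct_ctrl)

# A lower Ct means more transcript, so learners with lower target Ct
# than controls come out as up-regulated.
print(relative_expression(22.0, 18.0, 24.0, 18.0))  # 4.0  -> 4-fold up-regulated
print(relative_expression(26.0, 18.0, 24.0, 18.0))  # 0.25 -> down-regulated
```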

The incorporation of electrophysiological and calcium-imaging analyses of brain activity during learning in VR will enable an unprecedented understanding of the miniature neural architectures that mediate simple and complex forms of visual learning. We hope, in this way, to influence future directions in a broad spectrum of research fields, especially those comparing humans with one of the best-known invertebrate systems.
[Image: virtual-reality-bee.png]