CORDIS - EU research results

Neural Computations Underlying Social Behavior in Complex Sensory Environments

Periodic Reporting for period 2 - NeuSoSen (Neural Computations Underlying Social Behavior in Complex Sensory Environments)

Reporting period: 2021-08-01 to 2023-01-31

Animals often interact in groups. Animal groups constitute complex sensory environments which challenge the brain and engage complex neural computations. This behavioral context is therefore fruitful for understanding how sophisticated neural computations give rise to behavior in the healthy and diseased brain. However, it is also technically challenging, since many of the relevant sensory cues arise from the members of the group themselves and are therefore hard to quantify or control. Consequently, we only incompletely understand how the brain drives complex social behaviors in naturalistic contexts.
To uncover the neural computations underlying social behavior in groups, we are using Drosophila, which provides unprecedented experimental access to the nervous system via genetic tools. Drosophila gathers on rotten fruit to feed and mate. Courtship and aggression dominate social interactions and rely on the recognition of sex-specific chemical cues and the production of context-specific acoustic signals. How are these multi-modal cues integrated to control and switch between different modes of social interaction? How is unstable and conflicting sensory information resolved to promote stable behavioral strategies? How does sensory processing adapt to socially crowded environments in order to efficiently target behavior at individual members of the group?
We address these issues by combining computational modeling and genetic tools. Using machine learning, we quantify and model the fine structure of social interactions to identify the social cues that drive behavior. Optogenetics and calcium imaging in behaving animals allow us to test the models and to ultimately reveal how the brain integrates, selects and combines social cues to drive social interactions. This multi-disciplinary approach will uncover the computational principles and mechanisms by which sensory information is processed to drive behavior in the complex sensory environment of animal groups.
1. To achieve the above objectives, we have established a suite of tools for recording and modelling social behavior. In the process, we developed and shared with the scientific community a novel tool for analyzing acoustic recordings, called DeepAudioSegmenter. The tool employs modern machine-learning techniques to automate the detection and classification of acoustic signals in raw, multi-channel audio recordings and is now used by researchers who study animal vocalizations in rodents, birds, anurans, and non-human primates, in both neuroscientific and conservation settings.
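To make the segmentation task concrete, here is a toy Python sketch that finds sound bouts in a recording by thresholding the short-time signal energy. This illustrates only the problem being solved; DeepAudioSegmenter itself uses a trained deep network rather than a fixed threshold, and all parameters and signals below are invented.

```python
import numpy as np

def segment_song(audio, sr, win_s=0.01, thresh=0.1):
    """Toy energy-threshold segmenter: mark analysis windows whose
    RMS exceeds a threshold, then merge consecutive active windows
    into (start, stop) bouts in seconds."""
    win = int(win_s * sr)
    n_win = len(audio) // win
    rms = np.sqrt((audio[: n_win * win].reshape(n_win, win) ** 2).mean(axis=1))
    active = rms > thresh
    bouts, start = [], None
    for i, a in enumerate(active):
        if a and start is None:
            start = i
        elif not a and start is not None:
            bouts.append((start * win_s, i * win_s))
            start = None
    if start is not None:
        bouts.append((start * win_s, n_win * win_s))
    return bouts

# Synthetic "recording": 1 s of silence with a 0.2 s tone burst at 0.3 s.
sr = 10_000
t = np.arange(sr) / sr
audio = np.where((t >= 0.3) & (t < 0.5), np.sin(2 * np.pi * 300 * t), 0.0)
bouts = segment_song(audio, sr)
print(bouts)  # a single bout near (0.3, 0.5)
```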
2. Next, we wondered how the sex of an interaction target modulates the dynamics of behavioral interactions. This modulation could arise from target-specific interaction rules, or from target-specific reactions to the behaviors produced by an individual. To address this question, we exploited the fact that male flies direct courtship-like behaviors, including song, at females but also at other males. Computational modelling revealed that males use the same rules for both female- and male-targeted song. Manipulations of behavioral feedback demonstrated that target-specific behavior arises from the reaction of the target, not from the rule choice of the singing male, and that this is mediated by the P1 neurons, which integrate social cues to control social arousal in flies.
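The rule identification described above can be sketched as a logistic regression that predicts, frame by frame, whether a male sings from social-cue features; fitting such a model separately for female- and male-targeted bouts and comparing the weights would test whether the rules differ. Everything below (features, weights, data) is synthetic and purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical, z-scored social-cue features for each video frame:
# distance to the target and the target's locomotor speed.
n = 5000
dist_z = rng.standard_normal(n)
speed_z = rng.standard_normal(n)
X = np.column_stack([np.ones(n), dist_z, speed_z])

# Invented generating "rule": song is more likely when the target
# is close; the target's speed is irrelevant.
true_w = np.array([-0.5, -1.0, 0.0])
song = (rng.random(n) < 1 / (1 + np.exp(-X @ true_w))).astype(float)

# Recover the rule: logistic regression fit by gradient descent.
w = np.zeros(3)
for _ in range(5000):
    p_hat = 1 / (1 + np.exp(-X @ w))
    w -= 0.5 * X.T @ (p_hat - song) / n

# The fitted weights approximate the generating rule; comparing fits
# across target sexes would reveal whether the rule itself changes.
print(np.round(w, 2))
```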
3. To address how conflicting social cues are resolved during social behavior, we looked at how chemical and acoustic cues are combined to drive male courtship. Chemical cues are sex-specific: female pheromones drive and male pheromones suppress courtship. Acoustic cues – courtship songs – drive courtship. Using physical and optogenetic manipulations and playback of song combined with a model-based analysis, we identified the weights of the chemical and acoustic cues and revealed that they are linearly combined to drive courtship.
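A minimal sketch of such a linear cue-combination analysis, with invented cue levels and weights: a courtship index is generated as a weighted sum of female-pheromone, male-pheromone, and song cues, and the weights are then recovered by least squares.

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented trial design: cue levels presented on each trial.
n = 400
female_pher = rng.random(n)   # female pheromone level (a.u.)
male_pher = rng.random(n)     # male pheromone level (a.u.)
song = rng.integers(0, 2, n).astype(float)  # song playback on/off

# Invented generating weights: female pheromones and song drive
# courtship, male pheromones suppress it; the cues add linearly.
courtship = (0.2 + 0.6 * female_pher - 0.5 * male_pher
             + 0.3 * song + 0.05 * rng.standard_normal(n))

# Recover the cue weights by ordinary least squares.
X = np.column_stack([np.ones(n), female_pher, male_pher, song])
w, *_ = np.linalg.lstsq(X, courtship, rcond=None)
print(np.round(w, 2))  # close to the generating weights
```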
4. Lastly, we have used the fact that male flies produce two types of signals (air-borne song and substrate-borne vibrations) to study the circuit principles underlying the choice of social behaviors. This revealed that the production of both signals is tightly integrated in the central brain: major drivers of courtship produce both signals. Optogenetics combined with circuit modelling revealed several core computations (feedforward excitation from social cues, recurrence, and mutual inhibition) that shape this behavioral choice.
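The named computations can be illustrated with a toy two-unit rate model: each unit (song vs. vibration) receives feedforward input from social cues, excites itself (recurrence), and inhibits the other. With strong mutual inhibition, the unit with the stronger input wins and suppresses the other, i.e. winner-take-all action selection. All parameters are invented, not fitted to fly data.

```python
import numpy as np

def simulate(input_song, input_vib, t_steps=2000, dt=0.01):
    """Euler-integrate a two-unit rate model with self-excitation
    (recurrence) and mutual inhibition; returns final firing rates."""
    w_self, w_inhib, tau = 0.5, 2.0, 0.1  # invented parameters
    r = np.array([0.1, 0.1])              # song and vibration units
    inputs = np.array([input_song, input_vib])
    for _ in range(t_steps):
        # feedforward drive + recurrence - cross-inhibition, rectified
        drive = inputs + w_self * r - w_inhib * r[::-1]
        r = r + dt / tau * (-r + np.maximum(drive, 0.0))
    return r[0], r[1]

# A slightly stronger song-promoting cue lets the song unit win and
# fully suppress the vibration unit (winner-take-all).
r_song, r_vib = simulate(input_song=1.0, input_vib=0.8)
print(r_song > r_vib)  # True
```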
These results constitute progress beyond the state of the art on several fronts:
1. Our newly developed tool for song analysis, DeepAudioSegmenter, greatly improves on existing tools and makes modern machine-learning methods for analyzing sound recordings accessible to non-experts. This impact extends beyond the funded project and already enables other researchers to analyze acoustic data at unprecedented scale and precision.
2. We have developed a modelling framework that allows the identification of behavioral rules across contexts (in our case social targets), and that will be useful to researchers in the neural, behavioral and cognitive sciences. By manipulating behavioral feedback during social interactions, we were able to disentangle individual rule choice from target reactions. This is an important step towards untangling social interactions. However, given that social behavior is closed-loop, we need modelling frameworks that take this feedback loop into account. Therefore, we will employ dynamical system models to directly integrate dynamical behavioral feedback into our model. This will reveal how different types of feedback modulate interaction dynamics and give rise to stable social interaction states.
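As a minimal illustration of such a dynamical-system view (with invented parameters), consider two coupled agents whose behavioral states feed back on each other; with moderate positive feedback the pair relaxes into a single stable joint interaction state:

```python
import numpy as np

def interact(gain, steps=500, dt=0.05):
    """Euler-integrate two behavioral states that each relax toward
    a baseline plus a feedback-weighted copy of the partner's state.
    For gain < 1 the pair converges to a stable interaction state."""
    b1, b2 = 1.0, 0.0  # the two flies start in different states
    for _ in range(steps):
        d1 = -b1 + gain * b2 + 0.5
        d2 = -b2 + gain * b1 + 0.5
        b1 += dt * d1
        b2 += dt * d2
    return b1, b2

b1, b2 = interact(gain=0.5)
print(round(b1, 2), round(b2, 2))  # both settle at the same stable state
```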
3. We have provided a first quantitative and comprehensive description of how conflicting multi-modal cues are resolved at the behavioral level. In the next step, we will perform calcium imaging to identify the locus and the molecular mechanisms of cue integration and conflict resolution in the brain.
4. Our work on the choice and coordination of multi-modal signal production – song and vibration – revealed principal circuit computations for action selection that are relevant beyond the scope of this project. Notably, we have revealed a circuit with strong intrinsic dynamics. Future work will employ calcium imaging to determine how these intrinsic dynamics are shaped by social cues to produce adaptive social behavior.