All animals must solve a few key tasks in order to survive: find food and mates, and avoid threats. To do this they have evolved sensory systems of various modalities that extract relevant information from their environment. However, sensation does not happen in isolation: to respond to salient environmental cues, sensory information must be coupled to movement, and thus to motor systems. Furthermore, sensation itself is an active process: animals sense their environment by moving through it. This movement in space (whether the eyes scanning a scene, or the animal itself moving through the world) sets the reference frame for sensory information. We have developed a virtual reality arena that allows online tracking of fly locomotion and control of the flies' visual environment. When exploring their environment, Drosophila alternate straight runs with fast saccadic turns.
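The run-and-saccade structure of the locomotion can be recovered from a heading trace by thresholding angular velocity. The sketch below is purely illustrative: the function name, sampling interval, and the 200 deg/s threshold are assumptions, not the project's actual analysis pipeline.

```python
import numpy as np

def detect_saccades(heading_deg, dt, vel_thresh=200.0):
    """Segment a heading time series (degrees) into saccade intervals.

    A saccade is taken to be a contiguous run of samples whose absolute
    angular velocity exceeds vel_thresh (deg/s); the threshold is an
    illustrative assumption. Returns (start, end) sample-index pairs.
    """
    # Unwrap the heading to avoid spurious 360-degree jumps, then differentiate.
    heading = np.unwrap(np.deg2rad(heading_deg))
    ang_vel = np.abs(np.rad2deg(np.diff(heading))) / dt
    fast = ang_vel > vel_thresh
    # Locate the rising and falling edges of the above-threshold mask.
    edges = np.diff(fast.astype(int))
    starts = np.where(edges == 1)[0] + 1
    ends = np.where(edges == -1)[0] + 1
    if fast[0]:
        starts = np.r_[0, starts]
    if fast[-1]:
        ends = np.r_[ends, fast.size]
    return list(zip(starts, ends))
```

For example, a trace that holds a constant heading, turns rapidly, and then holds again yields a single detected interval spanning the turn.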
However, when sensing a directional aversive cue (the heated wall of the arena), the flies perform a fast, evasive saccadic turn directly away from it. To do this optimally, they need to integrate their self-motion with the aversive stimulus, using the former as a reference frame for the latter, and compute the angle of the evasive manoeuvre. We will use this active sensation behaviour, with its components of self-motion, a spatially localized sensory signal, and a directional motor response, as a model for how the fly brain responds to sensory stimuli in space. We will first assess the behavioural capacities of the flies in this context, and whether their behaviour changes as a function of experience. We will then identify the underlying visuomotor pathways by functional imaging of targeted neurons while the flies perform the behaviour in a virtual reality environment, and by Electron Microscopy-based reconstruction of neuronal connectivity. Finally, we will determine the computations that the underlying circuits perform by a combination of functional imaging and modelling.
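The geometric core of the evasive computation can be written down in a toy form: given the egocentric bearing of the aversive cue, find the turn that places the cue directly behind the fly. The sign convention (0 deg = straight ahead, positive = to the fly's left) and the function name are assumptions for illustration only, not the project's circuit model.

```python
def evasive_turn(cue_bearing_deg):
    """Turn angle (deg) that places an aversive cue directly behind the animal.

    cue_bearing_deg: egocentric bearing of the cue (0 = ahead, positive = left;
    this convention is an assumption). A positive return value means a turn to
    the left, negative a turn to the right, wrapped into (-180, 180].
    """
    # After turning by T, the cue bearing becomes (bearing - T); we want it
    # to end up at 180 degrees, i.e. T = bearing - 180, wrapped.
    t = (cue_bearing_deg - 180.0) % 360.0
    if t > 180.0:
        t -= 360.0
    return t
```

So a cue 90 deg to the left calls for a 90 deg rightward turn, while a cue already directly behind requires no turn at all.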