CORDIS - EU research results
Content archived on 2024-06-18

Neural bases of visually guided walking in the fly

Final Report Summary - SENSORIMOTORFLY (Neural bases of visually guided walking in the fly)

To control complex actions in space, such as locomotion, the brain must accurately estimate the direction and speed of the moving body and generate a spatial representation of these movements. Different aspects of the visual scene of a walking animal are thought to provide critical information for both internal representations, that is, for the estimation of self-movement and of space. Self-generated visual stimuli, here called visual feedback for simplicity, have a global structure that is intimately related to changes in gaze and body position over time. The local structure of the visual feedback, especially when an animal moves forward, can provide information about the distance of objects from the retina, or about the speed of locomotion. Importantly, this local structure also enables the brain to detect other animals or objects moving in the visual scene. Visual feedback is therefore a complex stimulus that continuously excites the retina of an animal moving through a dynamic and uncertain sensory environment. Consequently, it has been very difficult to understand how the brain maps the different aspects of this visual feedback onto body movement to estimate the direction and speed of locomotion. Moreover, it remains unclear how the brain integrates this internal self-movement estimate with “external” events to drive oriented behaviors or navigational decisions.
Vision is one of several sensory modalities that are “self-excited” during locomotion. Proprioceptive, somatosensory, and vestibular mechanical signals provide complementary information about the moving body. In addition, during movement, copies of motor commands, known as corollary discharge signals, are broadcast widely across many brain regions. The brain of a walking animal can therefore use all of this multimodal information to create an accurate representation of body movements. This is important because vision alone is a rather ambiguous signal: as mentioned above, changes in gaze, or movements of other animals or objects in the visual scene, affect visual feedback such that its specific relation to body movements is degraded. Our lab’s main goals are: 1) to identify the circuits involved in internally estimating the direction and speed of the moving body; 2) to identify the circuits involved in detecting and responding to external visual events; and 3) to understand how these circuits interact with circuits representing self-movement estimates to guide oriented locomotion. In sensorimotorFly, we focused our efforts on the first two aims.
We study visuomotor integration in Drosophila melanogaster because this organism offers four important advantages. First, its small size allows us to study walking in great detail under laboratory-controlled conditions. Second, about 60% of the fly brain is dedicated to visual processing, strongly suggesting the importance of vision in Drosophila’s life. Third, we can exploit its ever-expanding genetic toolkit to identify, record from, and perturb neurons systematically, and thereby reach a mechanistic understanding of how a neural circuit functions during behavior. Last but not least, we and others have developed techniques for simultaneous recording of neural activity and locomotion in the fly to study sensorimotor integration. In the fly, two different populations of visual neurons are thought to process the global or local structure of visual feedback. The first comprises large-field motion-sensitive neurons that respond to image shifts across the entire retina. The second comprises small-field neurons: these respond to small areas of the fly’s visual field and are thought to inform the brain about the local structure of visual feedback and the spatial location of an attractive or repulsive target. While the former class has long been extensively characterized, both in Drosophila and in other fly species, the latter remains less well understood.
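The readout behind such simultaneous recordings can be sketched computationally: rotations of a tracked sphere under the tethered fly are mapped to the fly’s fictive velocities. The axis convention and ball radius below are illustrative assumptions, not the actual calibration used in any particular setup.

```python
import math

BALL_RADIUS_MM = 4.5  # hypothetical ball radius, for illustration only

def ball_to_fly_velocity(wx, wy, wz, r=BALL_RADIUS_MM):
    """Map ball rotation rates (rad/s) to fictive fly velocities.

    Assumed axis convention (hypothetical): rotation about x (pitch)
    corresponds to forward walking, about y (roll) to sideslip, and
    about z (yaw) to turning.
    """
    forward_mm_s = wx * r             # linear speed at the ball surface
    sideslip_mm_s = wy * r
    angular_deg_s = math.degrees(wz)  # the fly's turning rate
    return forward_mm_s, sideslip_mm_s, angular_deg_s
```

For example, a ball pitching at 2 rad/s with a 4.5 mm radius corresponds to a fictive forward speed of 9 mm/s, while any yaw rotation translates directly into the fly’s angular velocity.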
Building on previous results from our lab and others, we hypothesized that a group of large-field visual neurons, the so-called HS-cells, is involved in estimating one aspect of the fly’s walking behavior, namely its angular movements. In sensorimotorFly’s first aim, we asked whether these cells receive walking-related, non-visual information. To answer this question, we performed electrophysiological recordings from HS-cells during walking (by placing head-fixed flies on an air-suspended ball) and found that these neurons receive quantitative signals highly correlated with the fly’s angular velocity. They also receive signals correlated with the animal’s forward movement. Both the angular- and forward-velocity-related signals are present in HS-cells independent of vision. When HS-cells are artificially activated, through selective expression of a mammalian cation channel not expressed endogenously, the fly turns consistently, suggesting that HS-cells have a premotor function. In further physiological experiments performed in walking flies under visual stimulation, we showed that HS-cells integrate visual and non-visual information cooperatively. In other words, the HS-cell network appears to be part of a circuit that estimates the self-movement of a walking fly. Using virtual reality and selective silencing of different elements within the network, we showed that motion-sensitive cells are necessary for the freely moving fly to maintain a straight course. Altogether, our results uncover a novel function of HS-cells in accurately representing self-movement and strongly suggest that these cells are involved in controlling the angular component of walking. We are currently working to identify the cells providing non-visual information to HS-cells and to test their function in the control of the fly’s angular movements.
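The cooperative integration described above can be illustrated with a minimal sketch: a cell that sums a visual and a non-visual angular-velocity signal, with an extra boost when the two cues agree. The weights, gain, and the sign-agreement rule are hypothetical choices for illustration, not the measured response properties of HS-cells.

```python
def hs_response(visual, nonvisual, w_v=0.6, w_m=0.4, gain=1.5):
    """Toy HS-cell-like integration of a visual and a non-visual
    angular-velocity signal (all parameters are hypothetical).

    'Cooperative' here means the joint response exceeds the sum of
    the responses to each modality presented alone.
    """
    linear = w_v * visual + w_m * nonvisual
    if visual * nonvisual > 0:  # the two cues agree in sign
        return gain * linear    # supralinear, cooperative response
    return linear               # otherwise plain summation
```

Under this toy rule, presenting both cues together yields a larger response than the sum of the single-cue responses, which is one simple operational definition of cooperativity.

```python
v_only = hs_response(1.0, 0.0)   # visual cue alone
m_only = hs_response(0.0, 1.0)   # non-visual cue alone
both = hs_response(1.0, 1.0)     # both cues, same sign
print(both > v_only + m_only)
```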
For sensorimotorFly’s second aim, we study an ethologically relevant orientation behavior. First, we characterized the natural behavior and found that it depends critically on vision. Second, we performed a silencing screen in which we systematically silenced the activity of different genetically identified populations of visual neurons, and found that distinct populations of cells contribute to different aspects of the oriented behavior. Third, using the results of this screen, we successfully modeled the fly’s behavior such that we can predict its response to a given visual stimulus with high accuracy. Fourth, to better understand the contribution of the identified neurons to the different aspects of the oriented behavior, we adapted the behavior to a virtual-reality world in which we can control the fly’s visual experience with high precision. Fifth, we characterized these visual neurons as small-field neurons. Current efforts include a more detailed characterization of their visual properties under both non-behaving and behaving conditions.
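The logic of such a closed-loop virtual-reality experiment, combined with silencing, can be sketched in a toy simulation: the fly turns in proportion to the azimuth of a visual target on its retina, and turning in turn reduces that retinal error. The proportional controller and the gain values are illustrative assumptions, not the fitted behavioral model described above.

```python
def simulate_orientation(target_azimuth=60.0, gain=0.2, steps=50):
    """Minimal closed-loop sketch of visual orientation.

    Each step, the turning command is proportional to the target's
    azimuth (deg) on the retina (hypothetical controller); the turn
    then reduces the retinal error. Setting gain=0 mimics silencing
    the pathway that drives orienting.
    """
    azimuth = target_azimuth
    for _ in range(steps):
        turn = gain * azimuth  # steering command from visual error
        azimuth -= turn        # turning brings the target frontal
    return azimuth             # residual retinal error

intact = simulate_orientation(gain=0.2)    # converges toward zero error
silenced = simulate_orientation(gain=0.0)  # never orients to the target
```

In this sketch the intact controller brings the target close to the frontal midline within 50 steps, while the “silenced” variant leaves the initial 60° error untouched, mirroring how a silencing screen reads out a pathway’s behavioral contribution.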
A central question in neuroscience is how distributed populations of cells in the brain represent the integrative processes underlying sensorimotor integration and oriented motor planning. Altogether, our results open the door to mechanistic investigations of the neural computations involved in spatially complex, sensory-driven sequences of actions. The principles identified for these integrative processes will establish a framework that can be tested in other scenarios, particularly those associated with neurological disorders such as schizophrenia. In addition, understanding the algorithms behind sensorimotor transformations in numerically simpler brains, such as that of Drosophila, may help us design robotic prostheses and bio-inspired autonomous robots.