Neuromechanics of Insect Vision during Aerial Interactions with Applications in Visually Guided Systems

Periodic Reporting for period 3 - Vision-In-Flight (Neuromechanics of Insect Vision during Aerial Interactions with Applications in Visually Guided Systems)

Reporting period: 2021-11-01 to 2023-04-30

Visual guidance is in growing demand for autonomous mobile robots. While image processing and artificial neural networks running on powerful hardware have shown promise, they bring challenges of their own. For instance, training a network for autonomous control requires large quantities of image data and incurs a high energy cost. The decisions made by such systems can also be erratic when new situations are encountered at speed. In contrast, biological vision excels at enabling fast sensorimotor control with less precise hardware and significantly less energy. One key aspect is the clever use of active vision: knowing where, when, and how to look in order to gain critical information in a timely manner. Biological active vision shapes the visual input for the task at hand (e.g. avoiding an obstacle). Understanding its principles can improve the way we structure and train artificial networks, leading to more reliable and efficient robot movement.

This project aims to reveal how active vision enables flying insects (e.g. dragonflies) to engage in complex aerial interactions in a remarkably accurate and fast way. Insects have relatively simple and direct neural wiring from the visual system to the motor system, which makes them an ideal model for studying vision-guided behaviour at speed. Unique to this project, we combine the latest motion-capture techniques for analysing flight behaviour with an ultralight wireless system for recording from the insect’s nervous system during flight. The marriage of these two cutting-edge technologies enables us to study how visual input is transformed into steering signals. The scientific output includes a general design framework for visual processing informed by the motor controller. These design principles will be implemented in micro aerial systems (e.g. drones) to demonstrate how bioinspired innovations can improve current robots, whether they fly, swim, or run.

• Uncovered the functional fovea in biological vision
The fovea is normally defined as a retinal area with a higher density of photoreceptors. In predatory insects such as the dragonfly, a “functional fovea” is the retinal area in which the prey is actively maintained during a hunt. Through analyses of behaviour and neural recordings, we have discovered how “visual receptive fields” selectively overlap within this area to allow insects to predict the movement of their target prey. Ongoing work aims to generalise this strategy for creating task-specific “artificial foveae”.
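To make the idea concrete, here is a minimal sketch (illustrative, not the project’s actual model) of how responses from broadly overlapping receptive fields can be decoded to estimate, and then linearly extrapolate, a target’s retinal position. The field layout, Gaussian tuning, and one-step prediction are all assumptions for illustration.

```python
# A minimal sketch of population decoding over overlapping receptive fields.
# All names and parameter values here are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "functional fovea": a patch of broadly overlapping Gaussian fields.
rf_centres = rng.uniform(-1.0, 1.0, size=(64, 2))  # retinal coordinates
RF_SIGMA = 0.4                                     # broad tuning -> strong overlap

def rf_responses(target_xy):
    """Gaussian response of each receptive field to a point target."""
    d2 = np.sum((rf_centres - target_xy) ** 2, axis=1)
    return np.exp(-d2 / (2 * RF_SIGMA ** 2))

def decode(responses):
    """Population-vector estimate of target position from field responses."""
    weights = responses / responses.sum()
    return weights @ rf_centres

# A target drifting across the fovea; decode each frame, then predict one step ahead.
true_path = np.stack([np.linspace(-0.8, 0.8, 20),
                      np.linspace(-0.2, 0.4, 20)], axis=1)
estimates = np.array([decode(rf_responses(p)) for p in true_path])

velocity = estimates[-1] - estimates[-2]   # crude velocity from the last two estimates
prediction = estimates[-1] + velocity      # linear extrapolation of the next position
print("predicted next retinal position:", prediction)
```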

• Developed a method to characterise biomechanical constraints underlying flight control
Any mobile system in the real world, be it an animal or a drone, is constrained by its biomechanics and mode of locomotion. These features determine how steering can be achieved and must be accounted for in the execution of visual guidance. We have developed an experimental and analytical pipeline to study these features: given appropriate mechanical models, we can now empirically define key steering constraints for flying agents, whether insects or drones.
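The gist of such a pipeline can be shown in a small sketch: differentiate a heading trace from motion capture to obtain angular rate and acceleration, then take an empirical envelope as a steering constraint. The synthetic trace, frame rate, and percentile bound below are illustrative assumptions, not the project’s actual pipeline.

```python
# A minimal sketch of empirically bounding steering constraints from synthetic
# motion-capture-style data. Frame rate, trace, and percentile are assumptions.
import numpy as np

FS = 500.0                                   # assumed motion-capture frame rate (Hz)
t = np.arange(0.0, 2.0, 1.0 / FS)
heading = 0.8 * np.sin(2 * np.pi * 1.5 * t)  # synthetic heading trace (rad)

# Finite-difference angular velocity and acceleration.
omega = np.gradient(heading, 1.0 / FS)
alpha = np.gradient(omega, 1.0 / FS)

# Empirical steering envelope: bounds that a guidance controller must respect.
omega_max = np.percentile(np.abs(omega), 99)
alpha_max = np.percentile(np.abs(alpha), 99)
print(f"turn rate bound ~ {omega_max:.2f} rad/s, turn accel bound ~ {alpha_max:.1f} rad/s^2")
```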

• Demonstrated bioinspired object detection as the basis for obstacle negotiation
Bioinspired motion-vision techniques have, to date, tended to use optic flow (i.e. wide-field motion) for obstacle detection. However, this strategy is not robust against small, isolated objects. We explored a more generic guidance framework inspired by how dragonflies detect flying prey. Specifically, we use motion information to identify objects as targets or obstacles in flight. Preliminary obstacle-avoidance behaviour has been implemented on a micro drone.
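As a rough illustration of the idea (loosely inspired by insect small-target motion detection, not the project’s detector), the sketch below combines a temporal difference with a centre-surround step, so that small, isolated moving objects stand out while spatially widespread motion energy is suppressed. The kernel size and threshold are illustrative assumptions.

```python
# An illustrative small-object motion detector: temporal difference plus a
# centre-surround step. Sizes and threshold are assumptions for illustration.
import numpy as np
from scipy.ndimage import uniform_filter

def small_target_response(prev_frame, frame, surround=15, threshold=0.1):
    """Return a map that is large only where a small object has moved."""
    motion = np.abs(frame.astype(float) - prev_frame.astype(float))
    # Centre-surround: subtract the local average to cancel wide-field motion.
    local = motion - uniform_filter(motion, size=surround)
    return np.where(local > threshold, local, 0.0)

# Toy example: a static gradient background plus one small moving dot.
h, w = 120, 160
prev_frame = np.tile(np.linspace(0.0, 1.0, w), (h, 1))
frame = prev_frame.copy()
frame[60:63, 80:83] += 0.8   # the "small object" appears here

response = small_target_response(prev_frame, frame)
y, x = np.unravel_index(np.argmax(response), response.shape)
print(f"strongest small-target response near pixel ({y}, {x})")
```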

• A bioinspired design framework to optimise visual guidance for a particular mobile system
How do we design the visual guidance algorithm of a mobile system to optimise its performance? In animals, successful visual guidance is highly “customised” to the visual system, motor controller, biomechanical constraints, and behavioural objectives. This project explores all these elements across diverse flight behaviours in dragonflies and other large flying insects. The results will be distilled into a new bioinspired design framework for autonomous mobile robots.

• A bioinspired high-speed target tracking algorithm for machine vision
One crucial element in visual guidance is reliable target tracking. Another important research outcome of this project is a comprehensive characterisation of high-speed target tracking behaviour in the dragonfly. Inspired by how predatory insects follow targets and predict their movements, we will develop a custom smart-camera system to keep a fast-moving subject in view. We will test this system by tracking flying insects in their habitats during our fieldwork to evaluate its performance.
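As a hypothetical formulation of such a tracker (our assumed sketch, not the project’s algorithm), the code below estimates the target’s image-plane velocity, leads the target by one latency interval, and commands pan/tilt rates to cancel the predicted error. The gain, latency, and interface are assumptions for illustration.

```python
# A hypothetical predictive pursuit controller for a pan-tilt camera.
# Gain and latency values are illustrative assumptions.
import numpy as np

K_P = 4.0        # proportional gain (1/s), assumed
LATENCY = 0.03   # sensing + actuation delay (s), assumed

class PursuitTracker:
    def __init__(self):
        self.prev_error = None

    def command(self, error_xy, dt):
        """error_xy: target offset from image centre (normalised units)."""
        if self.prev_error is None:
            velocity = np.zeros(2)                      # no estimate on first frame
        else:
            velocity = (error_xy - self.prev_error) / dt
        self.prev_error = error_xy
        predicted = error_xy + velocity * LATENCY       # lead the target, don't lag it
        return K_P * predicted                          # pan/tilt rate command (rad/s)

# Usage: feed per-frame detections; the output drives the camera gimbal.
tracker = PursuitTracker()
rate = tracker.command(np.array([0.10, -0.04]), dt=1 / 120)
print("pan/tilt rate command:", rate)
```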

• Autonomous drone racing with bioinspired visuomotor control
As a sport, drone racing has pushed the limits of visuomotor control in flight and challenged the relevant technologies to keep up. Autonomous drone racing is therefore an ideal testbed for demonstrating visual guidance in small flying systems at high speed. Combining target tracking and guidance strategies, we will demonstrate the benefits and limitations of bioinspired visuomotor control in real-life prototypes.